
hadoop - Hadoop ClassCastException

Reposted · Author: 行者123 · Updated: 2023-12-02 21:57:07

I'm using Hadoop 0.18.3 and am getting the following error:

java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.DoubleWritable

I defined my mapper as:

public class HadoopMapper extends MapReduceBase implements Mapper<Text, DoubleWritable, Text, DoubleWritable> {
    // The Karmasphere Studio Workflow Log displays logging from Apache Commons Logging, for example:
    // private static final Log LOG = LogFactory.getLog("HadoopMapper");

    @Override
    public void map(Text key, DoubleWritable value, OutputCollector<Text, DoubleWritable> output, Reporter reporter)
            throws IOException {
        Random generator = new Random();
        final int iter = 100000;

        for (int i = 0; i < iter; i++) {
            double x = generator.nextDouble();
            double y = generator.nextDouble();
            double z = x * x + y * y;

            if (z <= 1) {
                output.collect(new Text("VALUE"), new DoubleWritable(1));
            } else {
                output.collect(new Text("VALUE"), new DoubleWritable(0));
            }
        }
    }
}

and the reducer class as:
public class HadoopReducer extends MapReduceBase implements Reducer<Text, DoubleWritable, Text, DoubleWritable> {
    // The Karmasphere Studio Workflow Log displays logging from Apache Commons Logging, for example:
    // private static final Log LOG = LogFactory.getLog("HadoopReducer");

    @Override
    public void reduce(Text key, Iterator<DoubleWritable> value, OutputCollector<Text, DoubleWritable> output, Reporter reporter)
            throws IOException {
        double inside = 0;
        double outside = 0;

        while (value.hasNext()) {
            if (value.next().get() == 1)
                inside++;
            else
                outside++;
        }

        double pi = (4 * inside) / (inside + outside);
        output.collect(new Text("pi"), new DoubleWritable(pi));
    }
}
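Incidentally, the mapper/reducer pair above is just a Monte Carlo estimate of π spread across the MapReduce plumbing. Stripped of the Hadoop types, the computation they perform together is roughly the following plain-Java sketch (class name and fixed seed are illustrative, not from the original post):

```java
import java.util.Random;

public class PiEstimate {
    // "map" phase: sample `iter` random points in the unit square and count
    // how many fall inside the quarter circle; "reduce" phase: turn the two
    // counts into the estimate pi ~= 4 * inside / total.
    static double estimate(int iter, long seed) {
        Random generator = new Random(seed); // fixed seed for reproducibility
        double inside = 0, outside = 0;
        for (int i = 0; i < iter; i++) {
            double x = generator.nextDouble();
            double y = generator.nextDouble();
            if (x * x + y * y <= 1) inside++; else outside++;
        }
        return (4 * inside) / (inside + outside);
    }

    public static void main(String[] args) {
        System.out.println(PiEstimate.estimate(100000, 42L)); // close to 3.14159
    }
}
```

Note that since the mapper never reads its input value at all, the input format only matters for type compatibility here, which is exactly where the exception comes from.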

and I set up the JobConf as:
public static void initJobConf(JobConf conf) {
    // Generating code using Karmasphere Protocol for Hadoop 0.18
    // CG_GLOBAL

    // CG_INPUT_HIDDEN
    conf.setInputFormat(KeyValueTextInputFormat.class);

    // CG_MAPPER_HIDDEN
    conf.setMapperClass(HadoopMapper.class);
    // CG_MAPPER

    // CG_PARTITIONER_HIDDEN
    conf.setPartitionerClass(org.apache.hadoop.mapred.lib.HashPartitioner.class);
    // CG_PARTITIONER

    // CG_COMPARATOR_HIDDEN
    conf.setOutputKeyComparatorClass(org.apache.hadoop.io.Text.Comparator.class);
    // CG_COMPARATOR

    // CG_COMBINER_HIDDEN

    // CG_REDUCER_HIDDEN
    conf.setReducerClass(HadoopReducer.class);
    // CG_REDUCER
    conf.setNumReduceTasks(1);

    // CG_OUTPUT_HIDDEN
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(DoubleWritable.class);
    // CG_OUTPUT

    // Others
}

I can't find an InputFormat that matches KeyValueTextInputFormat.class for the call conf.setInputFormat(KeyValueTextInputFormat.class), so how should I handle this? Can I subclass one? Could you give me an example?
Thanks

Best Answer

KeyValueTextInputFormat expects as input a Text key and a Text value separated by a SEPARATOR_CHARACTER (a tab by default) on each line. Your mapper declares its input value as DoubleWritable, so the framework ends up trying to cast that Text value to DoubleWritable, which is not possible.
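In other words, KeyValueTextInputFormat hands every record to the mapper as two Text objects split at the first separator. The split is roughly equivalent to this plain-Java sketch (the sample line is hypothetical):

```java
public class KeyValueSplit {
    public static void main(String[] args) {
        String line = "VALUE\t0.5";           // hypothetical tab-separated input line
        int tab = line.indexOf('\t');
        String key = line.substring(0, tab);  // "VALUE" -> arrives as the Text key
        String val = line.substring(tab + 1); // "0.5"   -> arrives as a Text value, NOT a DoubleWritable
        System.out.println(key + " | " + val);
    }
}
```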

Therefore, change your mapper declaration to Mapper<Text, Text, Text, DoubleWritable> (and the map method's signature to match), and then convert the Text value to a double yourself.
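A minimal sketch of that corrected mapper, assuming the old Hadoop 0.18 mapred API; the body shown here (parsing the value and emitting it) is illustrative, not the poster's actual logic:

```java
import java.io.IOException;

import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class HadoopMapper extends MapReduceBase
        implements Mapper<Text, Text, Text, DoubleWritable> { // input value is now Text

    @Override
    public void map(Text key, Text value,
                    OutputCollector<Text, DoubleWritable> output, Reporter reporter)
            throws IOException {
        // Do the Text-to-double conversion yourself when you need the value as a number:
        double d = Double.parseDouble(value.toString());
        output.collect(key, new DoubleWritable(d));
    }
}
```

(In this particular job the mapper ignores its input value entirely, so changing the declared input value type to Text is the whole fix.)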

Regarding hadoop - Hadoop ClassCastException, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/11379556/
