java - Hadoop error: ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text


My program is as follows:

    public class TopOS extends Configured implements Tool { // ToolRunner.run() below expects a Tool implementation

        public static class MapClass extends Mapper<Text, Text, Text, LongWritable> {

            public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
                // your map code goes here
                String[] fields = value.toString().split(",");

                for (String str : fields) {
                    context.write(new Text(str), new LongWritable(1L));
                }
            }
        }

        public int run(String args[]) throws Exception {
            Job job = new Job();
            job.setJarByClass(TopOS.class);

            job.setMapperClass(MapClass.class);

            FileInputFormat.setInputPaths(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            job.setJobName("TopOS");
            job.setMapOutputKeyClass(Text.class);
            job.setMapOutputValueClass(LongWritable.class);
            job.setNumReduceTasks(0);
            boolean success = job.waitForCompletion(true);
            return success ? 0 : 1;
        }

        public static void main(String args[]) throws Exception {
            int ret = ToolRunner.run(new TopOS(), args);
            System.exit(ret);
        }
    }

My data looks like this:

123456,Windows,6.1,6394829384232,343534353,23432,23434343,12322
123456,OSX,10,6394829384232,23354353,23432,23434343,63635
123456,Windows,6.0,5396459384232,343534353,23432,23434343,23635
123456,Windows,6.0,6393459384232,343534353,23432,23434343,33635

Why am I getting the following error, and how can I fix it?

Hadoop : java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text

Best Answer

From my point of view, there is just one small mistake in your code.

Since you use flat text files as input, the key class is fixed to LongWritable (which you don't need or use) and the value class is Text: TextInputFormat hands each mapper the byte offset of a line as the key and the line itself as the value.

Set the key class of your Mapper to Object, to emphasize that you don't use it, and you get rid of the error.
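In other words, the Mapper's input type parameters have to match what TextInputFormat actually produces. As a minimal sketch, spelling the key type out explicitly also works (it relies on the same imports as the full listing below; the Object variant used there is equivalent):

    public static class MapClass extends Mapper<LongWritable, Text, Text, LongWritable> {

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // key is just the byte offset of the current input line and is ignored here
            for (String str : value.toString().split(",")) {
                context.write(new Text(str), new LongWritable(1L));
            }
        }
    }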

Here is your slightly modified code:

    package org.woopi.stackoverflow.q22853574;

    import org.apache.hadoop.io.*;
    import org.apache.hadoop.mapreduce.lib.input.*;
    import org.apache.hadoop.mapreduce.lib.output.*;
    import org.apache.hadoop.mapreduce.*;
    import org.apache.hadoop.fs.Path;
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;

    public class MapReduceJob {

        public static class MapClass extends Mapper<Object, Text, Text, LongWritable> {

            public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
                // your map code goes here
                String[] fields = value.toString().split(",");

                for (String str : fields) {
                    context.write(new Text(str), new LongWritable(1L));
                }
            }
        }

        public int run(String args[]) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf);
            job.setJarByClass(MapReduceJob.class);

            job.setMapperClass(MapClass.class);

            FileInputFormat.setInputPaths(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));

            job.setJobName("MapReduceJob");
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            job.setNumReduceTasks(0);
            job.setInputFormatClass(TextInputFormat.class);
            boolean success = job.waitForCompletion(true);
            return success ? 0 : 1;
        }

        public static void main(String args[]) throws Exception {
            MapReduceJob j = new MapReduceJob();
            int ret = j.run(args);
            System.exit(ret);
        }
    }

I hope this helps.

Martin

About java - Hadoop error: ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/22853574/
