
hadoop - Reducer not working in Hadoop MapReduce

Reposted. Author: 可可西里. Updated: 2023-11-01 15:12:39

Hello, my Reducer is not printing the desired result. Please take a look at the code.

Here is my map function:

public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    String str_line = value.toString();
    Detail_output1_column_array = str_line.split("\\" + tabSpace);

    Outputkey = Detail_output1_column_array[2];
    System.out.println(Outputkey);
    context.write(new Text(Outputkey), NullWritable.get());
}
}

public static class ShopFile_Reducer extends Reducer<Text, Iterable<NullWritable>, NullWritable, Text> {

    public void reduce(Text Key, Iterable<NullWritable> Values, Context context)
            throws IOException, InterruptedException {
        Key = new Text(Key.toString());
        context.write(NullWritable.get(), new Text(Key));
    }
}

Suppose Detail_output1_column_array[2] contains 010101020102010301.

After the reducer I need output like this: 010203

But it is printing everything: 010101020102010301
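Reading those digits as a sequence of two-character codes (01, 02, 03), the deduplication the asker wants is exactly what the shuffle phase provides: identical map-output keys are grouped, and each reduce() call sees one distinct key. A plain-Java sketch of that grouping, with no Hadoop dependencies and hypothetical names:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

public class ShuffleDedupSketch {
    // Simulates the shuffle phase: identical keys are grouped, so one
    // reduce() call (and therefore one output line) happens per distinct key.
    public static List<String> reduceKeys(List<String> mapperOutputKeys) {
        TreeMap<String, Integer> grouped = new TreeMap<>();
        for (String key : mapperOutputKeys) {
            grouped.merge(key, 1, Integer::sum); // count is what the value list collapses to
        }
        return new ArrayList<>(grouped.keySet());
    }

    public static void main(String[] args) {
        // The asker's column values, read as two-digit codes.
        List<String> mapped = List.of("01", "01", "01", "02", "01", "02", "01", "03", "01");
        System.out.println(reduceKeys(mapped)); // prints [01, 02, 03]
    }
}
```

If the job instead emits every input line unchanged, the reducer that was configured is effectively not being invoked (the identity reducer runs), which matches the symptom described above.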

Here is my driver class:

Configuration Shopconf = new Configuration();
Shopconf.setStrings("DTGroup_input",DTGroup_input);
Job Shop = new Job(Shopconf,"Export_Column_Mapping");
Shop.setJarByClass(ExportColumnMapping.class);
Shop.setJobName("ShopFile_Job");
Shop.setMapperClass(ShopFile_Mapper.class);
Shop.setReducerClass(ShopFile_Reducer.class);
Shop.setInputFormatClass(TextInputFormat.class);
Shop.setOutputFormatClass(TextOutputFormat.class);
Shop.setMapOutputKeyClass(Text.class);
Shop.setMapOutputValueClass(NullWritable.class);
Shop.setOutputKeyClass(Text.class);
Shop.setOutputValueClass(Text.class);
FileInputFormat.addInputPath(Shop, new Path(outputpath+"/Detailsfile/part*"));
FileOutputFormat.setOutputPath(Shop, new Path(outputpath+"/Shopfile"));
Shop.waitForCompletion(true);

Best answer

In your mapper code, use Outputkey.set(Detail_output1_column_array[2]); instead of Outputkey = Detail_output1_column_array[2];
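The answer's set() vs. assignment distinction matters because Hadoop's Writable types such as Text are mutable containers that the framework reuses: mutating the one instance in place is the idiomatic pattern, while rebinding the variable leaves any reference the framework already holds pointing at the old object. A plain-Java sketch of that distinction, using a simplified stand-in for org.apache.hadoop.io.Text rather than Hadoop itself:

```java
public class TextReuseSketch {
    // Simplified stand-in for Hadoop's mutable Text type.
    static final class MutableText {
        private String value = "";
        void set(String v) { value = v; }
        @Override public String toString() { return value; }
    }

    public static void main(String[] args) {
        MutableText outputKey = new MutableText();
        MutableText captured = outputKey;  // a reference some other code already holds

        outputKey.set("0101");             // mutate in place: every holder sees the update
        System.out.println(captured);      // prints 0101

        outputKey = new MutableText();     // rebind: old holders keep the old object
        outputKey.set("0202");
        System.out.println(captured);      // still prints 0101
    }
}
```

Note that this is the answer as accepted on Stack Overflow; separately, the reducer class above declares its value type as Iterable&lt;NullWritable&gt; rather than NullWritable, which also deserves a look, since a reduce() method that does not match the class's generic parameters never overrides the framework's identity implementation.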

Regarding "hadoop - Reducer not working in Hadoop MapReduce", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33516571/
