
java - Error java.lang.RuntimeException: java.lang.ClassNotFoundException: wordcount_classes.WordCount$Map when running a MapReduce program

Reposted. Author: 可可西里. Updated: 2023-11-01 16:59:30

I am new to Hadoop and am trying to run a MapReduce program, Word Count, but I get the following error: java.lang.RuntimeException: java.lang.ClassNotFoundException: wordcount_classes.WordCount$Map. Here is WordCount.java:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class WordCount {

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class Reduce extends Reducer<Text, IntWritable, Text, IntWritable> {

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        Job job = new Job(conf, "wordcount");

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        job.setMapperClass(Map.class);
        job.setReducerClass(Reduce.class);

        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        job.setJarByClass(WordCount.class);
        job.waitForCompletion(true);
    }
}

The contents of the wordcount_classes directory are:

-rw-r--r--   1 sagar supergroup   1855 2014-10-03 13:15 /user/sagar/wordcount_classes/WordCount$Map.class
-rw-r--r--   1 sagar supergroup   1627 2014-10-03 13:15 /user/sagar/wordcount_classes/WordCount$Reduce.class
-rw-r--r--   1 sagar supergroup   1453 2014-10-03 13:14 /user/sagar/wordcount_classes/WordCount.class
-rw-r--r--   1 sagar supergroup   3109 2014-10-03 13:15 /user/sagar/wordcount_classes/wordcount.jar

I run the program with the following command:

hadoop jar wordcount_classes/wordcount.jar wordcount_classes/WordCount input r1

Best Answer

Please check the following:

  1. Did you compile the code into a runnable jar?
  2. Are you running the command from the folder that contains the jar?
  3. Run it with the following command (note that the class name is plain WordCount, with no directory prefix):

     hadoop jar <path_to_jar>/wordcount.jar WordCount <hdfs_path_to_input>/input <hdfs_path>/r1

Regarding java - Error java.lang.RuntimeException: java.lang.ClassNotFoundException: wordcount_classes.WordCount$Map when running a MapReduce program, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/26176985/
