
Hadoop: cannot find class

Reposted. Author: 行者123. Updated: 2023-12-02 21:42:58

I have just started learning Hadoop and am following "Hadoop: The Definitive Guide".

I first tried the old style of writing the Map and Reduce classes, where Mapper and Reducer are interfaces. That code ran fine.
Then I started writing code in the new style, where Map and Reduce are abstract classes that use a Context class.
By the way, I am using Hadoop 1.2.1.
I am seeing the following errors:

MaxTemperatureReducer.java:5: error: cannot find symbol
public class MaxTemperatureReducer extends Reducer<Text, IntWritable, Text, IntWritable>
^
symbol: class Reducer
MaxTemperatureReducer.java:7: error: cannot find symbol
public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException,InterruptedException
^
symbol: class Context
location: class MaxTemperatureReducer
MaxTemperature.java:5: error: cannot find symbol
import org.apache.hadoop.mapreduce.FileInputFormat;
^
symbol: class FileInputFormat
location: package org.apache.hadoop.mapreduce
MaxTemperature.java:6: error: cannot find symbol
import org.apache.hadoop.mapreduce.FileOutputFormat;
^
symbol: class FileOutputFormat
location: package org.apache.hadoop.mapreduce
MaxTemperature.java:7: error: cannot find symbol
import org.apache.hadoop.mapreduce.JobClient;

My Mapper class looks like this:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
//import org.apache.hadoop.mapreduce.MapReduceBase;
import org.apache.hadoop.mapreduce.Mapper;

public class MaxTemperatureMapper extends Mapper<LongWritable, Text, Text, IntWritable>
{
    private static final int MISSING = 9999;

    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException
    {
        String line = value.toString();
        String year = line.substring(0, 4);
        int airTemperature;
        if (line.charAt(4) == '+')
        {
            airTemperature = Integer.parseInt(line.substring(5, 10));
        }
        else
        {
            System.out.println(line);
            airTemperature = Integer.parseInt(line.substring(4, 9));
        }
        System.out.println("Mapper: " + year + ", " + airTemperature);
        context.write(new Text(year), new IntWritable(airTemperature));
    }
}
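As an aside, the fixed-offset parsing inside this map method can be exercised without a Hadoop cluster. A minimal plain-Java sketch of the same substring logic (no Hadoop types; the class name, method names, and the sample record are my own, mirroring the offsets used above):

```java
public class TemperatureParse {
    // Mirrors the mapper's fixed-offset parsing: chars 0-3 hold the year,
    // char 4 is an optional '+' sign, and the next five chars hold the temperature.
    static int parseTemperature(String line) {
        if (line.charAt(4) == '+') {
            return Integer.parseInt(line.substring(5, 10));
        }
        return Integer.parseInt(line.substring(4, 9));
    }

    static String parseYear(String line) {
        return line.substring(0, 4);
    }

    public static void main(String[] args) {
        String record = "1950+00221"; // hypothetical record in the layout above
        System.out.println(parseYear(record) + " " + parseTemperature(record));
        // prints "1950 221"
    }
}
```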

Can anyone help?

Best Answer

It looks like the problem is with the import statements in your Driver class: you are using the wrong imports for FileInputFormat and FileOutputFormat. Use these import statements for FileInputFormat and FileOutputFormat in your driver (MaxTemperature) class instead:

import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

Use this example as a reference to make sure all of your import statements are valid.
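For reference, a complete driver in the new (org.apache.hadoop.mapreduce) API might look like the sketch below. This follows the shape of the book's MaxTemperature example and is not verified against your code; it assumes the Hadoop 1.2.1 jars are on the classpath and that the input and output paths are passed on the command line:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
// Note: FileInputFormat/FileOutputFormat come from the lib.input/lib.output
// subpackages, not from org.apache.hadoop.mapreduce directly.
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MaxTemperature {
    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }

        Job job = new Job(); // deprecated in later releases in favour of Job.getInstance()
        job.setJarByClass(MaxTemperature.class);
        job.setJobName("Max temperature");

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(MaxTemperatureMapper.class);
        job.setReducerClass(MaxTemperatureReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```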

Regarding "Hadoop: cannot find class", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/27203060/
