java - Hadoop JUnit test facing compilation errors

Reposted. Author: 行者123. Updated: 2023-12-01 13:29:58

I am new to Hadoop and am working through Hadoop: The Definitive Guide. I am using MRUnit for unit testing, but I hit a compilation error when testing the reduce task.

Below is my reducer file, MaxTemperatureReducer.java:

package org.priya.mapred.mapred;

import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
//import org.apache.hadoop.mapred.MRBench.Reduce;
import org.apache.hadoop.mapreduce.Reducer;

public class MaxTemperatureReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    public void reduce(Text key, Iterator<IntWritable> values, Context context)
            throws InterruptedException, IOException {
        int maxValue = Integer.MIN_VALUE;
        while (values.hasNext()) {
            IntWritable value = values.next();
            if (maxValue >= value.get()) {
                maxValue = value.get();
            }
        }
        context.write(key, new IntWritable(maxValue));
    }

}

Below is my JUnit test file, MaxTemperatureReducerTest.java:

package org.priya.mapred.mapred;

import static org.junit.Assert.*;

import java.util.ArrayList;

import org.junit.Test;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
//import org.apache.hadoop.mrunit.ReduceDriver;
import org.apache.hadoop.mrunit.ReduceDriver;
//import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;

public class MaxTemperatureReducerTest {

    @Test
    public void reducerTestValid() {
        ArrayList<IntWritable> listOfValues = new ArrayList<IntWritable>();
        listOfValues.add(new IntWritable(20));
        listOfValues.add(new IntWritable(30));
        listOfValues.add(new IntWritable(40));
        listOfValues.add(new IntWritable(60));
        new ReduceDriver<Text, IntWritable, Text, IntWritable>()
            .withReducer(new MaxTemperatureReducer())
            .withInput(new Text("1950"), listOfValues)
            .withOutput(new Text("1950"), new IntWritable(60));
    }

}

When I pass an instance of the reducer class (new MaxTemperatureReducer()) to my ReduceDriver via the driver's withReducer() method, I get the following compilation error:

The method withReducer(Reducer<Text,IntWritable,Text,IntWritable>) in the type ReduceDriver<Text,IntWritable,Text,IntWritable> is not applicable for the arguments (MaxTemperatureReducer)

Please help: I can see that the MaxTemperatureReducer class extends the Reducer class, and I cannot understand why withReducer() does not accept a MaxTemperatureReducer instance.

Thanks, Priyaranjan

Best Answer

Your reducer must implement: http://hadoop.apache.org/docs/current2/api/org/apache/hadoop/mapred/Reducer.html

You are extending: org.apache.hadoop.mapreduce.Reducer.
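In other words, the ReduceDriver imported in the test (org.apache.hadoop.mrunit.ReduceDriver) targets the old mapred API, while the reducer extends the new-API org.apache.hadoop.mapreduce.Reducer. The commented-out import in the test already hints at the fix: MRUnit ships a matching new-API driver under the org.apache.hadoop.mrunit.mapreduce package. A minimal sketch of the corrected test, assuming MRUnit and the Hadoop client jars are on the classpath:

```java
package org.priya.mapred.mapred;

import java.util.Arrays;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
// New-API driver: accepts reducers extending org.apache.hadoop.mapreduce.Reducer
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
import org.junit.Test;

public class MaxTemperatureReducerTest {

    @Test
    public void reducerTestValid() throws Exception {
        new ReduceDriver<Text, IntWritable, Text, IntWritable>()
            .withReducer(new MaxTemperatureReducer())
            .withInput(new Text("1950"),
                       Arrays.asList(new IntWritable(20), new IntWritable(30),
                                     new IntWritable(40), new IntWritable(60)))
            .withOutput(new Text("1950"), new IntWritable(60))
            .runTest(); // the posted test builds the driver but never runs it
    }
}
```

Two further changes to the reducer are needed before this test can pass: the reduce method must take an Iterable&lt;IntWritable&gt; rather than an Iterator (an Iterator parameter does not override the new-API reduce, so the inherited identity reduce runs instead), and the comparison should be value.get() &gt; maxValue so the loop actually tracks the maximum.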

Regarding "java - Hadoop JUnit test facing compilation errors", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/21625238/
