
java - Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected

Reposted. Author: 可可西里. Updated: 2023-11-01 16:16:07

I'm trying to use MRUnit 1.0.0 to test a Hadoop v2 Reducer, but I get an exception when I run the test:

java.lang.IncompatibleClassChangeError: 
Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected
at org.apache.hadoop.mrunit.internal.mapreduce.AbstractMockContextWrapper.createCommon(AbstractMockContextWrapper.java:59)
at org.apache.hadoop.mrunit.internal.mapreduce.MockReduceContextWrapper.create(MockReduceContextWrapper.java:76)
at org.apache.hadoop.mrunit.internal.mapreduce.MockReduceContextWrapper.<init>(MockReduceContextWrapper.java:67)
at org.apache.hadoop.mrunit.mapreduce.ReduceDriver.getContextWrapper(ReduceDriver.java:159)
at org.apache.hadoop.mrunit.mapreduce.ReduceDriver.run(ReduceDriver.java:142)
at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:574)
at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:561)

I think this means I have somehow mismatched versions of the Hadoop API, as in this SO question, but I'm not sure where the problem is. I'm pulling in dependencies with Maven like this, using Hadoop 2.2.0.2.0.6.0-76 from repo.hortonworks.com and MRUnit 1.0.0 from repo1.maven.org:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-core</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-common</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-yarn-common</artifactId>
  <version>2.2.0.2.0.6.0-76</version>
</dependency>
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>1.0.0</version>
  <classifier>hadoop2</classifier>
</dependency>

The test case looks like this:

@Test
public void testReducer() throws IOException, InterruptedException {
    HH.Reduce r = new HH.Reduce();

    T1 fx1 = new T1();
    T1 fx2 = new T1();

    List<T1> values = new ArrayList<T1>();
    values.add(fx1);
    values.add(fx2);

    T1 fxBoth = new T1(fx1.size() + fx2.size());
    fxBoth.addValues(fx1);
    fxBoth.addValues(fx2);

    ReduceDriver<NullWritable, T1, NullWritable, T1> reduceDriver = ReduceDriver.newReduceDriver(r);

    reduceDriver.withInput(NullWritable.get(), values);
    reduceDriver.withOutput(NullWritable.get(), fxBoth);

    // TODO I can't seem to get this test to work.
    // Not sure what I'm doing wrong, whether it's a real
    // problem or a testing problem.
    reduceDriver.runTest();
}

Elsewhere, in the HH package, Reduce is defined as a very simple inner class:

public static class Reduce extends Reducer<NullWritable, T1, NullWritable, T1> {
    @Override
    public void reduce(NullWritable key, Iterable<T1> values, Context context)
            throws InterruptedException, IOException {

        // Need to create a new record here, because the one we're handed
        // may be recycled by our overlords.
        T1 out = new T1();
        for (T1 t : values) {
            out.addValues(t);
        }
        context.write(key, out);
    }
}

See anything strange here? Is MRUnit trying to use an older/newer version of the API?

Best Answer

I think I ran into the same problem, but I was using hadoop-core 1.2.1 and mrunit-hadoop2-1.1.0. Check the versions and classifiers of your Maven dependencies as they are actually resolved for the test classpath, not just the ones declared in pom.xml.
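For illustration: one way to see what actually ends up on the test classpath is `mvn dependency:tree -Dincludes=org.apache.hadoop,org.apache.mrunit`, which will reveal any transitive Hadoop 1.x jar (such as hadoop-core) shadowing the Hadoop 2 interfaces. A sketch of the MRUnit dependency matching the answerer's setup (version 1.1.0 with the hadoop2 classifier; scope and versions are assumptions to adapt to your build) is:

```xml
<!-- Hypothetical pom.xml fragment: MRUnit built against the Hadoop 2
     (interface-based) mapreduce API. The hadoop2 classifier selects the
     build that expects interfaces such as TaskAttemptContext, which is
     what avoids the class-vs-interface mismatch at test time. -->
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>1.1.0</version>
  <classifier>hadoop2</classifier>
  <scope>test</scope>
</dependency>
```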

Regarding "java - Found class org.apache.hadoop.mapreduce.TaskInputOutputContext, but interface was expected", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/20808590/
