java - initMiniDFSCluster throws NoClassDefFoundError (hadoop client test)


I am writing a piece of software that should store files in hadoop-hdfs, and of course I want to write test cases for this particular feature. Unfortunately, when I try to build() a MiniDFSCluster, I get the following:

16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: starting cluster: numNameNodes=1, numDataNodes=2
16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: Shutting down the Mini HDFS Cluster

java.lang.NoClassDefFoundError: org/apache/hadoop/net/StaticMapping

at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:792)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:475)
at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:434)
at de.tuberlin.cit.storageassistant.ArchiveManagerTest.setUp(ArchiveManagerTest.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:234)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.StaticMapping
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 31 more


Process finished with exit code 255

These are the hadoop dependencies in my pom.xml:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <type>test-jar</type>
    <version>${hadoop.version}</version>
    <scope>test</scope>
</dependency>

Is this the right way to test a tool that uses hadoop? If so, how do I get it to work? I am testing with the classic example code:

private MiniDFSCluster dfsCluster;
private String hdfsURI;

@org.junit.Before
public void setUp() throws Exception {
    // super.setUp();
    Configuration conf = new Configuration();
    // conf.set("fs.defaultFS", "hdfs://localhost:9000");

    // Give the cluster a clean storage root under target/.
    File baseDir = new File("./target/hdfs/").getAbsoluteFile();
    FileUtil.fullyDelete(baseDir);
    conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());

    dfsCluster = new MiniDFSCluster.Builder(conf)
            .checkExitOnShutdown(true)
            .numDataNodes(2)
            .format(true)
            .racks(null)
            .build();
    hdfsURI = "hdfs://localhost:" + dfsCluster.getNameNodePort() + "/";
}

@org.junit.After
public void tearDown() throws Exception {
    if (dfsCluster != null) {
        dfsCluster.shutdown();
    }
}
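
Once the cluster is up, the plan is to talk to it through the normal FileSystem API. A minimal sketch of the kind of test I want to run (the test method, file name, and assertion are placeholders of mine, and assume the dfsCluster/hdfsURI fields above):

@org.junit.Test
public void writeAndReadBack() throws Exception {
    // Obtain a client FileSystem bound to the mini cluster's namenode.
    FileSystem fs = FileSystem.get(java.net.URI.create(hdfsURI), new Configuration());
    Path file = new Path("/tmp/test-file.txt");

    // Write a small payload through HDFS and read it back.
    try (FSDataOutputStream out = fs.create(file)) {
        out.writeUTF("hello hdfs");
    }
    try (FSDataInputStream in = fs.open(file)) {
        org.junit.Assert.assertEquals("hello hdfs", in.readUTF());
    }
}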

Any help or suggestions?

Best Answer

I ran into the same problem and was able to resolve it by adding a dependency on hadoop-common:tests:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <classifier>tests</classifier>
</dependency>

The class in question (org.apache.hadoop.net.StaticMapping, per the stack trace) is missing from the regular hadoop-common artifact that is meant for production code; it ships in the tests classifier jar.
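
To double-check that the class now resolves from the test classpath, a one-off diagnostic (my own suggestion, not part of the fix itself) is to print where the JVM actually loads it from:

// One-off diagnostic: confirm StaticMapping is on the classpath
// and print which jar it is served from.
Class<?> c = Class.forName("org.apache.hadoop.net.StaticMapping");
System.out.println(c.getProtectionDomain().getCodeSource().getLocation());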

Regarding "java - initMiniDFSCluster throws NoClassDefFoundError (hadoop client test)", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/39919648/
