
java - Unit testing Hadoop HDFS writes with MiniDFSCluster

Reprinted · Author: 可可西里 · Updated: 2023-11-01 15:06:44

I wrote a class that writes to Hadoop HDFS, using the Hadoop 1.2.1 jars.

I want to unit test this class, so based on blog posts such as this one I wrote the following code:

private void createSimulatedHdfs() {
    // conf, cluster and simulatedHdfs are fields of the test class
    conf = new Configuration();
    // 100K block size
    conf.setLong(DFSConfigKeys.DFS_BLOCK_SIZE_KEY, 1024 * 100);
    conf.setInt(DFSConfigKeys.DFS_BYTES_PER_CHECKSUM_KEY, 1);
    // short heartbeat/replication intervals so the mini cluster settles quickly
    conf.setLong(DFSConfigKeys.DFS_HEARTBEAT_INTERVAL_KEY, DFS_REPLICATION_INTERVAL);
    conf.setInt(DFSConfigKeys.DFS_NAMENODE_REPLICATION_INTERVAL_KEY, DFS_REPLICATION_INTERVAL);

    try {
        // simulated HDFS
        cluster = new MiniDFSCluster(conf, DATANODE_COUNT, true, null);
        cluster.waitActive();
        simulatedHdfs = cluster.getFileSystem();
    } catch (IOException e) {
        Assert.fail("Could not create simulated HDFS " + e.getMessage());
    }
}

But when constructing the new MiniDFSCluster I get an exception:

java.lang.AssertionError: Could not create simulated HDFS Cannot run program "du": CreateProcess error=2, The system cannot find the file specified
at org.junit.Assert.fail(Assert.java:88)
at com.taptica.hdfs.writer.HdfsWriterUTest.createSimulatedHdfs(HdfsWriterUTest.java:101)
at com.taptica.hdfs.writer.HdfsWriterUTest.initJunitModeTest(HdfsWriterUTest.java:42)
at com.taptica.hdfs.writer.HdfsWriterUTest.writeTest(HdfsWriterUTest.java:50)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)

I have not installed Hadoop in my local environment (and I do not intend to). How can I get around this?

Best Answer

For JUnit tests there is no need to install Hadoop or any third-party utilities; you can use the local file system instead:

        LocalFileSystem fs = FileSystem.getLocal(new Configuration());
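A minimal sketch of how such a test might look. The class name, file name and contents here are made up for illustration; it assumes hadoop-core 1.2.1 and JUnit 4 on the classpath:

import static org.junit.Assert.assertEquals;

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class HdfsWriterLocalFsTest {

    @Test
    public void writeAndReadBack() throws Exception {
        // The local file system stands in for HDFS; no daemons or native tools needed
        LocalFileSystem fs = FileSystem.getLocal(new Configuration());

        // Hypothetical test file under the system temp directory
        Path testFile = new Path(System.getProperty("java.io.tmpdir"), "writer-test.txt");
        try (FSDataOutputStream out = fs.create(testFile, true)) {
            out.writeBytes("hello hdfs\n");
        }

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(testFile)))) {
            assertEquals("hello hdfs", in.readLine());
        }

        fs.delete(testFile, false);
    }
}

The class under test can then be handed the LocalFileSystem through whatever FileSystem reference it already uses, so the production code path stays unchanged.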

Also take a look at MRUnit, a utility that is very helpful for testing MapReduce jobs: http://mrunit.apache.org/
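If the goal is to verify mapper or reducer logic rather than the file system itself, MRUnit drives that logic in isolation without any cluster. A hedged sketch, assuming the MRUnit 1.x artifact (with the classifier matching your Hadoop line) on the classpath and a hypothetical word-count style mapper:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Test;

public class WordCountMapperTest {

    // Hypothetical mapper under test: emits (word, 1) for each whitespace-separated token
    static class TokenizerMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String word : value.toString().split("\\s+")) {
                context.write(new Text(word), ONE);
            }
        }
    }

    @Test
    public void emitsOneCountPerWord() throws IOException {
        MapDriver<LongWritable, Text, Text, IntWritable> driver =
                MapDriver.newMapDriver(new TokenizerMapper());

        // MRUnit feeds the input record to the mapper and checks the expected output pairs
        driver.withInput(new LongWritable(0), new Text("hello hdfs"))
              .withOutput(new Text("hello"), new IntWritable(1))
              .withOutput(new Text("hdfs"), new IntWritable(1))
              .runTest();
    }
}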

About java - Unit testing Hadoop HDFS writes with MiniDFSCluster: we found a similar question on Stack Overflow: https://stackoverflow.com/questions/19816565/
