
hadoop - "No FileSystem for scheme: hdfs" IOException in the hadoop 2.2.0 wordcount example

Reposted. Author: 可可西里. Updated: 2023-11-01 15:01:04

I did a fresh install of Hadoop YARN and ran the wordcount example using the jar provided under hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples..., and that worked. But when I try to compile the wordcount source myself and run it, it throws java.io.IOException: No FileSystem for scheme: hdfs

The exception above is thrown at this line of code:

FileInputFormat.addInputPath(job, new Path(args[0]));

Edit: the command and its output are as follows:

hduser@master-virtual-machine:~$ hadoop jar Desktop/NativeWordcount.jar /tin /tout
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [rsrc:org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:rsrc:slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
13/12/03 07:14:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: No FileSystem for scheme: hdfs
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2421)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:466)
at WordCount.main(WordCount.java:55)
... 10 more

Best Answer

I ran into this problem today as well. You need to make sure the hadoop-hdfs jar is on your classpath.

My first attempt at this was simply to add a Maven dependency on the hadoop-hdfs package to my project, but that was not enough. In the end I followed Cloudera's advice and added a dependency on hadoop-client. The relevant clause in the pom.xml file is:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>VERSION</version>
</dependency>
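A note on packaging: the `org.eclipse.jdt.internal.jarinjarloader` frames in the stack trace suggest the job was exported as a runnable fat jar. A common cause of this exact error in fat jars is that hadoop-common and hadoop-hdfs each ship a `META-INF/services/org.apache.hadoop.fs.FileSystem` service file, and a naive repackaging keeps only one of them, silently dropping the `hdfs` scheme registration. If you build the jar with the Maven shade plugin, the `ServicesResourceTransformer` concatenates those service files instead of overwriting them. A minimal sketch of that configuration (assuming an otherwise standard shade-plugin setup):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <transformers>
      <!-- Merge META-INF/services files rather than letting one jar's copy
           overwrite another's, so the hdfs:// FileSystem mapping from
           hadoop-hdfs survives in the fat jar. -->
      <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
    </transformers>
  </configuration>
</plugin>
```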

When I did this in Clojure with Leiningen, I added the following to my project.clj file:

(defproject
  ; ...
  :dependencies [[org.apache.hadoop/hadoop-client "VERSION"]
                 ; ...
                 ])

(Your version will, of course, depend on what is installed on your system. At the time of writing, the only released version in the 2.x series is 2.2.0.)
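If changing the build or classpath is not an option, a commonly cited workaround is to register the filesystem implementations explicitly in the Hadoop configuration, bypassing the service-file lookup that fails here. A sketch of the relevant core-site.xml properties (the same keys can be set programmatically on the job's `Configuration`; `DistributedFileSystem` lives in hadoop-hdfs, so that jar must still be present at runtime):

```xml
<!-- core-site.xml: map URI schemes to FileSystem classes explicitly -->
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>
<property>
  <name>fs.file.impl</name>
  <value>org.apache.hadoop.fs.LocalFileSystem</value>
</property>
```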

Regarding the "No FileSystem for scheme: hdfs" IOException in the hadoop 2.2.0 wordcount example, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/20355176/
