
scala - Accessing HDFS with Spark fails


I am using Cloudera 4.2.0 with Spark.

I just want to try out a few of the examples that ship with Spark.

// HdfsTest.scala
package spark.examples

import spark._

object HdfsTest {
  def main(args: Array[String]) {
    // args(0) is the Spark master URL.
    val sc = new SparkContext(args(0), "HdfsTest",
      System.getenv("SPARK_HOME"), Seq(System.getenv("SPARK_EXAMPLES_JAR")))

    val file = sc.textFile("hdfs://n1.example.com/user/cloudera/data/navi_test.csv")
    val mapped = file.map(s => s.length).cache()
    for (iter <- 1 to 10) {
      val start = System.currentTimeMillis()
      // Force evaluation of the cached RDD without collecting it.
      for (x <- mapped) { x + 2 }
      // println("Processing: " + x)
      val end = System.currentTimeMillis()
      println("Iteration " + iter + " took " + (end - start) + " ms")
    }
    System.exit(0)
  }
}

It compiles fine, but at runtime it always fails with:

Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.HftpFileSystem could not be instantiated: java.lang.IllegalAccessError: tried to access method org.apache.hadoop.fs.DelegationTokenRenewer.<init>(Ljava/lang/Class;)V from class org.apache.hadoop.hdfs.HftpFileSystem
    at java.util.ServiceLoader.fail(ServiceLoader.java:224)
    at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2229)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2240)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2257)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:86)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2296)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2278)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:316)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:162)
    at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:587)
    at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:315)
    at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:288)
    at spark.SparkContext.hadoopFile(SparkContext.scala:263)
    at spark.SparkContext.textFile(SparkContext.scala:235)
    at spark.examples.HdfsTest$.main(HdfsTest.scala:9)
    at spark.examples.HdfsTest.main(HdfsTest.scala)
Caused by: java.lang.IllegalAccessError: tried to access method org.apache.hadoop.fs.DelegationTokenRenewer.<init>(Ljava/lang/Class;)V from class org.apache.hadoop.hdfs.HftpFileSystem
    at org.apache.hadoop.hdfs.HftpFileSystem.<clinit>(HftpFileSystem.java:84)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:532)
    at java.lang.Class.newInstance0(Class.java:374)
    at java.lang.Class.newInstance(Class.java:327)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
    ... 16 more

I have searched on Google but found nothing about this kind of exception with Spark and HDFS.

val file = sc.textFile("hdfs://n1.example.com/user/cloudera/data/navi_test.csv") is where the problem occurs.

13/04/04 12:20:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I also get this warning. Maybe I should add some Hadoop paths to the CLASS_PATH.

Any clue would be appreciated. =)

Thanks, everyone.

任浩

Best Answer

(This question was also asked and answered on the spark-users mailing list.)

You need to compile Spark against the specific version of Hadoop/HDFS that runs on your cluster. From the Spark documentation:

Spark uses the Hadoop core library to talk to HDFS and other Hadoop-supported storage systems. Because the HDFS protocol has changed in different versions of Hadoop, you must build Spark against the same version that your cluster runs. You can change the version by setting the HADOOP_VERSION variable at the top of project/SparkBuild.scala, then rebuilding Spark (sbt/sbt clean compile).
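
Concretely, for a Cloudera 4.2.0 cluster that means editing the HADOOP_VERSION constant near the top of project/SparkBuild.scala before rebuilding. A minimal sketch, not the verbatim file; the exact CDH version string is an assumption, so verify it with the output of hadoop version on your cluster:

// project/SparkBuild.scala -- near the top of the file (sketch, not verbatim).
// For CDH 4.2.0 with MR1 the artifact version is typically
// "2.0.0-mr1-cdh4.2.0"; confirm against your cluster before building.
val HADOOP_VERSION = "2.0.0-mr1-cdh4.2.0"

After changing it, rebuild with sbt/sbt clean compile and re-run the example against the freshly built jars.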

The spark-users mailing list archive contains several questions about compiling against specific Hadoop versions, so if you run into any problems while building Spark, I would search there first.
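
A quick way to confirm the mismatch is to print the Hadoop version that Spark actually loads and compare it with what hadoop version reports on the cluster. A minimal sketch; HadoopVersionCheck is a hypothetical helper, not part of the Spark examples:

// Hypothetical diagnostic: org.apache.hadoop.util.VersionInfo is a
// standard Hadoop API. If this prints something other than the cluster's
// version (2.0.0-cdh4.2.0 here), the IllegalAccessError above is the
// expected symptom of the version mismatch.
import org.apache.hadoop.util.VersionInfo

object HadoopVersionCheck {
  def main(args: Array[String]) {
    println("Hadoop on classpath: " + VersionInfo.getVersion)
  }
}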

On "scala - Accessing HDFS with Spark fails", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/15808322/
