
java - NoClassDefFoundError with Spark


I am trying to run an extremely simple Spark context instance from IntelliJ, but I am running into a NoClassDefFoundError:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:609)
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:473)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:71)
at org.apache.spark.SparkConf.setAppName(SparkConf.scala:86)
at test$.main(test.scala:8)
at test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 12 more
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/01/27 00:24:37 INFO Utils: Shutdown hook called

I have tried matching everything, from the SBT version to the Scala version. Here is my configuration:

[screenshots of the IntelliJ project / SBT configuration]

This is the code I am trying to run:

import org.apache.spark.{SparkConf, SparkContext}

object test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("test")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc)
  }
}

Is there something else about Scala and Spark version incompatibility that I am missing?

Best answer

Explicitly add the Scala library and scala-xml dependencies:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.4</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-xml</artifactId>
    <version>2.11.0-M4</version>
</dependency>
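
For context: a NoClassDefFoundError on scala.Product$class almost always indicates a Scala binary-version mismatch. Trait implementation classes such as Product$class exist in Scala 2.11 but were removed in Scala 2.12, so Spark artifacts built for 2.11 fail when a different scala-library ends up on the classpath. If you build with SBT rather than Maven, a minimal sketch of a consistent build.sbt would look like the following; the project name and the Spark 2.1.0 version here are assumptions, adjust them to your setup:

// build.sbt -- minimal sketch (hypothetical project name, assumed Spark 2.1.0)
name := "spark-test"

version := "0.1"

// Spark 2.1.x is compiled against Scala 2.11, so the Scala version must match
scalaVersion := "2.11.8"

// %% appends the Scala binary suffix (_2.11), keeping the artifact in sync with scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"

The same rule applies to the Maven snippet above: the Scala suffix on the Spark artifactId (e.g. spark-core_2.11) must agree with the scala-library version you declare.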

Regarding java - NoClassDefFoundError with Spark, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41887530/
