
java.lang.NoClassDefFoundError when using Spark with Maven


I have a Maven project in which I use the following Spark dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-graphx_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-yarn_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-shuffle_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-flume_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_2.11</artifactId>
        <version>1.3.0</version>
    </dependency>
</dependencies>

The Spark version (${spark.version}) is 2.4.4.
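For reference, the ${spark.version} property is presumably defined elsewhere in the same pom; a minimal sketch of that definition, assuming it lives in the pom's properties section, would be:

<properties>
    <!-- Hypothetical property definition matching the version stated above -->
    <spark.version>2.4.4</spark.version>
</properties>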

Now I run the following code:

SparkSession spark = SparkSession.builder()
        .master("local[*]")
        .config("spark.sql.warehouse.dir", "/tmp/spark")
        .appName("SurvivalPredictionMLP")
        .getOrCreate();
// Reads the training set
Dataset<Row> df = spark.sqlContext()
        .read()
        .format("com.databricks.spark.csv")
        .option("header", true)
        .option("inferSchema", true)
        .load("data/train.csv");
// Show the first rows
df.show();

But I get the following exception at the getOrCreate() line:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/09/22 14:18:06 INFO SparkContext: Running Spark version 2.4.4
19/09/22 14:18:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/09/22 14:18:07 INFO SparkContext: Submitted application: SurvivalPredictionMLP
19/09/22 14:18:07 INFO SecurityManager: Changing view acls to: pro
19/09/22 14:18:07 INFO SecurityManager: Changing modify acls to: pro
19/09/22 14:18:07 INFO SecurityManager: Changing view acls groups to:
19/09/22 14:18:07 INFO SecurityManager: Changing modify acls groups to:
19/09/22 14:18:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(pro); groups with view permissions: Set(); users with modify permissions: Set(pro); groups with modify permissions: Set()
Exception in thread "main" java.lang.NoClassDefFoundError: io/netty/channel/Channel
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:59)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at com.jdlp.projects.titanic.App.<init>(App.java:18)
at com.jdlp.projects.titanic.App.main(App.java:33)
Caused by: java.lang.ClassNotFoundException: io.netty.channel.Channel
at java.net.URLClassLoader$1.run(URLClassLoader.java:371)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
Caused by: java.util.zip.ZipException: invalid LOC header (bad signature)
at java.util.zip.ZipFile.read(Native Method)
at java.util.zip.ZipFile.access$1400(ZipFile.java:60)
at java.util.zip.ZipFile$ZipFileInputStream.read(ZipFile.java:734)
at java.util.zip.ZipFile$ZipFileInflaterInputStream.fill(ZipFile.java:434)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
at java.util.jar.Manifest$FastInputStream.fill(Manifest.java:476)
at java.util.jar.Manifest$FastInputStream.readLine(Manifest.java:410)
at java.util.jar.Manifest$FastInputStream.readLine(Manifest.java:444)
at java.util.jar.Attributes.read(Attributes.java:376)
at java.util.jar.Manifest.read(Manifest.java:234)
at java.util.jar.Manifest.<init>(Manifest.java:81)
at java.util.jar.Manifest.<init>(Manifest.java:73)
at java.util.jar.JarFile.getManifestFromReference(JarFile.java:199)
at java.util.jar.JarFile.getManifest(JarFile.java:180)
at sun.misc.URLClassPath$JarLoader$2.getManifest(URLClassPath.java:992)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:451)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
... 20 more

When I google these exceptions, the suggestions involve changing files on disk and the like, but since I am using Maven, I can't (and shouldn't) change anything by hand.

Is there any way to fix this error?

Thanks!

Best Answer

It looks like your pom file pulls in Spark artifacts built for Scala 2.11 (the _2.11 suffix), while you are running the program against Spark 2.4.4. I have seen strange errors when the versions declared in the pom do not match the versions installed on the machine.
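As a hedged illustration of that advice (not the answerer's literal fix): Spark 2.4.x is published for both Scala 2.11 and Scala 2.12, so one way to keep the _2.11/_2.12 suffix and the Spark version consistent across all eight dependencies is to centralize both as Maven properties and reference them everywhere. The scala.binary.version property name below is an assumption chosen for illustration; pick the Scala build that matches the locally installed Spark 2.4.4:

<properties>
    <!-- Assumed property names; align these with the Spark build on the machine -->
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.4.4</spark.version>
</properties>

<!-- Each Spark dependency then reuses the same two properties, e.g.: -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>

With this layout, changing either the Spark version or the Scala build requires editing only the properties block, which avoids the partial upgrades that tend to produce NoClassDefFoundError at startup.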

Regarding java.lang.NoClassDefFoundError when using Spark with Maven, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/58049245/
