
scala - Why does a Spark application fail with java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig even though the jar exists?

Reposted. Author: 行者123. Updated: 2023-12-04 17:19:27

I am using a Hadoop cluster that has Spark 2.3.x. For my use case I need Spark 2.4.x, so I downloaded it, moved it to my server, and extracted it into a new directory: ~/john/spark247ext/spark-2.4.7-bin-hadoop2.7

This is what my Spark 2.4.7 directory looks like:

username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7] {173} $ ls
bin conf data examples jars kubernetes LICENSE licenses NOTICE python R README.md RELEASE sbin yarn

These are the contents of my bin directory:

username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7/bin] {175} $ ls
beeline find-spark-home.cmd pyspark2.cmd spark-class sparkR2.cmd spark-shell.cmd spark-submit
beeline.cmd load-spark-env.cmd pyspark.cmd spark-class2.cmd sparkR.cmd spark-sql spark-submit2.cmd
docker-image-tool.sh load-spark-env.sh run-example spark-class.cmd spark-shell spark-sql2.cmd spark-submit.cmd
find-spark-home pyspark run-example.cmd sparkR spark-shell2.cmd spark-sql.cmd

I am submitting my Spark job with the following spark-submit command:

./spark-submit --master yarn --deploy-mode cluster --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 --class com.loader /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar somearg1 somearg2 somearg3

The job fails with the exception java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig, so I added that jar to my spark-submit command as shown below:

./spark-submit --master yarn --deploy-mode cluster --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar,/home/john/jars/jersey-client-1.19.4.jar --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 --class com.loader /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar somearg1 somearg2 somearg3

I also checked the directory /john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars and found that the jar jersey-client-x.xx.x.jar exists there:

username@:[~/john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars] {179} $ ls -ltr | grep jersey
-rwxrwxrwx 1 john john 951701 Sep 8 2020 jersey-server-2.22.2.jar
-rwxrwxrwx 1 john john 72733 Sep 8 2020 jersey-media-jaxb-2.22.2.jar
-rwxrwxrwx 1 john john 971310 Sep 8 2020 jersey-guava-2.22.2.jar
-rwxrwxrwx 1 john john 66270 Sep 8 2020 jersey-container-servlet-core-2.22.2.jar
-rwxrwxrwx 1 john john 18098 Sep 8 2020 jersey-container-servlet-2.22.2.jar
-rwxrwxrwx 1 john john 698375 Sep 8 2020 jersey-common-2.22.2.jar
-rwxrwxrwx 1 john john 167421 Sep 8 2020 jersey-client-2.22.2.jar

I also added the dependency to my pom.xml file:

<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-client</artifactId>
    <version>1.19.4</version>
</dependency>

Even after providing the jar file in my spark-submit command and building a fat jar with all dependencies from my Maven project, I still see the exception:

Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:161)
at org.apache.spark.deploy.yarn.Client.run(Client.scala:1135)
at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1530)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
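Notice where the stack trace fails: inside org.apache.spark.deploy.yarn.Client, i.e., in the spark-submit launcher JVM itself, before the application jar ever runs. Jars passed via --jars and classes bundled into the fat jar become visible to the driver and executors, but the launcher's own classpath is built from $SPARK_HOME/jars. One possible workaround (an assumption inferred from the trace, not something stated in the question) is to make the Jersey 1.x client jar visible to that launcher classpath by placing it alongside Spark's own jars:

```shell
# Sketch of a possible workaround, assuming the paths shown earlier:
# put the Jersey 1.x client jar where the spark-submit launcher JVM
# builds its classpath (the distribution's jars/ directory).
cp /home/john/jars/jersey-client-1.19.4.jar \
   ~/john/spark247ext/spark-2.4.7-bin-hadoop2.7/jars/
```

Other Jersey 1.x artifacts (e.g. jersey-core) may be needed as well, depending on what the YARN client touches at runtime.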

I downloaded this Spark distribution for my own use case, so I have not changed any settings of the project's existing Spark version (Spark 2.3).

Can anyone tell me what I should do to fix this so the code runs correctly?

Best Answer

Can you try adding this property to your spark-submit:

   --conf "spark.driver.userClassPathFirst=true"
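Applied to the command from the question, that would look like the following. This is just the original command with the suggested --conf added; note that spark.driver.userClassPathFirst makes jars supplied by the user take precedence over Spark's own jars on the driver classpath (a related spark.executor.userClassPathFirst setting exists for executors):

```shell
./spark-submit --master yarn --deploy-mode cluster \
  --conf "spark.driver.userClassPathFirst=true" \
  --driver-class-path /home/john/jars/mssql-jdbc-9.2.0.jre8.jar \
  --jars /home/john/jars/spark-bigquery-with-dependencies_2.11-0.19.1.jar,/home/john/jars/mssql-jdbc-9.2.0.jre8.jar,/home/john/jars/jersey-client-1.19.4.jar \
  --driver-memory 1g --executor-memory 4g --executor-cores 4 --num-executors 4 \
  --class com.loader \
  /home/john/jars/HiveLoader-1.0-SNAPSHOT-jar-with-dependencies.jar somearg1 somearg2 somearg3
```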

I think you have a jar conflict, i.e., a different version of the same jar is being picked up from the environment.

Regarding "scala - Why does a Spark application fail with java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig even though the jar exists?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/67126862/
