
scala - Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;


I am running a word count program in Spark, but I am getting the below error.
I have already added scala-xml_2.11-1.0.2.jar.

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/12/16 05:14:02 INFO SparkContext: Running Spark version 2.0.2
16/12/16 05:14:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/16 05:14:03 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.59.132 instead (on interface ens33)
16/12/16 05:14:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/12/16 05:14:04 INFO SecurityManager: Changing view acls to: hadoopusr
16/12/16 05:14:04 INFO SecurityManager: Changing modify acls to: hadoopusr
16/12/16 05:14:04 INFO SecurityManager: Changing view acls groups to:
16/12/16 05:14:04 INFO SecurityManager: Changing modify acls groups to:
16/12/16 05:14:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoopusr); groups with view permissions: Set(); users with modify permissions: Set(hadoopusr); groups with modify permissions: Set()
16/12/16 05:14:05 INFO Utils: Successfully started service 'sparkDriver' on port 40559.
16/12/16 05:14:05 INFO SparkEnv: Registering MapOutputTracker
16/12/16 05:14:05 INFO SparkEnv: Registering BlockManagerMaster
16/12/16 05:14:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b830180-ae51-451f-9673-4f98dbaff520
16/12/16 05:14:05 INFO MemoryStore: MemoryStore started with capacity 433.6 MB
16/12/16 05:14:05 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:219)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:161)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:440)
at LearnScala.WordCount$.main(WordCount.scala:15)
at LearnScala.WordCount.main(WordCount.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
16/12/16 05:14:05 INFO DiskBlockManager: Shutdown hook called
16/12/16 05:14:05 INFO ShutdownHookManager: Shutdown hook called
16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba/userFiles-3656d5f8-25ba-45c4-b2f6-9f654a049bb1
16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba

I am using the following versions.
build.sbt:
name := "SparkApps"

version := "1.0"

scalaVersion := "2.11.5"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10
libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "2.0.2"
// https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "2.0.2"
// https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.11
libraryDependencies += "org.apache.spark" % "spark-yarn_2.10" % "2.0.2"

Spark version: 2.0.2

Best Answer

I am running a word count program in Spark, but I am getting the below error. I have added scala-xml_2.11-1.0.2.jar.

And later we can see:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"

Choose one ;) Scala 2.10 or Scala 2.11. Either change the Scala-XML version to 2.10, or change Spark to 2.11. Since Spark 2.0, Scala 2.11 is the recommended version.

You can easily get each artifact for the proper Scala version by using %% in build.sbt:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"

Secondly, there is no Scala-XML dependency declared in build.sbt; you should add it.
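
For reference, a build.sbt that keeps everything on Scala 2.11 might look like the sketch below. The scala-xml version 1.0.2 is taken from the jar mentioned in the question, and the explicit scala-xml entry is only needed if your own code uses it directly (Spark normally pulls it in transitively):

name := "SparkApps"

version := "1.0"

scalaVersion := "2.11.5"

// %% appends the Scala binary version (_2.11 here) to each artifact name,
// keeping every dependency consistent with scalaVersion
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.2"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.0.2"
libraryDependencies += "org.apache.spark" %% "spark-yarn" % "2.0.2"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.2"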

Finally, you must add all third-party jars to spark-submit via the --jars option, or build an uber jar; see this.
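
As an illustration, a spark-submit invocation with --jars might look like the sketch below; the class name comes from the stack trace above, while the jar paths are assumptions based on sbt's default output layout for this project:

spark-submit \
  --class LearnScala.WordCount \
  --jars /path/to/third-party-lib.jar \
  target/scala-2.11/sparkapps_2.11-1.0.jar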

Regarding scala - Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41185855/
