
apache-spark - ClassNotFoundException when submitting a JAR to Spark via spark-submit

Reposted · Author: 行者123 · Updated: 2023-12-01 03:23:49

I am struggling to submit a JAR to Apache Spark using spark-submit.

To make things easier, I tried following this blog post. The code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object SimpleScalaSpark {
  def main(args: Array[String]) {
    val logFile = "/Users/toddmcgrath/Development/spark-1.6.1-bin-hadoop2.4/README.md" // I've replaced this with the path to an existing file
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

I am building it with IntelliJ IDEA 2017.1 and running it on Spark 2.1.0. When I run it inside the IDE, everything works fine.

I then built it into a JAR and tried to launch it with spark-submit as follows:
./spark-submit --class SimpleScalaSpark --master local[*] ~/Documents/Spark/Scala/supersimple/out/artifacts/supersimple_jar/supersimple.jar
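For reference, a build definition along these lines is typical when packaging a Spark application as a JAR. This is only a hypothetical build.sbt sketch (the project name, version, and layout are assumptions, not taken from the question); the key detail is marking Spark as "provided", since spark-submit supplies Spark at runtime and bundling a second copy into the JAR is a frequent source of launch problems:

// Hypothetical build.sbt for a project like the one above; names and
// versions are assumptions. Spark 2.1.0 was built against Scala 2.11.
name := "supersimple"
version := "0.1"
scalaVersion := "2.11.8"

// "provided" keeps Spark out of the packaged JAR; spark-submit adds it
// to the classpath when the application is launched.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"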

This results in the following error:
java.lang.ClassNotFoundException: SimpleScalaSpark
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:229)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:695)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I have no idea what I am missing... especially given that it runs as expected in the IDE.

Best Answer

Based on your description above, you are not passing the correct class name, so the class cannot be found.

Just replace SimpleSparkScala with SimpleScalaSpark.

Try running this command:

./spark-submit --class SimpleScalaSpark --master local[*] ~/Documents/Spark/Scala/supersimple/out/artifacts/supersimple_jar/supersimple.jar
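If the error persists after fixing the name, it helps to check exactly what class name was compiled into the JAR, because --class must be the fully qualified name, including any package the IDE added. A JAR is just a ZIP archive, so you can list its entries. The sketch below builds a stand-in archive in memory to show the idea (the com/example package path is a made-up example, not from the question); with the real artifact you would open supersimple.jar with zipfile.ZipFile(path) instead:

```python
# Sketch: derive the --class value from a JAR's entries. The in-memory
# archive is a stand-in for supersimple.jar.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    # Hypothetical layout: the compiled class sits under a package directory.
    jar.writestr("com/example/SimpleScalaSpark.class", b"")

with zipfile.ZipFile(buf) as jar:
    for entry in jar.namelist():
        if entry.endswith(".class"):
            # Entry path -> fully qualified class name for --class.
            print(entry[: -len(".class")].replace("/", "."))  # com.example.SimpleScalaSpark
```

If the listing shows the class under a package path, pass the dotted form (e.g. --class com.example.SimpleScalaSpark); only if SimpleScalaSpark.class sits at the root of the JAR is a bare --class SimpleScalaSpark correct.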

Regarding "apache-spark - ClassNotFoundException when submitting a JAR to Spark via spark-submit", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43161062/
