scala - Can't create SparkSession for Scala without errors in IntelliJ

I am trying to create a SparkSession so that I can use implicits._, but I get an error when running a simple application.

My build.sbt file looks like this:

name := "Reddit-Data-Analyser"

version := "0.1"

scalaVersion := "2.11.12"

fork := true

libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "2.4.0"

resolvers += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0"
)

I get unresolved-dependency errors on spark-sql, but the SparkSession class still seems to load.
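
One common way to keep the two Spark artifacts in lock-step in sbt (not from the original post; the sparkVersion value and the HTTPS resolver URL are my own additions) is a shared version value, roughly:

// Hypothetical build.sbt fragment: pin every Spark module to the same version
val sparkVersion = "2.3.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)

// Maven Central no longer serves plain HTTP, so newer sbt versions reject the http:// resolver used above
resolvers += "MavenRepository" at "https://repo1.maven.org/maven2"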

My Main.scala looks like this:
import org.apache.spark.sql.SparkSession

object main extends App {

  val spark = SparkSession
    .builder()
    .config("spark.master", "local")
    //.config("spark.network.timeout", "10000s") //Not Relevant
    //.config("spark.executor.heartbeatInterval", "5000s") //Not Relevant
    .getOrCreate()

  println("Hello World")

  spark.stop()

}
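
For the implicits._ goal mentioned at the top, the import normally comes from the session instance after getOrCreate, not from a package; a minimal sketch (the Dataset example is illustrative, not taken from the original post):

import org.apache.spark.sql.SparkSession

object ImplicitsExample extends App {

  val spark = SparkSession
    .builder()
    .appName("implicits-example")
    .master("local[*]")
    .getOrCreate()

  // The encoders live on this concrete session instance, so the import has to come after it exists
  import spark.implicits._

  // Illustrative only: turn a local Seq into a Dataset using the imported encoders
  val ds = Seq(1, 2, 3).toDS()
  ds.show()

  spark.stop()
}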

*Edit: I actually did get the SparkSession to run by invalidating caches and restarting (although I had already done that several times, so I am not sure what changed). Now, when I execute ~run in the sbt console, I get [error] messages; I posted about that here: SparkSession logging to console with [error] logs
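
Not part of the original post: if the leftover problem is only that Spark's own INFO output is noisy under sbt, one common mitigation is to raise the log threshold once the session exists, for example:

// Assumed snippet: reduces Spark's own log volume; it does not change how sbt tags output from a forked JVM
spark.sparkContext.setLogLevel("WARN")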

Below is my old error message:
println does not execute; instead, I first get the following ERROR output:
[error] (run-main-7) java.lang.AbstractMethodError
java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.sql.internal.SharedState.initializeLogIfNecessary(SharedState.scala:42)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.sql.internal.SharedState.log(SharedState.scala:42)
at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
at org.apache.spark.sql.internal.SharedState.logInfo(SharedState.scala:42)
at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:71)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:112)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:111)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:284)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938)
at controller.main$.delayedEndpoint$controller$main$1(Main.scala:20)
at controller.main$delayedInit$body.apply(Main.scala:11)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at controller.main$.main(Main.scala:11)
at controller.main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 9 s, completed Mar 14, 2019 9:43:29 PM
8. Waiting for source changes... (press enter to interrupt)
19/03/14 21:43:29 INFO AsyncEventQueue: Stopping listener queue executorManagement.
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
19/03/14 21:43:29 INFO AsyncEventQueue: Stopping listener queue appStatus.
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2048)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:94)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:83)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:79)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:78)
19/03/14 21:43:29 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.lang.Object.wait(Native Method)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)

Best answer

Have you tried something like this:

import org.apache.spark.sql.SparkSession

object main extends App {

  val spark = SparkSession
    .builder()
    .appName("myApp")
    .master("local[*]")   // set the master through the builder; a bare "master" config key is not read by Spark
    .getOrCreate()

  println("Hello World")
  println(spark.version)  // version is a parameterless member, so no parentheses

  spark.stop()
}
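
Setting the master through the builder's master(...) method (or via the spark.master config key, as in the question) is what getOrCreate actually reads when it builds the underlying SparkContext; local[*] uses all local cores, while plain "local" runs on a single thread.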

Regarding "scala - Can't create SparkSession for Scala without errors in IntelliJ", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/55174457/
