
Scala Spark version compatibility


I am trying to configure Scala in the IntelliJ IDE.

Scala and Spark versions on my machine:

Welcome to Scala 2.12.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121).

apache-spark/2.2.1

The sbt build file:

scalaVersion := "2.12.5"
resolvers += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
  Seq("org.apache.spark" %% "spark-core" % sparkVersion)
}

I get the following error:

Error: Error while importing SBT project:
...
[info] Resolving jline#jline;2.14.5 ...
[error] (*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.2.1: not found
[error] unresolved dependency: org.apache.spark#spark-core_2.12;1.4.0: not found

Best answer

The spark-core version you defined in your sbt project is not published for the Scala version you are using, so there is nothing to download. You can check the maven dependency listing for the versions that are actually available.

As you can see there, for spark-core 2.2.1 the latest artifacts available for download are compiled against Scala 2.11 (info here).

So either you change your sbt build file to

scalaVersion := "2.11.8"
resolvers += "MavenRepository" at "http://central.maven.org/maven2"

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
  Seq("org.apache.spark" %% "spark-core" % sparkVersion)
}

or you declare the dependency with the Scala binary version written out explicitly:

libraryDependencies ++= {
  val sparkVersion = "2.2.1"
  Seq("org.apache.spark" % "spark-core_2.11" % sparkVersion)
}

Hope this answer helps.

Regarding Scala Spark version compatibility, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49422340/
