scala - Warnings when building a Scala/Spark project with SBT

I am trying to build a Scala/Spark project in IntelliJ IDEA with the following build.sbt:

name := "try"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "2.2.0"

resolvers ++= Seq(
  "apache-snapshots" at "http://repository.apache.org/snapshots/"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion
)

and am getting a bunch of warnings:
8/6/17 1:29 PM  SBT project import
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn] * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
[warn] +- org.apache.spark:spark-core_2.11:2.2.0 (depends on 3.9.9.Final)
[warn] +- org.apache.zookeeper:zookeeper:3.4.6 (depends on 3.6.2.Final)
[warn] +- org.apache.hadoop:hadoop-hdfs:2.6.5 (depends on 3.6.2.Final)
[warn] * commons-net:commons-net:2.2 is selected over 3.1
[warn] +- org.apache.spark:spark-core_2.11:2.2.0 (depends on 2.2)
[warn] +- org.apache.hadoop:hadoop-common:2.6.5 (depends on 3.1)
[warn] * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
[warn] +- org.apache.hadoop:hadoop-yarn-client:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-api:2.6.5 (depends on 11.0.2)
[warn] +- org.apache.hadoop:hadoop-yarn-common:2.6.5

I have a couple of questions, perhaps silly ones:
  • Is there a better way to structure build.sbt (e.g. by adding other resolvers?) so that I can get rid of the warnings?
  • Should I care about the warnings at all?

Best answer

    Is there a better way to structure build.sbt (add other resolvers e.g.?), so that I can get rid off the warnings?



    One way is to manually tell sbt which versions of the conflicting dependencies you prefer:
    dependencyOverrides ++= Set(
      "io.netty" % "netty" % "3.9.9.Final",
      "commons-net" % "commons-net" % "2.2",
      "com.google.guava" % "guava" % "11.0.2"
    )

    I would also suggest reading about conflict management in sbt.
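
    sbt's conflict resolution is controlled by the conflictManager setting; the default ("latest revision wins") is what produces the eviction warnings above. As a minimal sketch (assuming sbt 0.13.x, which matches the build in the question), you can instead make unresolved version conflicts fail the build, forcing each one to be settled explicitly, e.g. via dependencyOverrides as shown above:

    // build.sbt (sketch): fail the build on version conflicts instead of
    // silently keeping the latest revision of each conflicting library.
    conflictManager := ConflictManager.strict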

    Should I care about warnings at all?



    In your case - no, since your conflicts stem from using only Spark-related artifacts released under the same version. Spark is a project with a large user base, and the chance of jar hell being introduced through transitive dependencies is rather low (although technically it is not guaranteed).

    In the general case - maybe. Typically it is fine in most cases, but there is a small chance of a problem that may require careful manual dependency resolution (if that is possible at all). In those cases it is really hard to tell whether anything is wrong until you run the application and hit an issue such as a missing class, a missing method, a mismatched method signature, or some reflection-related problem.

    Regarding "scala - Warnings when building a Scala/Spark project with SBT", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45531198/
