
scala - Out of memory error when building Spark

Reposted. Author: 行者123. Updated: 2023-12-01 09:37:10

I am building Spark with sbt. When I run the following command:

sbt/sbt assembly

Building Spark takes some time. Several warnings appear, and at the end I get the following error:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.

When I check the sbt version with the command sbt sbtVersion, I get the following output:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7

When I run the command ./bin/spark-shell, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.

What is the solution?

Best Answer

You have to configure the SBT heap size:

  • on Linux, type export SBT_OPTS="-Xmx2G" to set it temporarily
  • on Linux, you can edit ~/.bash_profile and add the line export SBT_OPTS="-Xmx2G" to make it permanent
  • on Windows, type set JAVA_OPTS=-Xmx2G to set it temporarily
  • on Windows, you can edit sbt\conf\sbtconfig.txt and set -Xmx2G
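The Linux steps above can be sketched as a short shell session. This is a minimal sketch, assuming the standard sbt launcher script, which reads extra JVM flags from the SBT_OPTS environment variable; the final build command is shown commented out because it is long-running.

```shell
# Temporary: applies only to the current shell session.
# -Xmx2G raises the JVM maximum heap to 2 GB for sbt.
export SBT_OPTS="-Xmx2G"

# Verify the setting before kicking off the long build.
echo "SBT_OPTS is: $SBT_OPTS"

# Then rebuild from the Spark source directory:
# sbt/sbt assembly
```

To make the setting permanent instead, append the same export line to ~/.bash_profile as described above.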

  • More info:

    http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html

    How to set heap size for sbt?

    Regarding "scala - Out of memory error when building Spark", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37893693/
