
java - Submitting a Spark application on YARN from the Eclipse IDE

Reposted · Author: 可可西里 · Updated: 2023-11-01 15:11:07

I run into a problem when I try to submit my Spark application to YARN from Eclipse. I am submitting a simple SVM program, but I get the error below. I am on a MacBook, and I would appreciate a detailed answer.

16/09/17 10:04:19 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Library directory '.../MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500)
at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
at SVM.main(SVM.java:21)

Best answer

Go to

Run Configurations --> Environment

in Eclipse and add the environment variable SPARK_HOME, pointing at your Spark installation. The error occurs because, without SPARK_HOME set, the YARN launcher falls back to the project directory when it looks for Spark's jars and finds no `jars` directory there.
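As a quick sanity check outside Eclipse, you can verify what the launcher will resolve. This is a minimal sketch; the path `/usr/local/spark` is an example placeholder, so substitute the directory where your Spark distribution actually lives:

```shell
# Example location only -- replace with your own Spark installation directory.
export SPARK_HOME=/usr/local/spark

# The YARN client resolves the library directory as $SPARK_HOME/jars;
# the IllegalStateException above means this resolved path did not exist.
echo "$SPARK_HOME/jars"
```

If `ls "$SPARK_HOME/jars"` lists the Spark jars, the same value set under Run Configurations --> Environment should let the Eclipse launch find them.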

Regarding "java - Submitting a Spark application on YARN from the Eclipse IDE", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/39543266/
