
macos - Cannot run pyspark: Failed to find Spark jars directory


I downloaded spark-2.1.0-bin-without-hadoop, and it lives in the following directory:

 ~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop

When I cd into that directory, then into bin, and try to run pyspark, I get the following error:

/usr/local/bin/pyspark: line 24: ~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/load-spark-env.sh: No such file or directory
/Users/ahajibagheri/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/spark-class: line 24: ~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/load-spark-env.sh: No such file or directory
Failed to find Spark jars directory (~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/assembly/target/scala-/jars).
You need to build Spark with the target "package" before running this program.
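The root cause is tilde expansion: the shell only expands ~ when it appears unquoted at the start of a word, so a value like "~/Desktop/..." assigned to SPARK_HOME is stored as a literal string, and Spark's launch scripts then look for a directory whose name literally begins with ~. A minimal sketch of the difference, using the path from the question:

    # Tilde inside double quotes is NOT expanded; the variable holds a literal "~"
    export SPARK_HOME="~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"
    ls "$SPARK_HOME/bin"   # fails: no directory literally named "~"

    # $HOME is expanded even inside double quotes, so the path resolves correctly
    export SPARK_HOME="$HOME/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"
    ls "$SPARK_HOME/bin"   # works: lists pyspark, spark-class, load-spark-env.sh, etc.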

I have already set my JAVA_HOME and SPARK_HOME:

echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
echo $SPARK_HOME
~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop

I am running everything on macOS Sierra 10.12.6. Any help with this issue would be greatly appreciated. If I have left anything out, let me know and I will update the question accordingly.

Thanks

Best Answer

I ran into the same problem. To fix it, I had to define SPARK_HOME without the home-directory shortcut (~). In your case it should look like this:

export SPARK_HOME="/Users/ahajibagheri/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"
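To make the fix survive new terminal sessions, you can put the export in your shell startup file and then re-run pyspark. A sketch, assuming bash (the default login shell on macOS Sierra) and the install path above:

    # Persist SPARK_HOME in ~/.bash_profile, using $HOME instead of ~
    echo 'export SPARK_HOME="$HOME/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"' >> ~/.bash_profile
    source ~/.bash_profile

    # Verify the script that was previously "not found" now resolves, then launch pyspark
    ls "$SPARK_HOME/bin/load-spark-env.sh"
    "$SPARK_HOME/bin/pyspark"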

Regarding macos - Cannot run pyspark: Failed to find Spark jars directory, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/46077821/
