
apache-spark - PySpark - The system cannot find the path specified


Hello,

I have run Spark (from the Spyder IDE) many times before. Today I got this error (the code is the same):

import os
import sys

from py4j.java_gateway import JavaGateway
gateway = JavaGateway()

# Point Spark, Java and winutils to their install locations before importing pyspark
os.environ['SPARK_HOME'] = "C:/Apache/spark-1.6.0"
os.environ['JAVA_HOME'] = "C:/Program Files/Java/jre1.8.0_71"
os.environ['HADOOP_HOME'] = "C:/Apache/spark-1.6.0/winutils/"
sys.path.append("C:/Apache/spark-1.6.0/python/")

from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf()
The system cannot find the path specified.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Apache\spark-1.6.0\python\pyspark\conf.py", line 104, in __init__
SparkContext._ensure_initialized()
File "C:\Apache\spark-1.6.0\python\pyspark\context.py", line 245, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway()
File "C:\Apache\spark-1.6.0\python\pyspark\java_gateway.py", line 94, in launch_gateway
raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
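For context, this exception means the JVM that pyspark's launch_gateway() tries to spawn never started, which usually traces back to one of the environment variables pointing at a directory that no longer exists. A minimal sanity check, assuming the same variable names as in the snippet above, run before constructing SparkConf():

import os

# Verify each Spark-related variable points to an existing directory
# before pyspark tries to launch the Java gateway.
for name in ("JAVA_HOME", "SPARK_HOME", "HADOOP_HOME"):
    path = os.environ.get(name)
    if not path or not os.path.isdir(path):
        raise RuntimeError("%s is unset or not an existing directory: %r" % (name, path))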

What is going wrong? Thank you for your time.

Best answer

OK... someone had installed a new Java version on the virtual machine. I just changed this:

os.environ['JAVA_HOME']="C:/Program Files/Java/jre1.8.0_91" 

and it started working again. Thanks for your time.
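As a hedged follow-up sketch (the directory layout is an assumption, not part of the original answer): instead of hard-coding the versioned JRE folder, the script could pick up whatever matching JRE is currently installed, so the next Java update does not break the path again.

import glob
import os

# Hypothetical layout: JREs installed under C:/Program Files/Java/jre1.8.0_xx.
candidates = sorted(glob.glob("C:/Program Files/Java/jre1.8.0_*"),
                    key=lambda p: int(p.rsplit("_", 1)[-1]))
if not candidates:
    raise RuntimeError("No matching JRE found under C:/Program Files/Java")
os.environ['JAVA_HOME'] = candidates[-1]  # use the highest update number found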

Regarding apache-spark - PySpark - The system cannot find the path specified, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36788384/
