
pip - Running pyspark after pip install pyspark

Reposted · Author: 行者123 · Updated: 2023-12-02 03:32:47

I want to install pyspark on my home computer. I ran

pip install pyspark
pip install jupyter

Both seemed to work fine.

But when I try to run pyspark, I get:

pyspark
Could not find valid SPARK_HOME while searching ['/home/user', '/home/user/.local/bin']

What should SPARK_HOME be set to?

Best Answer

I just ran into the same problem, but it turns out that the Spark distribution downloaded by pip install pyspark works fine in local mode. Pip simply doesn't set an appropriate SPARK_HOME. Once I set it manually, pyspark worked like a charm (without downloading any extra packages):

$ pip3 install --user pyspark
Collecting pyspark
Downloading pyspark-2.3.0.tar.gz (211.9MB)
100% |████████████████████████████████| 211.9MB 9.4kB/s
Collecting py4j==0.10.6 (from pyspark)
Downloading py4j-0.10.6-py2.py3-none-any.whl (189kB)
100% |████████████████████████████████| 194kB 3.9MB/s
Building wheels for collected packages: pyspark
Running setup.py bdist_wheel for pyspark ... done
Stored in directory: /home/mario/.cache/pip/wheels/4f/39/ba/b4cb0280c568ed31b63dcfa0c6275f2ffe225eeff95ba198d6
Successfully built pyspark
Installing collected packages: py4j, pyspark
Successfully installed py4j-0.10.6 pyspark-2.3.0

$ PYSPARK_PYTHON=python3 SPARK_HOME=~/.local/lib/python3.5/site-packages/pyspark pyspark
Python 3.5.2 (default, Nov 23 2017, 16:37:01)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
2018-03-31 14:02:39 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/

Using Python version 3.5.2 (default, Nov 23 2017 16:37:01)
>>>
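Rather than hard-coding the site-packages path as in the transcript above, you can ask the interpreter where the pip-installed pyspark package lives and export that as SPARK_HOME. Below is a minimal sketch; guess_spark_home is a hypothetical helper name (not part of pyspark's API), and it assumes pyspark was installed into the same interpreter you run the script with.

```python
import importlib.util

def guess_spark_home(package="pyspark"):
    """Return the install directory of a pip-installed package, or None.

    A hypothetical helper: for pyspark, the package directory itself can
    serve as SPARK_HOME, since pip bundles the Spark jars inside it.
    """
    spec = importlib.util.find_spec(package)
    # Plain modules and missing packages have no search locations.
    if spec is None or not spec.submodule_search_locations:
        return None
    return spec.submodule_search_locations[0]

home = guess_spark_home()
if home:
    # Paste this line into your shell (or ~/.bashrc) before running pyspark.
    print(f"export SPARK_HOME={home}")
else:
    print("pyspark is not installed in this interpreter")
```

Running it prints an `export SPARK_HOME=...` line pointing at the same kind of path the answer above sets by hand (e.g. `~/.local/lib/python3.5/site-packages/pyspark`).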

For this question about running pyspark after pip install pyspark, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/46286436/
