
python - PySpark 2.4.5 is not compatible with Python 3.8.3; how do I fix this?

Reposted · Author: 行者123 · Updated: 2023-12-03 23:46:52

Code

from pyspark import SparkConf, SparkContext

# Build a local configuration and pass it to SparkContext via the `conf`
# keyword; passed positionally it would be bound to the `master` parameter.
conf = SparkConf().setMaster('local').setAppName('Test App')
sc = SparkContext(conf=conf)

Error message
Traceback (most recent call last):
  File "C:\Users\Test\PycharmProjects\python-test\MainFile.py", line 5, in <module>
    from pyspark import SparkContext,SparkConf
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\__init__.py", line 51, in <module>
    from pyspark.context import SparkContext
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\context.py", line 31, in <module>
    from pyspark import accumulators
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\accumulators.py", line 97, in <module>
    from pyspark.serializers import read_int, PickleSerializer
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\serializers.py", line 72, in <module>
    from pyspark import cloudpickle
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 145, in <module>
    _cell_set_template_code = _make_cell_set_template_code()
  File "C:\Test\Python_3.8.3_Latest\lib\site-packages\pyspark\cloudpickle.py", line 126, in _make_cell_set_template_code
    return types.CodeType(
TypeError: an integer is required (got type bytes)

Best answer

Although the latest Spark documentation says it supports Python 2.7+/3.4+, it does not actually support Python 3.8 yet. The cloudpickle module bundled with PySpark 2.4.x calls types.CodeType() with the pre-3.8 argument order, and Python 3.8 inserted a new posonlyargcount parameter into that constructor, so a bytes object ends up in a slot that expects an integer, which is exactly the TypeError in the traceback. According to this PR, Spark 3.0 is expected to support Python 3.8. So you can either try a Spark 3.0 preview release (assuming you are not targeting a production deployment), or fall back to Python 3.6/3.7 "for the time being" and stay on Spark 2.4.x.
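If you stay on Spark 2.4.x for now, one option is a fail-fast version guard at the top of the script. This is only a minimal sketch, not part of the original answer; the 3.8 cut-off simply mirrors the incompatibility described above, and everything else is plain standard-library code.

import sys

# PySpark 2.4.x bundles a cloudpickle that breaks on Python 3.8's changed
# types.CodeType() signature, so refuse to start under 3.8+ with a clear
# message instead of the opaque "an integer is required (got type bytes)".
if sys.version_info >= (3, 8):
    raise RuntimeError(
        "PySpark 2.4.x needs Python 3.7 or older; this interpreter is "
        + sys.version.split()[0]
        + ". Run the script with Python 3.6/3.7, or move to Spark 3.0."
    )

from pyspark import SparkConf, SparkContext

conf = SparkConf().setMaster('local').setAppName('Test App')
sc = SparkContext(conf=conf)

On Windows the script can then be launched with an older interpreter via the py launcher, for example `py -3.7 MainFile.py`, assuming a Python 3.7 installation exists alongside 3.8.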

Regarding "python - PySpark 2.4.5 is not compatible with Python 3.8.3; how do I fix this?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/62208730/
