
apache-spark - Problem after upgrading Spark: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH

Reposted. Author: 行者123. Updated: 2023-12-01 23:53:06

I downloaded the latest Spark release because it contains a fix for:

ERROR AsyncEventQueue:70 - Dropping event from queue appStatus.

After setting the environment variables and running the same code in PyCharm, I get the error below and cannot find a solution.

Exception in thread "main" java.util.NoSuchElementException: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH
at scala.collection.MapLike$class.default(MapLike.scala:228)
at scala.collection.AbstractMap.default(Map.scala:59)
at scala.collection.MapLike$class.apply(MapLike.scala:141)
at scala.collection.AbstractMap.apply(Map.scala:59)
at org.apache.spark.api.python.PythonGatewayServer$.main(PythonGatewayServer.scala:64)
at org.apache.spark.api.python.PythonGatewayServer.main(PythonGatewayServer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Any help would be appreciated.

Best Answer

I ran into this problem too. Here is what I did; I hope it helps:

1. Find your Spark version; mine is 2.4.3.

2. Find your pyspark version; mine was 2.2.0.

3. Reinstall pyspark so it matches your Spark version:

pip install pyspark==2.4.3
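The three steps above boil down to one check: the pyspark package on the Python side must match the installed Spark version, since the error comes from a driver/gateway protocol mismatch between the two. A minimal sketch of that check (the `versions_match` helper is hypothetical, added here only for illustration):

```python
# Illustrative helper (not from the original answer): the root cause is a
# Spark vs. pyspark version mismatch, so compare the major.minor parts.
def versions_match(spark_version: str, pyspark_version: str) -> bool:
    """Return True when both versions share the same major.minor (e.g. 2.4.x)."""
    return spark_version.split(".")[:2] == pyspark_version.split(".")[:2]

# The asker's combination: Spark 2.4.3 alongside pyspark 2.2.0
print(versions_match("2.4.3", "2.2.0"))  # False -> reinstall pyspark
print(versions_match("2.4.3", "2.4.3"))  # True  -> versions line up
```

You can read the two version strings from `spark-submit --version` (the Spark installation) and `pyspark.__version__` (the pip-installed package) and feed them to a check like this.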

After that, everything worked. I hope this helps.

Regarding "apache-spark - Problem after upgrading Spark: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50869366/
