python-2.7 - Error running PySpark: cannot connect to master

Reposted · Author: 行者123 · Updated: 2023-12-04 02:01:12

Hi, I have the following Python code:

from __future__ import print_function

import sys

from pyspark.sql import SparkSession

from data import Data

if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("Usage: runner <number_of_executors>", file=sys.stderr)
        exit(-1)

    spark = SparkSession \
        .builder \
        .master("spark://138.xxx.xxx.xxx:7077") \
        .config("spark.num-executors", sys.argv[1]) \
        .config("spark.driver.memory", "1g") \
        .config("spark.executor.memory", "1g") \
        .config("spark.executor.cores", "4") \
        .appName("APP") \
        .getOrCreate()

    data = Data(spark)

    spark.stop()

The Data class just loads the locations of various CSV files, but that is not important here.

I added the following lines to my ~/.bash_profile:
export SPARK_HOME=/home/tsar/spark
export PATH=$SPARK_HOME/bin:$SPARK_HOME/sbin:$SPARK_HOME/conf:$PATH
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build
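
When main.py is launched with plain python (rather than spark-submit), the interpreter has to find the PySpark libraries through PYTHONPATH as above. As a hedged alternative sketch, the third-party findspark package can do this lookup at runtime; it assumes findspark is installed and is not part of the original setup:

    # Locate SPARK_HOME and put pyspark/py4j on sys.path at runtime.
    # Assumes `pip install findspark`; it is not part of the original setup.
    import findspark
    findspark.init()  # reads SPARK_HOME from the environment

    from pyspark.sql import SparkSession  # now importable from a plain `python` run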

I also have the following configuration files:
  • slaves, containing the list of worker nodes
  • spark-defaults.conf
    spark.master                       spark://138.xxx.xxx.xxx:7077
    spark.driver.memory 1g
    spark.executor.memory 1g
    spark.executor.cores 4
  • spark-env.sh
    export SPARK_MASTER_HOST=138.xxx.xxx.xxx
    export SPARK_MASTER_MEMORY=5g
    export SPARK_WORKER_MEMORY=1024m
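
Because the same settings appear both in spark-defaults.conf and in the SparkSession builder, it can help to print what the driver actually resolved. A minimal sketch (the app name here is arbitrary, not from the question):

    from __future__ import print_function
    from pyspark.sql import SparkSession

    # Print the effective configuration the driver resolved, to see which of
    # spark-defaults.conf or the builder settings actually won.
    spark = SparkSession.builder.appName("conf-check").getOrCreate()
    for key, value in sorted(spark.sparkContext.getConf().getAll()):
        print(key, "=", value)
    spark.stop()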

Here is what happens next:
  • pyspark --master 138.xxx.xxx.xxx:7077
    starts pyspark and connects it to the master.
  • spark-submit --num-executors 17 main.py 4
    ignores the configuration set inside the Python code (which is unexpected), takes its parameters from the conf files unless they are overridden on the command line, connects to the master, and executes the code.
  • python main.py 3
    the way I actually want to run it; it fails to connect to the master, with the stack trace below.


    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    17/03/26 19:58:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/03/26 19:59:12 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
    17/03/26 19:59:12 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
    17/03/26 19:59:12 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
    17/03/26 19:59:12 ERROR TransportResponseHandler: Still have 3 requests outstanding when connection from /xxx.xxx.xxx.xxx:7077 is closed
    17/03/26 19:59:12 ERROR SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)
    Traceback (most recent call last):
    File "main.py", line 21, in <module>
    .appName("CS5052-01 Processing") \
    File "/cs/home/bbt/spark/python/pyspark/sql/session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
    File "/cs/home/bbt/spark/python/pyspark/context.py", line 307, in getOrCreate
    SparkContext(conf=conf or SparkConf())
    File "/cs/home/bbt/spark/python/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
    File "/cs/home/bbt/spark/python/pyspark/context.py", line 179, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
    File "/cs/home/bbt/spark/python/pyspark/context.py", line 246, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
    File "/cs/home/bbt/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in __call__
    File "/cs/home/bbt/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:224)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)


    How is the last way of running it different from the others? The Spark master definitely exists at that IP address and is reachable.
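
    To isolate whether the failure is environmental or something in main.py itself, one can strip the script down to a bare connection attempt; a minimal sketch (the masked IP is a placeholder copied from the question):

    from __future__ import print_function
    from pyspark.sql import SparkSession

    # Bare connection attempt against the standalone master, with no other
    # configuration. The masked IP is a placeholder copied from the question.
    spark = SparkSession.builder \
        .master("spark://138.xxx.xxx.xxx:7077") \
        .appName("connectivity-check") \
        .getOrCreate()
    print("Connected; Spark version:", spark.version)
    spark.stop()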

    Best Answer

    I had a similar problem: the master and the workers could definitely communicate, but when I went to run a job I hit the same vague error:

    java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

    The usual explanation for this is that the master and workers cannot communicate, but in my case the user running Spark on the slave nodes did not have permission to write to the log files. I discovered this by going to the Spark master web UI (master:8080 by default) and drilling down from the failed application's link into a slave worker's stderr output.
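
    As a quick check along those lines, one can verify on each worker that the user running Spark can write where Spark logs. A minimal sketch; the paths are assumptions for a common layout, so adjust them to your SPARK_HOME:

    import os

    # Check that the current user can write to the directories Spark logs to
    # on a worker. These paths are assumptions for a common layout; adjust to
    # your installation.
    for d in ("/opt/spark/logs", "/opt/spark/work"):
        ok = os.path.isdir(d) and os.access(d, os.W_OK)
        print("%-20s writable by this user: %s" % (d, ok))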

    Are there any more details about the error in stderr or in the Spark logs (/opt/spark/logs/spark-xxx by default)?

    Regarding python-2.7 - Error running PySpark: cannot connect to master, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/43033505/
