
apache-spark - Running Spark on k8s - Error: Missing application resource


I am trying to run the SparkPi example with Spark on k8s.

Working with:

  • kubectl
  • minikube
  • spark-2.4.4-bin-hadoop2.7

Running the following command:
    spark-submit \
      --master k8s://https://192.168.99.100:8443 \
      --deploy-mode cluster \
      --name spark-pi \
      --class org.apache.spark.examples.SparkPi \
      --conf spark.executor.instances=1 \
      --conf spark.kubernetes.container.image=sparkk8s:latest \
      --conf spark.kubernetes.driver.pod.name=sparkpi \
      local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar 10

The following error is raised in the pod logs:
    + env
    + sed 's/[^=]*=\(.*\)/\1/g'
    + sort -t_ -k4 -n
    + grep SPARK_JAVA_OPT_
    + readarray -t SPARK_EXECUTOR_JAVA_OPTS
    + '[' -n '' ']'
    + '[' -n '' ']'
    + PYSPARK_ARGS=
    + '[' -n '' ']'
    + R_ARGS=
    + '[' -n '' ']'
    + '[' '' == 2 ']'
    + '[' '' == 3 ']'
    + case "$SPARK_K8S_CMD" in
    + CMD=("$SPARK_HOME/bin/spark-submit" --conf
    "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
    + exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.17.0.6 --deploy-mode client
    Error: Missing application resource.
    Usage: spark-submit [options] <app jar | python file | R file> [app arguments]
    Usage: spark-submit --kill [submission ID] --master [spark://...]
    Usage: spark-submit --status [submission ID] --master [spark://...]
    Usage: spark-submit run-example [options] example-class [example args]

Initially, I thought the arguments were not being passed, since the exec command shows neither the driver class nor the path to the application jar. But kubectl describe shows the following:
    Name:               sparkpi
    Namespace:          default
    Priority:           0
    PriorityClassName:  <none>
    Node:               minikube/10.0.2.15
    Start Time:         Sun, 15 Sep 2019 13:14:37 +0300
    Labels:             spark-app-selector=spark-7c0293be51924505b91e381df8de2b4f
                        spark-role=driver
    Annotations:        spark-app-name: spark-pi
    Status:             Failed
    IP:                 172.17.0.5
    Containers:
      spark-kubernetes-driver:
        Container ID:  docker://db03f9a45df283848dc3e10c5d3171454b0d47ae25192e54f266e44f58eb7bc8
        Image:         spark2:latest
        Image ID:      docker://sha256:1d574a61cb26558ec38376d045bdf39fa18168d96486b2f921ea57d3d4fb2b48
        Port:          <none>
        Host Port:     <none>
        Args:
          driver
        State:          Terminated
          Reason:       Error
          Exit Code:    1
          Started:      Sun, 15 Sep 2019 13:14:37 +0300
          Finished:     Sun, 15 Sep 2019 13:14:38 +0300
        Ready:          False
        Restart Count:  0
        Limits:
          memory:  1408Mi
        Requests:
          cpu:     1
          memory:  1Gi
        Environment:
          SPARK_DRIVER_MEMORY:        1g
          SPARK_DRIVER_CLASS:         org.apache.spark.examples.SparkPi
          SPARK_DRIVER_ARGS:          10
          SPARK_DRIVER_BIND_ADDRESS:  (v1:status.podIP)
          SPARK_MOUNTED_CLASSPATH:    /opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar:/opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
          SPARK_JAVA_OPT_0:           -Dspark.app.name=spark-pi
          SPARK_JAVA_OPT_1:           -Dspark.app.id=spark-7c0293be51924505b91e381df8de2b4f
          SPARK_JAVA_OPT_2:           -Dspark.submit.deployMode=cluster
          SPARK_JAVA_OPT_3:           -Dspark.driver.blockManager.port=7079
          SPARK_JAVA_OPT_4:           -Dspark.driver.host=spark-pi-b8556ee3d1c33baf8d9feacc1cae7a9d-driver-svc.default.svc
          SPARK_JAVA_OPT_5:           -Dspark.kubernetes.container.image=spark2:latest
          SPARK_JAVA_OPT_6:           -Dspark.executor.instances=1
          SPARK_JAVA_OPT_7:           -Dspark.jars=/opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar,/opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
          SPARK_JAVA_OPT_8:           -Dspark.kubernetes.executor.podNamePrefix=spark-pi-b8556ee3d1c33baf8d9feacc1cae7a9d
          SPARK_JAVA_OPT_9:           -Dspark.kubernetes.driver.pod.name=sparkpi
          SPARK_JAVA_OPT_10:          -Dspark.driver.port=7078
          SPARK_JAVA_OPT_11:          -Dspark.master=k8s://https://192.168.99.100:8443
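
The outputs above come from the standard pod inspection commands. For reference, a minimal sketch of the debugging loop (assuming the driver pod name sparkpi set via spark.kubernetes.driver.pod.name):

    # Inspect the failed driver pod
    kubectl logs sparkpi                     # container stdout/stderr (the log shown earlier)
    kubectl describe pod sparkpi             # spec, environment variables, termination state
    kubectl get pods -l spark-role=driver    # locate driver pods via the label Spark applies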

I also tried running the image with docker, and checked that the jar file exists at the provided path - /opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
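
To reproduce that check, a minimal sketch (the image tag matches the one in the submit command; --entrypoint overrides the image's Spark entrypoint script so a plain shell command can run):

    # List the example jars baked into the image
    docker run --rm --entrypoint ls sparkk8s:latest -l /opt/spark/examples/jars/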

Any suggestions?

Best Answer

Coming back to this now: the fix is annoying. spark-submit should be run from the Spark distribution folder, so instead of using a spark-submit alias, run it as bin/spark-submit:

    bin/spark-submit \
      --master k8s://https://192.168.99.100:8443 \
      --deploy-mode cluster \
      --name spark-pi \
      --class org.apache.spark.examples.SparkPi \
      --conf spark.executor.instances=1 \
      --conf spark.kubernetes.container.image=sparkk8s:latest \
      --conf spark.kubernetes.driver.pod.name=sparkpi \
      local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar 10
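
To check whether a stray alias or a different installation was being picked up before the fix, a quick sketch (paths are illustrative):

    type spark-submit              # reveals whether spark-submit resolves to an alias, function, or file
    echo $SPARK_HOME               # should point at the spark-2.4.4-bin-hadoop2.7 folder, if set
    cd spark-2.4.4-bin-hadoop2.7
    bin/spark-submit --version     # confirms the 2.4.4 distribution is the one actually running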

Regarding apache-spark - Running Spark on k8s - Error: Missing application resource, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/57943913/
