
apache-spark - How to submit a pyspark job in Apache Livy?


spark-submit --packages com.databricks:spark-redshift_2.11:2.0.1 --jars /usr/share/aws/redshift/jdbc/RedshiftJDBC4.jar /home/hadoop/test.py

How do I express the above (pyspark) spark-submit command in Apache Livy's format?

I tried the following:
    curl -X POST \
      --data '{"file": "/home/hadoop/test.py",
               "conf": {"com.databricks": "spark-redshift_2.11:2.0.1"},
               "queue": "my_queue", "name": "Livy Example",
               "jars": "/usr/share/aws/redshift/jdbc/RedshiftJDBC4.jar"}' \
      -H "Content-Type: application/json" localhost:8998/batches

I referred to the following Livy article: spark livy rest api

I also received the following error:

"Unexpected character ('“' (code 8220 / 0x201c)): was expecting double-quote to start field name\n at [Source: (org.eclipse.jetty.server.HttpInputOverHTTP); line: 1, column: 37]"

Best Answer

Your command is malformed. The unexpected character with code 8220 (0x201C) is a typographic left double quote, which means the JSON payload contains curly quotes (usually introduced by copying from a formatted document) where JSON requires straight double quotes. Use the examples below to construct the command.
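Before re-posting, the payload can be checked locally; a minimal sketch using Python's built-in JSON validator (the payload here is only an illustration):

    # A payload containing curly quotes fails this check with the same
    # kind of parse error that Livy reports.
    echo '{"file": "/home/hadoop/test.py"}' | python -m json.tool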

Spark submit command

./bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--jars a.jar,b.jar \
--py-files a.py,b.py \
--files foo.txt,bar.txt \
--archives foo.zip,bar.tar \
--master yarn \
--deploy-mode cluster \
--driver-memory 10G \
--driver-cores 1 \
--executor-memory 20G \
--executor-cores 3 \
--num-executors 50 \
--queue default \
--name test \
--proxy-user foo \
--conf spark.jars.packages=xxx \
/path/to/examples.jar \
1000

Livy REST JSON protocol
{
  "className": "org.apache.spark.examples.SparkPi",
  "jars": ["a.jar", "b.jar"],
  "pyFiles": ["a.py", "b.py"],
  "files": ["foo.txt", "bar.txt"],
  "archives": ["foo.zip", "bar.tar"],
  "driverMemory": "10G",
  "driverCores": 1,
  "executorCores": 3,
  "executorMemory": "20G",
  "numExecutors": 50,
  "queue": "default",
  "name": "test",
  "proxyUser": "foo",
  "conf": {"spark.jars.packages": "xxx"},
  "file": "hdfs:///path/to/examples.jar",
  "args": ["1000"]
}
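A batch defined by JSON like the above is submitted by POSTing it to Livy's /batches endpoint, after which the batch can be polled; a minimal sketch, assuming Livy listens on localhost:8998 (as in the question) and that the submission response returned batch id 0:

    # Submit the batch; payload.json holds the JSON shown above.
    curl -X POST -H "Content-Type: application/json" \
         --data @payload.json localhost:8998/batches

    # Poll the batch state and fetch the driver log.
    curl localhost:8998/batches/0
    curl localhost:8998/batches/0/log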
  • https://community.hortonworks.com/articles/151164/how-to-submit-spark-application-through-livy-rest.html
  • https://dzone.com/articles/quick-start-with-apache-livy

  • --packages. All transitive dependencies will be handled automatically when using this option.

    In Livy, go to the interpreter settings page and add a new property under the livy settings:

    livy.spark.jars.packages

    with the value

    com.databricks:spark-redshift_2.11:2.0.1

    Restart the interpreter and retry the query.
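    Putting this together for the command in the question, a corrected batch submission could look like the sketch below (straight quotes throughout; the file and jar paths come from the question, and --packages is expressed as the spark.jars.packages entry under conf):

    curl -X POST \
      -H "Content-Type: application/json" \
      --data '{
        "file": "/home/hadoop/test.py",
        "jars": ["/usr/share/aws/redshift/jdbc/RedshiftJDBC4.jar"],
        "conf": {"spark.jars.packages": "com.databricks:spark-redshift_2.11:2.0.1"},
        "queue": "my_queue",
        "name": "Livy Example"
      }' \
      localhost:8998/batches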

    Regarding apache-spark - How to submit a pyspark job in Apache Livy?, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/51306669/
