
apache-spark - Oozie Spark: code 101 error


I am having some trouble understanding the kind of error Oozie is returning to me. Here is the situation:

I created a very simple "job" in Oozie; the XML is as follows:

<workflow-app name="Massimiliano" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-2adf"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark-2adf">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>local[*]</master>
            <mode>client</mode>
            <name>MySpark</name>
            <class>org.XXX.SimpleApp</class>
            <jar>${nameNode}/user/${wf:user()}//prova_spark/SimpleApp1.jar</jar>
        </spark>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>

The job.properties file is as follows:

nameNode=hdfs://10.203.17.90:8020
jobTracker=10.203.17.90:8021
master=local[*]
queueName=default
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/hdfs/user/oozie/share/lib/lib_20160628182408/spark

I have tried again and again to change all the parameters, but with no result at all.

The error that is bothering me is:

Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]

The name node is the master node; I do not know whether oozie.wf.application.path is set correctly.
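
For reference, oozie.wf.application.path normally points at the HDFS directory that contains workflow.xml, and Oozie also adds any jars placed in a lib/ subdirectory of that path to the action classpath. A minimal job.properties sketch, with a purely hypothetical application directory:

# hypothetical workflow directory that contains workflow.xml and a lib/ subfolder with the application jars
oozie.wf.application.path=${nameNode}/user/${user.name}/prova_spark/app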

More details of the error:

                    hdfs://nameservice1/user/hdfs//prova_spark/SimpleApp1.jar

=================================================================

>>> Invoking Spark class now >>>

Intercepting System.exit(101)

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file: hdfs://nameservice1/user/hdfs/oozie-oozi/0000117-160804173605999-oozie-oozi-W/spark-2adf--spark/action-data.seq

Oozie Launcher ends

The path hdfs://nameservice1/user/hdfs//prova_spark/SimpleApp1.jar is correct! But I do not know where to look to fix this problem.

Can you help me?

Best Answer

> Step 1. First, capture the Spark and related jars used for execution. One way would be to run the application with spark-submit at the command line.
> Step 2. Create a lib folder in the workflow path if it does not already exist.
> Step 3. Place all the jars collected in Step 1 in the lib folder.
> Step 4. Run the workflow.

I think this should fix it. However, I would be curious to know if it still doesn't work.
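
As a rough sketch of these steps (the local and HDFS paths are hypothetical and must match your own workflow application directory, and the Oozie server URL is likewise an assumption), the commands could look something like:

# Step 1: run the application with spark-submit on the command line to confirm it works and to see which jars it needs
spark-submit --class org.XXX.SimpleApp --master "local[*]" SimpleApp1.jar

# Steps 2-3: create a lib/ folder under the workflow application path and upload the application and Spark jars there
hdfs dfs -mkdir -p /user/hdfs/prova_spark/app/lib
hdfs dfs -put SimpleApp1.jar /user/hdfs/prova_spark/app/lib/

# Step 4: re-run the workflow
oozie job -oozie http://10.203.17.90:11000/oozie -config job.properties -run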

Regarding apache-spark - Oozie Spark: code 101 error, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38746995/
