java - SparkLauncher not launching the application


I have written a small class deployed on a unix machine, but I cannot figure out why I get this error. I checked my SPARK_HOME and added all the required options, as shown in the class below. I wrote it this way so I could monitor the Spark thread while it runs. spark-submit works fine, so I know the environment setup is not the problem.

    package com.james.SparkLauncher2;

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.log4j.Logger;
    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class SparkLauncher2 {

        static final Logger LOGGER = Logger.getLogger(SparkLauncher2.class);

        public static void main(String[] args) {
            try {
                LOGGER.info("In main of SparkLauncher2");

                // pass in environment variables
                Map<String, String> env = new HashMap<>();
                env.put("SPARK_HOME", "/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark");
                env.put(" SPARK_LIBRARY_PATH", "/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark/lib"); // note the leading space in the variable name
                System.out.println("Environments setup correctly");

                SparkAppHandle sparkLauncher = new SparkLauncher(env)
                        .setAppResource("/home/james/fe.jar")
                        // this conf file works fine with spark-submit, so it shouldn't be the source of the issue
                        .setPropertiesFile("/etc/spark/conf/spark-defaults.conf")
                        .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
                        .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                        .setDeployMode("client")
                        .setVerbose(true)
                        .setConf("spark.yarn.keytab ", "/home/james/my.keytab") // note the trailing space in the key
                        .setConf("spark.yarn.principal", "somestring")
                        .setConf("spark.app.name ", "SparkLauncher2") // trailing space here too; add class name, for example HbaseTest
                        .setConf("spark.jars", "/home/james/flume-test.jar,/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/bin/test")
                        // listener class to see if there is any state change
                        .startApplication(new MyListener());

                // stop() runs immediately after launch, before the child process
                // has connected back -- this is the line the stack trace below points at
                sparkLauncher.stop();
            } catch (IOException e) {
                e.printStackTrace();
            } catch (Exception e) {
                // this exception is what gets thrown
                LOGGER.info("General exception");
                e.printStackTrace();
            }
        }
    }

I intended this class mainly to check for state changes, but no state change is ever logged:

    class MyListener implements SparkAppHandle.Listener {
        @Override
        public void stateChanged(SparkAppHandle handle) {
            System.out.println("state changed " + handle.getState());
        }

        @Override
        public void infoChanged(SparkAppHandle handle) {
            System.out.println("info changed " + handle.getState());
        }
    }

This is the exception. I checked the directories and they all appear to be correct. I even wrote an alternate version with everything hardcoded into the setConf methods. Clearly no Spark job is being launched; I don't see any job in the UI either. The CommandBuilder class documentation is not clear about why this exception gets thrown. For context, this is Java 7 and Spark 1.6.

    java.lang.IllegalStateException: Application is still not connected.
        at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:249)
        at org.apache.spark.launcher.ChildProcAppHandle.stop(ChildProcAppHandle.java:74)
        at com.james.SparkLauncher2.SparkLauncher2.main(SparkLauncher2.java:43)
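
The message itself comes from the launcher's state check: stop() is only legal once the child process has connected back to the launcher, and in the code above it is called immediately after startApplication(), before any connection (here additionally blocked by Kerberos, as the answer below explains) can be made. A minimal sketch of the usual pattern, reusing the question's placeholder paths and class names, is to block until the handle reaches a final state before acting on it:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.CountDownLatch;

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    // Sketch (not the poster's code): wait for the launched app to reach a
    // final state instead of calling stop() right after launch.
    Map<String, String> env = new HashMap<>(); // same environment map as in the question
    env.put("SPARK_HOME", "/opt/cloudera/parcels/CDH-5.8.0-1.cdh5.8.0.p0.42/lib/spark");

    final CountDownLatch done = new CountDownLatch(1);
    SparkAppHandle handle = new SparkLauncher(env)
            .setAppResource("/home/james/fe.jar") // placeholder path from the question
            .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
            .startApplication(new SparkAppHandle.Listener() {
                @Override
                public void stateChanged(SparkAppHandle h) {
                    // isFinal() is true for FINISHED, FAILED, KILLED and LOST
                    if (h.getState().isFinal()) {
                        done.countDown();
                    }
                }

                @Override
                public void infoChanged(SparkAppHandle h) { }
            });
    done.await(); // declare or handle InterruptedException in real code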

Best Answer

Thanks for the effort, everyone. It turned out I was using the wrong authentication user and was missing --keytab and --principal, so no connection was ever established because of the Kerberos issue. And don't forget that the order in which the configuration is applied matters a great deal!
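
For reference, the spark-submit flags --keytab and --principal correspond to the spark.yarn.keytab and spark.yarn.principal configuration keys on the launcher side. A hedged sketch of setting them, reusing the question's placeholder values and the env map from above: the key strings must be exact (the question's code set "spark.yarn.keytab " with a trailing space), and all configuration has to be in place before startApplication() is called:

    // Sketch only: keytab path and principal are the question's placeholders.
    // These conf keys are the launcher-side equivalent of the spark-submit
    // flags --keytab and --principal; note the keys contain no stray spaces.
    SparkAppHandle handle = new SparkLauncher(env)
            .setAppResource("/home/james/fe.jar")
            .setMainClass("com.james.SparkLauncher2.SparkLauncher2")
            .setConf("spark.yarn.keytab", "/home/james/my.keytab")
            .setConf("spark.yarn.principal", "somestring")
            .startApplication(); // configure everything before this call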

Regarding java - SparkLauncher not launching the application, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/45298973/
