
java - Spark Launcher. java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.<init>

Reposted · Author: 行者123 · Updated: 2023-12-02 03:24:11

Hello everyone. I have developed an application based on SparkLauncher that runs an executable jar containing 5 operations. Each operation depends on a specific variable. I have a main Hadoop cluster with spark-2.3.0-hadoop2.6.5, and the application works fine on it. Part of my working code:

    private void runSparkJob(String pathToJar, final LocalDate startDate, final LocalDate endDate) {
        if (executionInProgress.get()) {
            LOGGER.warn("Execution already in progress");
            return;
        }
        Process sparkProcess = null;
        try {
            LOGGER.info("Create SparkLauncher. SparkHome: [{}]. JarPath: [{}].", sparkHome, vmJarPath);
            executionInProgress.set(true);
            sparkProcess = new SparkLauncher()
                    .setAppName(activeOperationProfile)
                    .setSparkHome(sparkHome) // sparkHome folder on the main cluster
                    .setAppResource(pathToJar) // jar with the 5 operations
                    .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
                            String.format("-Drunner.operation-profile=%1$s -Doperation.startDate=%2$s -Doperation.endDate=%3$s",
                                    activeOperationProfile, startDate, endDate))
                    .setConf(SparkLauncher.DRIVER_MEMORY, "12G")
                    .redirectToLog(LOGGER.getName())
                    .setMaster("yarn")
                    .launch();

            sparkProcess.waitFor();
            int exitCode = sparkProcess.exitValue();
            if (exitCode != 0) {
                throw new RuntimeException("Illegal exit code. Expected: [0]. Actual: [" + exitCode + "]");
            }
        } catch (IOException | InterruptedException e) {
            LOGGER.error("Error occurred while running SparkApplication.", e);
            throw new RuntimeException(e);
        } finally {
            if (sparkProcess != null && sparkProcess.isAlive()) {
                LOGGER.warn("Process still alive. Try to kill");
                sparkProcess.destroy();
            }
            executionInProgress.set(false);
        }
    }
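As a side note, the `DRIVER_EXTRA_JAVA_OPTIONS` string above serializes the `LocalDate` bounds through their ISO-8601 `toString()`, so the driver can read them back with `LocalDate.parse`. A minimal, self-contained sketch of just that formatting step (the class name and sample values are made up for illustration):

```java
import java.time.LocalDate;

public class DriverOptsDemo {
    // Mirrors the String.format call used for DRIVER_EXTRA_JAVA_OPTIONS above
    static String driverOpts(String profile, LocalDate start, LocalDate end) {
        return String.format(
                "-Drunner.operation-profile=%1$s -Doperation.startDate=%2$s -Doperation.endDate=%3$s",
                profile, start, end);
    }

    public static void main(String[] args) {
        // LocalDate renders as ISO-8601 (yyyy-MM-dd), so the driver side can
        // recover it with LocalDate.parse(System.getProperty("operation.startDate"))
        System.out.println(driverOpts("testProfile",
                LocalDate.of(2018, 8, 1), LocalDate.of(2018, 8, 6)));
    }
}
```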

I then started a Docker container with spark-2.3.0-hadoop2.6 downloaded into it; the testers need this container. I changed the master to .setMaster("local"), added the paths for the new profile to sparkHome and jarsWithOperations, and packaged the jar without shading (I tried shading, but it did not work for me). When I try to run my SparkLauncher application now, I get an exception:

    2018-08-06 14:47:53,150 INFO [n.m.m.b.r.SparkBaseOperationsRunner.runSparkJob] 105 : Create SparkLauncher. SparkHome: [/opt/bigtv/spark/spark-2.3.0-bin-hadoop2.6]. JarPath: [/opt/bigtv/bin/multirating-bigdata-operations-MASTER-SNAPSHOT.jar].
    2018-08-06 14:47:54 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-08-06 14:47:57 ERROR SpringApplication:842 - Application run failed
    java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.<init>(Lorg/yaml/snakeyaml/constructor/BaseConstructor;Lorg/yaml/snakeyaml/representer/Representer;Lorg/yaml/snakeyaml/DumperOptions;Lorg/yaml/snakeyaml/LoaderOptions;Lorg/yaml/snakeyaml/resolver/Resolver;)V
        at org.springframework.boot.env.OriginTrackedYamlLoader.createYaml(OriginTrackedYamlLoader.java:70)
        at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:139)
        at org.springframework.boot.env.OriginTrackedYamlLoader.load(OriginTrackedYamlLoader.java:75)
        at org.springframework.boot.env.YamlPropertySourceLoader.load(YamlPropertySourceLoader.java:50)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadDocuments(ConfigFileApplicationListener.java:547)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:517)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadForFileExtension(ConfigFileApplicationListener.java:496)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:464)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$null$6(ConfigFileApplicationListener.java:446)
        at java.lang.Iterable.forEach(Iterable.java:75)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$load$7(ConfigFileApplicationListener.java:445)
        at java.lang.Iterable.forEach(Iterable.java:75)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:442)
        at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:330)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.addPropertySources(ConfigFileApplicationListener.java:212)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.postProcessEnvironment(ConfigFileApplicationListener.java:195)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEnvironmentPreparedEvent(ConfigFileApplicationListener.java:182)
        at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEvent(ConfigFileApplicationListener.java:168)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
        at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:74)
        at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
        at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:358)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:317)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1243)
        at net.mediascope.multirating.bigdata.operations.OperationRunner.main(OperationRunner.java:21)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    2018-08-06 14:47:57 INFO ShutdownHookManager:54 - Shutdown hook called
    2018-08-06 14:47:57 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-55b54924-e628-43fe-9e43-ed34d7f35a8b
    2018-08-06 14:47:57,151 INFO [o.s.b.a.l.ConditionEvaluationReportLoggingListener.logAutoConfigurationReport] 101 :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.

In my project I have SnakeYAML 1.19, pulled in by Spring 5.0, and no other dependency on it. I cannot understand what the problem is; perhaps when I set up the Docker container I need to install something besides Spark.
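A NoSuchMethodError like this usually means an older snakeyaml on the runtime classpath is shadowing the 1.19 that the application was compiled against. One way to check (not from the original post) is to print which jar a class was actually loaded from; a JDK-only sketch:

```java
public class ClassOrigin {
    // Returns the jar or directory a class was loaded from,
    // or a marker string for JDK classes from the bootstrap loader
    static String originOf(Class<?> clazz) {
        java.security.CodeSource src = clazz.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap classloader (JDK)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // On the cluster this would name the winning snakeyaml jar:
        // System.out.println(originOf(Class.forName("org.yaml.snakeyaml.Yaml")));
        System.out.println(originOf(ClassOrigin.class));
    }
}
```

Running the commented-out `Class.forName` line inside the Spark-launched JVM shows whether `org.yaml.snakeyaml.Yaml` comes from your application jar or from Spark's own jars directory.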

The pom from the module with the operations:

    <dependencies>
        <dependency>
            <groupId>net.mediascope</groupId>
            <artifactId>multirating-bigdata-core</artifactId>
            <version>${project.version}</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-log4j2</artifactId>
        </dependency>
        <!-- Data Base -->
        <dependency>
            <groupId>org.jdbi</groupId>
            <artifactId>jdbi</artifactId>
            <version>2.71</version>
        </dependency>
        <dependency>
            <groupId>com.microsoft.sqlserver</groupId>
            <artifactId>sqljdbc42</artifactId>
            <version>4.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.codehaus.janino</groupId>
                    <artifactId>commons-compiler</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
        </dependency>
        <dependency>
            <groupId>net.sourceforge.jtds</groupId>
            <artifactId>jtds</artifactId>
            <version>1.3.1</version>
        </dependency>
    </dependencies>

    <profiles>
        <profile>
            <id>local</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.springframework.boot</groupId>
                        <artifactId>spring-boot-maven-plugin</artifactId>
                        <configuration>
                            <profiles>
                                <profile>${profile.active}</profile>
                            </profiles>
                            <executable>true</executable>
                        </configuration>
                    </plugin>
                </plugins>
            </build>
        </profile>
        <profile>
            <id>hadoop</id>
            <build>
                <!-- Needed to adapt the Spring Boot application for launching via Spark -->
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-shade-plugin</artifactId>
                        <version>2.3</version>
                        <executions>
                            <execution>
                                <phase>package</phase>
                                <goals>
                                    <goal>shade</goal>
                                </goals>
                                <configuration>
                                    <transformers>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.handlers</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.schemas</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                            <resource>META-INF/spring.provides</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.springframework.boot.maven.PropertiesMergingResourceTransformer">
                                            <resource>META-INF/spring.factories</resource>
                                        </transformer>
                                        <transformer
                                                implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                            <mainClass>${start-class}</mainClass>
                                        </transformer>
                                    </transformers>
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

Best answer

I have found the solution. The original Spark distribution has a jars folder containing snakeyaml 1.15; I replaced it with 1.19 and now everything works.
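To confirm which snakeyaml version a Spark distribution ships before patching it, the jars folder can be scanned. A small JDK-only sketch (the directory is the SparkHome path from the logs above and may differ in your install):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class FindSnakeYamlJars {
    // Lists files whose name starts with "snakeyaml" in a Spark jars/ directory
    static List<Path> findSnakeYaml(Path jarsDir) throws IOException {
        try (Stream<Path> files = Files.list(jarsDir)) {
            return files
                    .filter(p -> p.getFileName().toString().startsWith("snakeyaml"))
                    .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Path taken from the question's SparkHome; adjust for your install
        Path jars = Paths.get("/opt/bigtv/spark/spark-2.3.0-bin-hadoop2.6/jars");
        if (Files.isDirectory(jars)) {
            findSnakeYaml(jars).forEach(System.out::println);
        } else {
            System.out.println("No Spark install at " + jars);
        }
    }
}
```

If the listing shows snakeyaml-1.15.jar, swapping it for the 1.19 jar (as in the answer above) removes the conflicting class.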

Regarding java - Spark Launcher. java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.<init>, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51710753/
