
java - Flink application throws ClassNotFoundException in Java

Repost. Author: 行者123. Updated: 2023-11-30 12:03:58

I have a Flink cluster on YARN and built a demo project from the flink-quickstart-java archetype. After building a fat jar with the command "mvn clean package -Pbuild-jar" and submitting the program with "flink run -m yarn-cluster -yn 2 ./flink-SNAPSHOT-1.0.jar", the program throws the following exception:

java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.setDeserializer(FlinkKafkaConsumer09.java:290)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:216)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:154)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:128)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:112)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:79)
    at stream.TransferKafka.main(TransferKafka.java:19)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:525)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:417)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:395)
    at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:828)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:283)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1080)
    at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1127)
    at org.apache.flink.client.CliFrontend$1.call(CliFrontend.java:1124)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1781)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1124)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 24 more

Here is my demo:

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public static void main(String[] args) {
    final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "ip:port");
    props.setProperty("group.id", "NewFlinkTest");

    // Read from one Kafka topic and write the records straight back out to another.
    DataStreamSource<String> stream = env.addSource(
            new FlinkKafkaConsumer010<>("kafka_test", new SimpleStringSchema(), props));
    stream.addSink(new FlinkKafkaProducer010<>("kafka_test_out", new SimpleStringSchema(), props));

    try {
        env.execute("Flink Jar Test");
    } catch (Exception e) {
        e.printStackTrace();
    }
}

And some version information:

Flink version: 1.4.0
Hadoop version: 2.7.2
Kafka version: 0.10.2.1
JDK version: 1.8


POM dependencies

Edit 1:

<?xml version="1.0" encoding="UTF-8"?>
<dependencies>
    <!-- Apache Flink dependencies -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <!-- This dependency is required to actually execute jobs. It is currently pulled in by
             flink-streaming-java, but we explicitly depend on it to safeguard against future changes. -->
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- Explicitly add a standard logging framework, as Flink does not have
         a hard dependency on one specific framework by default. -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>${slf4j.version}</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>${log4j.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-rabbitmq_2.11</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_${scala.binary.version}</artifactId>
        <version>1.4.0</version>
    </dependency>
</dependencies>

After some experimenting, I found that the class throwing the exception is not the one I packaged into my uber jar. I think the main reason is that the client server has an older version of the flink-connector-kafka jar, but no matter how I set the flink-conf.yaml property 'yarn.per-job-cluster.include-user-jar', the program always throws the same exception. A sketch of the relevant config entry is shown below.
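For reference, this is the flink-conf.yaml entry in question; per my reading of the Flink 1.4 docs, it accepts ORDER (the default), FIRST, LAST, or DISABLED, with FIRST placing the user jar ahead of the jars in flink_home/lib on the system classpath:

# flink-conf.yaml (sketch): where the submitted user jar sits on the classpath
yarn.per-job-cluster.include-user-jar: FIRST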


Edit 2:

It works after adding kafka-clients:0.10.2.1 to flink_home/lib/. But I still don't know why it doesn't read the class files inside the uber jar.
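One way to investigate (a hypothetical diagnostic of mine, not something from the original post) is to ask the JVM where it actually loads the class from. Build this into the same uber jar and submit it with flink run so it sees the same classpath as the job; if it prints a jar under flink_home/lib instead of the submitted uber jar, the cluster classpath is shadowing the bundled copy:

public class WhereIsKafkaLoadedFrom {
    public static void main(String[] args) throws Exception {
        Class<?> c = Class.forName("org.apache.kafka.common.serialization.ByteArrayDeserializer");
        // Prints the jar (or directory) that actually supplied the class at runtime.
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}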

Best Answer

First, you can verify whether the missing class is present in your jar file via grep 'ByteArrayDeserializer' ./flink-SNAPSHOT-1.0.jar.
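Note that a binary grep like that matches the class-name string anywhere inside the archive, including references from other class files. To check for the actual .class entry instead, here is a small sketch of mine (assuming the same jar path as above) that lists matching entries programmatically:

import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarEntryCheck {
    public static void main(String[] args) throws Exception {
        // Print every entry in the fat jar whose path mentions the missing class.
        try (JarFile jar = new JarFile("./flink-SNAPSHOT-1.0.jar")) {
            jar.stream()
               .map(JarEntry::getName)
               .filter(name -> name.contains("ByteArrayDeserializer"))
               .forEach(System.out::println);
        }
    }
}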

Regarding "java - Flink application throws ClassNotFoundException in Java", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/57351923/
