java - Error "java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign" when trying to consume with the Flink Kafka Consumer

I am trying to write a Kafka consumer that reads data from a topic, but whenever I try to run it I get the following error:

Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply$mcV$sp(JobManager.scala:897)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:840)
at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$6.apply(JobManager.scala:840)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:39)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:415)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerCallBridge.assignPartitions(KafkaConsumerCallBridge.java:39)
at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.reassignPartitions(KafkaConsumerThread.java:391)
at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.run(KafkaConsumerThread.java:229)

The Java class is:

import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;


public class KafkaConsumer {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        ParameterTool parameterTool = ParameterTool.fromArgs(args);

        // Consume the "rdf-new" topic with the Kafka 0.9 connector, using the
        // properties passed on the command line (bootstrap servers, group id, ...).
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer09<String>("rdf-new", new SimpleStringSchema(), parameterTool.getProperties()));

        stream.print();
        env.execute();
    }
}

I created a standalone project in IntelliJ with the same code (and its own pom) and it works fine. However, since I need this code inside another project, I created a new Maven module in that existing project and tried to run it there, and now it shows me this error.

The dependencies in the Maven module's pom.xml are:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.4.2</flink.version>
    <java.version>1.8</java.version>
    <scala.binary.version>2.11</scala.binary.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-core</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.9_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.7</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-cep_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
</dependencies>

The only thing I noticed is that in the Maven module the Kafka version reported at runtime is 1.1.0, even though the pom declares the Kafka connector "flink-connector-kafka-0.9_2.11":

2018-05-18 11:14:56,105 - AbstractConfig                           [WARN] - ConsumerConfig            - The configuration 'zookeeper.connect' was supplied but isn't a known config. 
2018-05-18 11:14:56,105 - AppInfoParser$AppInfo [INFO] - AppInfoParser - Kafka version : 1.1.0
2018-05-18 11:14:56,105 - AppInfoParser$AppInfo [INFO] - AppInfoParser - Kafka commitId : fdcf75ea326b8e07

In the standalone project (where the consumer works fine), the Kafka version is 0.9.0.1:

11:32:19,537 WARN  org.apache.kafka.clients.consumer.ConsumerConfig              - The configuration zookeeper.connect = localhost:2181 was supplied but isn't a known config.
11:32:19,537 INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 0.9.0.1
11:32:19,538 INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : 23c69d62a0cabf06

It would be a huge help if someone could tell me what the problem might be. It is probably caused by the dependencies in the pom file, but the standalone project has the same dependencies I listed above. Thanks in advance.

Best answer

As you have already discovered, the problem is that the Kafka client version pulled into your module (1.1.0) does not match the version the Flink connector expects (0.9).

You can run the following on the command line:

mvn dependency:tree

to find out where the kafka-clients dependency version is coming from.
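If the full tree is too noisy, the dependency plugin's standard -Dincludes filter can narrow the output to just the Kafka client artifact (the value is the usual groupId:artifactId pattern):

mvn dependency:tree -Dincludes=org.apache.kafka:kafka-clients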

In your module's pom you can add a dependencyManagement section that overrides the kafka-clients dependency to the version you need, like this:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.9.0.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>
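Once the version is pinned, one quick way to double-check which kafka-clients jar actually ends up on the classpath is to print its location from inside the job's main method. This is a minimal sketch using only standard JDK APIs (not part of the original answer); the fully qualified class name is used to avoid clashing with the user's own KafkaConsumer class:

// Prints the jar that the Kafka client classes are loaded from, so you can
// confirm it is the 0.9.0.1 kafka-clients jar rather than the 1.1.0 one.
System.out.println(
        org.apache.kafka.clients.consumer.KafkaConsumer.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation());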

Regarding java - Error "java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign" when trying to consume with the Flink Kafka Consumer, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/50408315/
