
scala - AbstractMethodError when creating a Kafka stream

Reposted. Author: 行者123. Updated: 2023-12-03 21:23:27

I am trying to open a Kafka stream (tried versions 0.11.0.2 and 1.0.1) with the createDirectStream method, and I get the following AbstractMethodError:

Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
at org.apache.spark.streaming.kafka010.KafkaUtils$.initializeLogIfNecessary(KafkaUtils.scala:39)
at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
at org.apache.spark.streaming.kafka010.KafkaUtils$.log(KafkaUtils.scala:39)
at org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66)
at org.apache.spark.streaming.kafka010.KafkaUtils$.logWarning(KafkaUtils.scala:39)
at org.apache.spark.streaming.kafka010.KafkaUtils$.fixKafkaParams(KafkaUtils.scala:201)
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:63)
at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
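For context, AbstractMethodError is a LinkageError that the JVM raises at call time when a class was compiled against one version of a trait or interface but a binary-incompatible version is on the runtime classpath. Notably it is not an Exception, so a plain catch-Exception handler never sees it. A tiny illustrative sketch (the error message string is made up for illustration):

```scala
object AbstractMethodErrorDemo extends App {
  // AbstractMethodError extends IncompatibleClassChangeError -> LinkageError -> Error,
  // so it bypasses handlers that only catch Exception.
  val e = new AbstractMethodError("simulated: Logging$class.initializeLogIfNecessary")
  assert(e.isInstanceOf[LinkageError])   // it is a linkage problem, not application logic
  assert(!e.isInstanceOf[Exception])     // catch Exception will not intercept it
  println(s"linkage error message: ${e.getMessage}")
}
```

This is why the error surfaces deep inside Spark's Logging trait rather than in the application code: the kafka010 classes were compiled against a different binary layout of that trait than the one present at runtime.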


This is how I am calling it:

import org.apache.kafka.common.serialization.{IntegerDeserializer, StringDeserializer}
import org.apache.spark.streaming.kafka010._

val preferredHosts = LocationStrategies.PreferConsistent
val kafkaParams = Map(
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[IntegerDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> groupId,
  "auto.offset.reset" -> "earliest"
)

val aCreatedStream = KafkaUtils.createDirectStream[String, String](
  ssc,
  preferredHosts,
  ConsumerStrategies.Subscribe[String, String](topics, kafkaParams))


I have Kafka running on port 9092, and I can create a producer and a consumer and pass messages between them, so I am not sure why it does not work from the Scala code. Any ideas are appreciated.

Best Answer

It turned out I was using Spark 2.3 when I should have been using Spark 2.2. Apparently the method was made abstract in the later version, which is why I was hitting this error.
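In practice this usually means the spark-streaming-kafka-0-10 artifact was built against a different Spark version than the one on the runtime classpath. One way to avoid the mismatch is to pin every Spark artifact to a single version in the build; a minimal build.sbt sketch (the specific version number here is an illustrative assumption, not taken from the question):

```scala
// build.sbt -- keep every Spark artifact on the same version
val sparkVersion = "2.2.1"  // assumption: must match the Spark runtime you deploy on

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"                 % sparkVersion % Provided,
  "org.apache.spark" %% "spark-streaming"            % sparkVersion % Provided,
  // the Kafka integration must be compiled against the same Spark version
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

Marking the core Spark artifacts as Provided also keeps the cluster's own Spark jars authoritative at runtime, so the compile-time and runtime versions cannot silently diverge inside the application jar.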

Regarding "scala - AbstractMethodError when creating a Kafka stream", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/49180931/
