
scala - How to write a Kafka Producer in Scala

Reposted · Author: 行者123 · Updated: 2023-12-04 00:57:27

I need help publishing messages to a topic with a Kafka producer.
My Kafka producer client is written in Scala and runs on Spark.

The job runs successfully, but my messages do not appear to be published.

Here is the code:

val response = info.producer.asInstanceOf[KafkaProducer[K, V]]
  .send(new ProducerRecord(info.props.getProperty(s"$topicNickName.topic"), keyMessage._1, keyMessage._2))

Producer configuration values:
metric.reporters = []
metadata.max.age.ms = 300000
reconnect.backoff.ms = 50
sasl.kerberos.ticket.renew.window.factor = 0.8
bootstrap.servers = [x.data.edh:6667, y.data.edh:6667, z.data.edh:6667, a.data.edh:6667, b.data.edh:6667]
ssl.keystore.type = JKS
sasl.mechanism = GSSAPI
max.block.ms = 60000
interceptor.classes = null
ssl.truststore.password = null
client.id =
ssl.endpoint.identification.algorithm = null
request.timeout.ms = 30000
acks = 1
receive.buffer.bytes = 32768
ssl.truststore.type = JKS
retries = 0
ssl.truststore.location = null
ssl.keystore.password = null
send.buffer.bytes = 131072
compression.type = none
metadata.fetch.timeout.ms = 60000
retry.backoff.ms = 100
sasl.kerberos.kinit.cmd = /usr/bin/kinit
buffer.memory = 33554432
timeout.ms = 30000
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
ssl.trustmanager.algorithm = PKIX
block.on.buffer.full = false
ssl.key.password = null
sasl.kerberos.min.time.before.relogin = 60000
connections.max.idle.ms = 540000
max.in.flight.requests.per.connection = 5
metrics.num.samples = 2
ssl.protocol = TLS
ssl.provider = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
batch.size = 16384
ssl.keystore.location = null
ssl.cipher.suites = null
security.protocol = PLAINTEXT
max.request.size = 1048576
value.serializer = class org.apache.kafka.common.serialization.StringSerializer
ssl.keymanager.algorithm = SunX509
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
linger.ms = 0

How can I debug this problem?

Best Answer

Here is an example of how to produce messages to Kafka in Scala:

import java.util.Properties

import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

val kafkaProducerProps: Properties = {
  val props = new Properties()
  props.put("bootstrap.servers", "x.data.edh:6667")
  props.put("key.serializer", classOf[StringSerializer].getName)
  props.put("value.serializer", classOf[StringSerializer].getName)
  props
}

val producer = new KafkaProducer[String, String](kafkaProducerProps)
producer.send(new ProducerRecord[String, String]("myTopic", keyMessage._1, keyMessage._2))

If you want to do stream processing, I recommend looking at the Spark + Kafka Integration Guide.

Note that the example above uses the KafkaProducer in fire-and-forget mode. There are three ways to use a Kafka producer:

Fire-and-forget: We send a message to the server and don't really care if it arrives successfully or not. Most of the time it will arrive successfully, since Kafka is highly available and the producer will retry sending messages automatically. However, some messages will get lost using this method.

Synchronous send: We send a message, the send() method returns a Future object, and we use get() to wait on the future and see whether the send() was successful or not.

Asynchronous send: We call the send() method with a callback function, which gets triggered when it receives a response from the Kafka broker.



Synchronous example

producer.send(record).get()
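Since the original problem is "the job succeeds but nothing is published", the synchronous variant is the quickest way to surface delivery errors. A minimal sketch, assuming the producer from the snippet above; the topic name, key, and value are illustrative placeholders:

```scala
import org.apache.kafka.clients.producer.{ProducerRecord, RecordMetadata}

// send() returns a java.util.concurrent.Future[RecordMetadata];
// get() blocks until the broker acknowledges the record or the send fails.
val record = new ProducerRecord[String, String]("myTopic", "someKey", "someValue")
try {
  val metadata: RecordMetadata = producer.send(record).get()
  println(s"Delivered to ${metadata.topic()}-${metadata.partition()} at offset ${metadata.offset()}")
} catch {
  case e: Exception => e.printStackTrace() // delivery failed; inspect the cause
}
```

Any broker-side problem (unreachable bootstrap servers, missing topic, authorization failure) then shows up as an exception instead of being silently dropped.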

Asynchronous example

import org.apache.kafka.clients.producer.{Callback, RecordMetadata}

producer.send(record, new CompareProducerCallback)
producer.flush()

// The Callback trait contains a single abstract method, onCompletion
private class CompareProducerCallback extends Callback {
  override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit = {
    if (exception != null) {
      exception.printStackTrace()
    }
  }
}
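One more point relevant to the original question: the producer batches records in an in-memory buffer, so if the Spark job's JVM exits before the buffer is drained, the records are lost even though the job reports success. A hedged sketch of shutting the producer down cleanly:

```scala
// flush() blocks until all buffered records have been transmitted;
// close() flushes and then releases the producer's network resources.
// Call these before the job ends, e.g. in a finally block.
try {
  producer.flush()
} finally {
  producer.close()
}
```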

On the topic of scala - How to write a Kafka Producer in Scala, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/61285666/
