
apache-kafka - Debezium Kafka Connect error - TimeoutException: Timeout expired while fetching topic metadata


I'm getting an error on the Debezium Connect producer and am not sure where I've made a mistake or what I'm missing. The properties of my connector and Docker Compose file are below. Is it possible that a Docker container deployed on one VM cannot connect to a database on another VM?

kafka-connect-10    | [2020-01-23 23:37:00,202] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:179)
kafka-connect-10 | org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
kafka-connect-10 | [2020-01-23 23:37:00,203] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:180)
kafka-connect-10 | [2020-01-23 23:37:00,205] INFO [Procura_CDC|task-0] [Producer clientId=procura-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10 | [2020-01-23 23:37:00,374] INFO [Procura_CDC|task-0] [Producer clientId=connector-producer-Procura_CDC-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10 | [2020-01-23 23:37:12,772] INFO [Procura_CDC|task-0|offsets] WorkerSourceTask{id=Procura_CDC-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:416)
kafka-connect-10 | [2020-01-23 23:37:12,773] INFO [Procura_CDC|task-0|offsets] WorkerSourceTask{id=Procura_CDC-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:433)
kafka-connect-10 | [2020-01-23 23:37:12,918] INFO [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:416)
kafka-connect-10 | [2020-01-23 23:37:12,930] INFO [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:433)
kafka-connect-10 | [2020-01-23 23:37:12,930] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:179)
kafka-connect-10 | org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
kafka-connect-10 | [2020-01-23 23:37:12,931] ERROR [Procura_CDC|task-0] WorkerSourceTask{id=Procura_CDC-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:180)
kafka-connect-10 | [2020-01-23 23:37:12,932] INFO [Procura_CDC|task-0] [Producer clientId=procura-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)
kafka-connect-10 | [2020-01-23 23:37:13,038] ERROR [Procura_CDC|task-0] Unable to unregister the MBean 'debezium.sql_server:type=connector-metrics,context=schema-history,server=procura' (io.debezium.relational.history.DatabaseHistoryMetrics:65)
kafka-connect-10 | [2020-01-23 23:37:13,039] INFO [Procura_CDC|task-0] [Producer clientId=connector-producer-Procura_CDC-0] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:1183)

Docker Compose file:

version: '3'
services:

  kafka-connect-02:
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect-02
    ports:
      - 8083:8083
    environment:
      CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN: "[%d] %p %X{connector.context}%m (%c:%L)%n"
      CONNECT_CUB_KAFKA_TIMEOUT: 300
      CONNECT_BOOTSTRAP_SERVERS: "https://***9092"
      CONNECT_REST_ADVERTISED_HOST_NAME: 'kafka-connect-02'
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: _kafka-connect-group-01-v04
      CONNECT_CONFIG_STORAGE_TOPIC: _kafka-connect-group-01-v04-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _kafka-connect-group-01-v04-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _kafka-connect-group-01-v04-status
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_KEY_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_VALUE_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_INTERNAL_KEY_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_INTERNAL_VALUE_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_LOG4J_ROOT_LOGLEVEL: 'INFO'
      CONNECT_LOG4J_LOGGERS: 'org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR'
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_PLUGIN_PATH: '/usr/share/java,/usr/share/confluent-hub-components/'
      # Confluent Cloud config
      CONNECT_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_RETRY_BACKOFF_MS: "500"
      CONNECT_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_SASL_MECHANISM: "PLAIN"
      CONNECT_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      #
      CONNECT_CONSUMER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_CONSUMER_SASL_MECHANISM: "PLAIN"
      CONNECT_CONSUMER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      CONNECT_CONSUMER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_CONSUMER_RETRY_BACKOFF_MS: "500"
      #
      CONNECT_PRODUCER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_PRODUCER_SASL_MECHANISM: "PLAIN"
      CONNECT_PRODUCER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      CONNECT_PRODUCER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_PRODUCER_RETRY_BACKOFF_MS: "500"
      # External secrets config
      # See https://docs.confluent.io/current/connect/security.html#externalizing-secrets
      CONNECT_CONFIG_PROVIDERS: 'file'
      CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
    command:
      - bash
      - -c
      - |
        echo "Installing connector plugins"
        confluent-hub install --no-prompt debezium/debezium-connector-sqlserver:0.10.0
        confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:0.5.5
        #
        echo "Launching Kafka Connect worker"
        /etc/confluent/docker/run &
        #
        sleep infinity
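
Once the container is up, a quick sanity check that the worker started and the plugins installed, using the standard Kafka Connect REST endpoints (this assumes you run it on the Docker host):

 curl -s http://localhost:8083/
 curl -s http://localhost:8083/connector-plugins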

Debezium connector:

 curl -i -X PUT -H "Content-Type:application/json" http://localhost:8083/connectors/Procura_CDC/config -d '{
   "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
   "tasks.max": "1",
   "database.server.name": "***",
   "database.hostname": "***",
   "database.port": "***",
   "database.user": "Kafka",
   "database.password": "***",
   "database.dbname": "Procura_Prod",
   "database.history.kafka.bootstrap.servers": "*****",
   "database.history.kafka.topic": "dbhistory.procura",
   "table.whitelist": "dbo.CLIENTS,dbo.VISITS",
   "poll.interval.ms": "2000",
   "snapshot.fetch.size": "2000",
   "snapshot.mode": "initial",
   "snapshot.isolation.mode": "snapshot",
   "transforms": "unwrap,dropPrefix",
   "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
   "transforms.unwrap.drop.tombstones": "false",
   "transforms.unwrap.delete.handling.mode": "rewrite",
   "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
   "transforms.dropPrefix.regex": "procura.dbo.(.*)",
   "transforms.dropPrefix.replacement": "$1" }'

Thanks

Best Answer

The error is saying it cannot connect to Kafka.
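
One quick way to confirm basic reachability, independently of Connect, is a metadata probe with kafkacat/kcat using the same SASL_SSL settings (the broker address and credentials below are placeholders):

 kafkacat -L -b broker.example.com:9092 \
   -X security.protocol=SASL_SSL \
   -X sasl.mechanisms=PLAIN \
   -X sasl.username='<api-key>' \
   -X sasl.password='<api-secret>'

If this also times out, the problem is network reachability or credentials rather than the connector config.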

The following should not have https:// applied to them:

  • CONNECT_BOOTSTRAP_SERVERS
  • CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM
  • CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM

In fact, I'm not even sure the last two of those configs are valid. For the first one, see the sketch below.
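
A minimal corrected sketch for the bootstrap servers (the broker address is a placeholder): the value is a plain host:port list, with no URL scheme:

 CONNECT_BOOTSTRAP_SERVERS: "broker.example.com:9092"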


You may also want to remove the REQUEST_TIMEOUT_MS and RETRY_BACKOFF_MS configs, unless you have a specific use case for setting them.


As far as I know, CONFIG_PROVIDERS_FILE_CLASS isn't needed, since file is the default implementation.
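
If you do keep the file provider, usage looks roughly like this (the path and key are hypothetical, following the Confluent doc linked in the Compose file): put the secret in a properties file mounted into the container, then reference it from the connector config:

 # /secrets.properties (hypothetical file mounted into the container)
 db_password=********

 # referenced from the connector JSON:
 "database.password": "${file:/secrets.properties:db_password}"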

Tip: use a .env file to reduce the number of variables needed in the YAML.
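
For example (placeholder values), docker-compose substitutes ${...} references from a .env file placed next to the Compose file:

 # .env
 BOOTSTRAP_SERVERS=broker.example.com:9092

 # docker-compose.yml
 CONNECT_BOOTSTRAP_SERVERS: "${BOOTSTRAP_SERVERS}"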

This post on apache-kafka - Debezium Kafka Connect error - TimeoutException: Timeout expired while fetching topic metadata is based on a similar question on Stack Overflow: https://stackoverflow.com/questions/59888430/
