Debezium MySQL task failed

I started the Docker container (cp-kafka-connect-base:7.0.1) and installed the self-managed connector (debezium-connector-mysql:1.7.1).

The container started fine.

I configured the connector as below:

{
  "name": "family-mysql-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "3es-mysql",
    "database.port": "3306",
    "database.user": "cconnect",
    "database.password": "xxxxxxx",
    "database.server.id": "5500",
    "database.server.name": "dev",
    "database.include.list": "family",
    "database.history.kafka.bootstrap.servers": "",
    "database.history.kafka.topic": "",
    "include.schema.changes": "true",
    "tasks.max": "1"
  }
}
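For reference, a connector like this is registered through the Kafka Connect REST API; this is a sketch, where the worker address (localhost:8083) and the JSON file name are assumptions for a local setup:

```shell
# Register the connector with the Connect worker.
# localhost:8083 and the file name are assumptions -- adjust to your setup.
curl -s -X POST -H "Content-Type: application/json" \
  --data @family-mysql-connector.json \
  http://localhost:8083/connectors
```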

Container logs:

[2022-02-28 06:09:52,608] WARN [family-mysql-connector|task-0] [Consumer clientId=dev-dbhistory, groupId=dev-dbhistory] Bootstrap broker (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient:1050)

[2022-02-28 06:10:10,849] INFO [family-mysql-connector|task-0|offsets] WorkerSourceTask{id=family-mysql-connector-0} Either no records were produced by the task since the last offset commit, or every record has been filtered out by a transformation or dropped due to transformation or conversion errors. (org.apache.kafka.connect.runtime.WorkerSourceTask:484)

[2022-02-28 06:10:52,539] ERROR [family-mysql-connector|task-0] WorkerSourceTask{id=family-mysql-connector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:195)
org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata

When I check the status:


{
  "name": "family-mysql-connector",
  "connector": {
    "state": "RUNNING",
    "worker_id": "3es-cconnect-mysql:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "FAILED",
      "worker_id": "3es-cconnect-mysql:8083",
      "trace": "org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata\n"
    }
  ],
  "type": "source"
}


Can someone help me find the root cause?

I debugged this myself and fixed it.

Fix: add the entries below to the connector config. The database history consumer and producer could not reach the brokers because they were missing the security settings, which caused the "Timeout expired while fetching topic metadata" error.

"database.history.kafka.topic": "xx",
"database.history.kafka.bootstrap.servers": "xxx",
"database.history.consumer.security.protocol": "SASL_SSL",
"database.history.consumer.ssl.endpoint.identification.algorithm": "https",
"database.history.consumer.sasl.mechanism": "PLAIN",
"database.history.consumer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"xx\" password=\"xx/vF+xx/x+x\";",
"database.history.producer.security.protocol": "SASL_SSL",
"database.history.producer.ssl.endpoint.identification.algorithm": "https",
"database.history.producer.sasl.mechanism": "PLAIN",
"database.history.producer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"632HZGOZTJVRTFNU\" password=\"xx/vF+xx/x+x\";",
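Note that after fixing the config the failed task does not recover on its own (as the log says, it "will not recover until manually restarted"). A sketch of the manual restart via the Connect REST API, assuming the worker is reachable on localhost:8083:

```shell
# Restart the failed task (task 0) after updating the connector config.
curl -s -X POST \
  http://localhost:8083/connectors/family-mysql-connector/tasks/0/restart

# Verify the task is RUNNING again:
curl -s http://localhost:8083/connectors/family-mysql-connector/status
```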
