JDBC sink connector fails immediately with no errors in log

I am trying to use the JDBC sink connector to write a topic (Avro format) to MySQL.
I installed the MySQL JDBC driver in the folder with the connector JAR.
When I follow these instructions

the connector immediately goes to a failed state, and there are no errors in the connector log.
Where can I look to find the error?

The Kafka Connect worker log will have the reason for the failure.
Are you using the docker-compose.yml from that link? docker-compose logs kafka-connect should provide the logs.
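For example, assuming the Connect REST API is exposed on the default port 8083 (the connector name is taken from your log), these two commands should surface the failure reason:

# tail the Connect worker log
docker-compose logs kafka-connect

# ask the Connect REST API why the connector or its task failed
curl -s http://localhost:8083/connectors/sink-jdbc-mysql-01/status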

This is all that is logged

[2022-03-26 09:32:12,571] INFO Setting connector sink-jdbc-mysql-01 state to STARTED (org.apache.kafka.connect.runtime.Worker)
[2022-03-26 09:32:12,572] INFO SinkConnectorConfig values:
        config.action.reload = restart
        connector.class = io.confluent.connect.jdbc.JdbcSinkConnector
        errors.deadletterqueue.context.headers.enable = false
        errors.deadletterqueue.topic.name =
        errors.deadletterqueue.topic.replication.factor = 3
        errors.log.enable = false
        errors.log.include.messages = false
        errors.retry.delay.max.ms = 60000
        errors.retry.timeout = 0
        errors.tolerance = none
        header.converter = null
        key.converter = class org.apache.kafka.connect.storage.StringConverter
        name = sink-jdbc-mysql-01
        predicates = []
        tasks.max = 1
        topics = [test01]
        topics.regex =
        transforms = []
        value.converter = class io.confluent.connect.avro.AvroConverter
 (org.apache.kafka.connect.runtime.SinkConnectorConfig)
[2022-03-26 09:32:12,573] INFO EnrichedConnectorConfig values:
        config.action.reload = restart
        connector.class = io.confluent.connect.jdbc.JdbcSinkConnector
        errors.deadletterqueue.context.headers.enable = false
        errors.deadletterqueue.topic.name =
        errors.deadletterqueue.topic.replication.factor = 3
        errors.log.enable = false
        errors.log.include.messages = false
        errors.retry.delay.max.ms = 60000
        errors.retry.timeout = 0
        errors.tolerance = none
        header.converter = null
        key.converter = class org.apache.kafka.connect.storage.StringConverter
        name = sink-jdbc-mysql-01
        predicates = []
        tasks.max = 1
        topics = [test01]
        topics.regex =
        transforms = []
        value.converter = class io.confluent.connect.avro.AvroConverter
 (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig)
[2022-03-26 09:32:12,573] INFO Setting task configurations for 1 workers. (io.confluent.connect.jdbc.JdbcSinkConnector)
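For reference, the logged values above correspond to a sink config along these lines; the connection.* settings and the Schema Registry URL are placeholders, since they do not appear in the log:

{
  "name": "sink-jdbc-mysql-01",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "test01",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://<schema-registry-host>:8081",
    "connection.url": "jdbc:mysql://<mysql-host>:3306/<database>",
    "connection.user": "<user>",
    "connection.password": "<password>"
  }
}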

I will try the Docker YAML from the link. I am using the one from the Confluent installer page here: Quick Start for Confluent Platform | Confluent Documentation

Is there a difference?

Thanks

Is there literally nothing else in the entire log?

They load different components; for example, I don't know whether the quick start includes the JDBC connector.
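One quick way to check, again assuming the Connect REST API on the default port 8083:

# list the plugins the worker has loaded; io.confluent.connect.jdbc.JdbcSinkConnector should appear
curl -s http://localhost:8083/connector-plugins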

Hi

No, it didn't contain the JDBC connector, so I installed it using

confluent-hub install confluentinc/kafka-connect-jdbc:latest

on the Connect container. I then added the MySQL JDBC driver JAR into the folder where the JDBC connector's JAR was installed. If I look at my available connectors, it appears and lets me add my sink connector.
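Roughly what I ran (the driver JAR version is just an example, and the path is confluent-hub's default install location):

# inside the Connect container
confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:latest

# copy the MySQL driver next to the connector JAR
cp mysql-connector-java-8.0.28.jar /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/

# restart the worker so it picks up the plugin (service name depends on the compose file)
docker-compose restart kafka-connect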

There are a lot of logs, but they all relate to my Cosmos source connector (which is working correctly). I will try with your YAML file later and get back to you with full logs if it is still not working.

Thanks
