We are debugging an issue with an RDS Postgres source connector and an MSSQL sink connector. We tried both JSON and non-JSON formats, but without luck; any help or hints would be highly appreciated.
Source Postgres RDS connector:
{
  "name": "inventory-connector",
  "config": {
    "tasks.max": "1",
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "snapshot.mode": "always",
    "topic.prefix": "nonp",
    "database.hostname": "",
    "database.port": "5432",
    "database.user": "",
    "database.password": "",
    "heartbeat.interval.ms": "1500",
    "database.dbname": "",
    "table.include.list": "public.bitestevents",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "false",
    "value.converter.schemas.enable": "false",
    "schema.include": "public",
    "publication.autocreate.mode": "filtered",
    "database.server.name": "***"
  }
}
Sink MSSQL connector:
{
  "name": "sqlsink-connector",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "",
    "connection.url": "",
    "connection.user": "",
    "connection.password": "",
    "table.name.format": "**",
    "heartbeat.interval.ms": "1500",
    "insert.mode": "upsert",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "false",
    "value.converter.schemas.enable": "false",
    "pk.mode": "record_value",
    "pk.fields": "id",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "auto.create": "true"
  }
}
Error on Kafka Connect:
2023-01-20 14:12:14,098 ERROR || WorkerSinkTask{id=sqlsink-connector-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: null [org.apache.kafka.connect.runtime.WorkerSinkTask]
java.lang.NullPointerException
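
One thing we suspect but have not confirmed: the Confluent JDBC sink needs per-record schema information to map fields to columns (particularly with "pk.mode": "record_value", "pk.fields": "id", and "auto.create": "true"), and with JsonConverter plus "value.converter.schemas.enable": "false" the records arrive schemaless, which can surface as a NullPointerException in the sink task. A sketch of the converter settings with embedded JSON schemas enabled, which would need to be applied on both the source and the sink (at the cost of larger messages; Avro with a Schema Registry would be the usual alternative):

```json
{
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter.schemas.enable": "true",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "true"
}
```

With schemas enabled, each JSON message carries a "schema" envelope alongside "payload", so Debezium change events would also need flattening (e.g. the ExtractNewRecordState SMT) before the JDBC sink can write them as flat rows.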