Please help. I'm a total noob and don't understand where I'm going wrong.
I'm having an issue streaming from SQL Server to SQL Server.
The Debezium source is working perfectly, but whatever I try, writing back to SQL Server with the JDBC sink connector fails. At one point it did create the id column in a new table in a new database, then died again.
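For context, I'm registering both connectors through the Kafka Connect REST API, roughly like this (assuming the worker is on the default port 8083 and that source.json / sink.json hold the two configs below):

curl -X POST -H "Content-Type: application/json" --data @source.json http://localhost:8083/connectors
curl -X POST -H "Content-Type: application/json" --data @sink.json http://localhost:8083/connectors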
#source config#
{
  "name": "sourceconn1",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "database.hostname": "df-dev-mssql",
    "database.port": "1433",
    "database.user": "sa",
    "database.password": "P@ssw0rd1999",
    "database.dbname": "T014_rel-colotesting",
    "table.include.list": "cas.Cases",
    "database.server.name": "test7",
    "database.history.kafka.bootstrap.servers": "localhost:9092",
    "database.history.kafka.topic": "schema-changes7"
  }
}
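For what it's worth, this is roughly how I checked the source side; if I've understood Debezium's topic naming (<database.server.name>.<schema>.<table>), the change events for my table should be on test7.cas.Cases:

bin/kafka-topics.sh --bootstrap-server localhost:9092 --list
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test7.cas.Cases --from-beginning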
#sink config#
{
  "name": "sinkconn1",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:sqlserver://df-dev-mssql:1433",
    "database": "Sink-DB",
    "connection.user": "sa",
    "connection.password": "P@ssw0rd1999",
    "topics": "test7",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "pk.field": "id",
    "auto.create": "true"
  }
}
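When the sink dies, I've been pulling the task state and stack trace like this (again assuming the default REST port):

curl -s http://localhost:8083/connectors/sinkconn1/status

I'm also not sure whether the "database" key is actually a valid JdbcSinkConnector property, or whether the target database has to go into the connection URL instead, e.g. jdbc:sqlserver://df-dev-mssql:1433;databaseName=Sink-DB?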
#distributed props#
# This file contains some of the configurations for the Kafka Connect distributed worker. This file is intended
# to be used with the examples, and some settings may differ from those used in a production system, especially
# the bootstrap.servers and those specifying replication factors.

# A list of host/port pairs to use for establishing the initial connection to the Kafka cluster.
bootstrap.servers=localhost:9092

# unique name for the cluster, used in forming the Connect cluster group. Note that this must not conflict with consumer group IDs
group.id=connect-cluster

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=false
value.converter.schemas.enable=false

plugin.path=/home/kafka/kafka/plugins
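One thing I'm unsure about: as far as I can tell the JDBC sink needs schema information to auto-create the target table, so should the JSON converters have schemas enabled instead? Something like:

key.converter.schemas.enable=true
value.converter.schemas.enable=true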
The source is plain SQL Server CDC, nothing special on that side. Any pointers on what I'm getting wrong would be much appreciated.