Hi,
I am working with the Debezium CDC connector for PostgreSQL.
Below is my connector configuration:
name=mes_po-info123
connector.class=io.debezium.connector.postgresql.PostgresConnector
tasks.max=1
plugin.name=pgoutput
database.hostname=XXXXXX
database.port=5432
database.user=XXXXX
database.password=XXXXX
database.dbname=XXXXX
database.server.name=XXXXXX
table.include.list=Chk.mes_po_info
include.schema.changes=true
database.history.kafka.bootstrap.servers=127.0.0.1:9092
topic.prefix=chk1234
slot.name=110585
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
I am not sure how we can declare or specify the Postgres table's primary key columns in the Debezium CDC configuration…
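From what I understand of the Debezium documentation, the message.key.columns property can be used to define which columns form the message key when the table itself has no primary key constraint. I sketched something like the line below, but the column name is only a placeholder and I am not sure this is the right approach for my case:

# Placeholder: "po_id" stands for whichever column(s) uniquely identify a row in Chk.mes_po_info
message.key.columns=Chk.mes_po_info:po_id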
Whenever I am not using DB constraints, I can see data being pushed into the topic and the offsets being committed, as shown below.
[2024-10-16 17:59:43,206] INFO [vjn_mes_insp|task-0|offsets] WorkerSourceTask{id=vjn_mes_insp-0} Committing offsets for 1 acknowledged messages (org.apache.kafka.connect.runtime.WorkerSourceTask:235)
While using DB constraints, it is not committing offsets; it retrieves all rows (like a JDBC source connector in bulk mode), but later it seems the replication slot is being skipped.
I get the message below and the connector does not move any further.
[2024-10-16 19:20:02,817] INFO [mes_po-info123|task-0] Searching for WAL resume position (io.debezium.connector.postgresql.PostgresStreamingChangeEventSource:342)
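In case it helps, this is the query I am planning to run on the database to check the state of the replication slot named in my config (I am not sure whether it will show anything useful):

-- Check whether the slot from the connector config exists, whether it is
-- currently active, and which WAL positions it has confirmed.
SELECT slot_name, active, restart_lsn, confirmed_flush_lsn
FROM pg_replication_slots
WHERE slot_name = '110585';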