Duplicate records in real-time streaming from MS SQL Server using the Debezium connector

Hi Confluent Forum, we are implementing real-time streaming from MS SQL Server using a Kafka source connector. The data from the source connector is populated into a Kafka topic on Confluent Cloud. We are using debezium-debezium-connector-sqlserver-2.4.2 for the Confluent platform within our custom connector.

The connector configuration sets snapshot.isolation.mode to "read_committed" in the config file, which unfortunately results in duplicate records. We have also tried setting snapshot.isolation.mode to "exclusive" and "snapshot" to achieve consistency, but neither attempt has been successful.
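For reference, a minimal sketch of the kind of connector config being described, with the snapshot.isolation.mode setting in place. The host, database, table, and topic-prefix values here are placeholders, not taken from our actual environment:

```json
{
  "name": "sqlserver-source-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "sqlserver.example.com",
    "database.port": "1433",
    "database.user": "cdc_user",
    "database.password": "********",
    "database.names": "inventory",
    "topic.prefix": "server1",
    "table.include.list": "dbo.orders",
    "snapshot.isolation.mode": "read_committed",
    "schema.history.internal.kafka.bootstrap.servers": "broker:9092",
    "schema.history.internal.kafka.topic": "schema-changes.inventory"
  }
}
```

Note that with "read_committed" (and most other isolation modes), Debezium provides at-least-once delivery, so some duplication during or after the initial snapshot is expected behavior rather than a misconfiguration; only "exclusive" and "snapshot" offer exactly-once semantics for the snapshot phase itself.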

Could anyone please provide guidance on the correct configuration or any potential solutions to prevent the duplication of records?

Thanks.
