I experimented with connecting Confluent Kafka to Salesforce Data Cloud using the Confluent-native Sink connector. I set up a connected app and generated the necessary Java keystore files to enable the connector, which is now active.
The goal was to have the connector update a specific Salesforce object whenever a message is produced to a certain Kafka topic. However, I noticed that the messages were ending up in the dead-letter queue instead. I am able to update the Salesforce object directly through the API with the following request body:
--data '{
  "itemid__c": "Test_AVI1",
  "orderid__c": "12345",
  "ordertime__c": "1692035115"
}'
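To see *why* records are being routed to the dead-letter queue, Kafka Connect can be configured to attach the error context as record headers on the DLQ records. This is a sketch of the relevant standard Kafka Connect error-handling properties; the DLQ topic name here is an assumption, not from my actual setup:

```json
{
  "errors.tolerance": "all",
  "errors.deadletterqueue.topic.name": "dlq-salesforce-sink",
  "errors.deadletterqueue.context.headers.enable": "true",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true"
}
```

With `context.headers.enable` set, each DLQ record carries headers such as the exception class and message, which usually pinpoints whether the failure is a deserialization problem or a Salesforce-side rejection.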
I am currently investigating whether the problem is related to how the fields are mapped between the Kafka message and the Salesforce object. Has anyone encountered a similar issue and found a solution?
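As a first sanity check on the field-mapping theory, I'm verifying that the JSON I produce to the topic uses exactly the custom-field API names that the working REST call uses. This is a minimal sketch; the expected field names come from my curl payload above, and the helper function is mine, not part of any connector API:

```python
import json

# Custom-field API names taken from the REST request body that works.
EXPECTED_FIELDS = {"itemid__c", "orderid__c", "ordertime__c"}

def unknown_fields(raw: str) -> set:
    """Return the field names in a Kafka message value that do not match
    the Salesforce API names (an empty set means the names line up)."""
    record = json.loads(raw)
    return set(record) - EXPECTED_FIELDS

# A message whose keys match the REST payload passes the check...
print(unknown_fields(
    '{"itemid__c": "Test_AVI1", "orderid__c": "12345", "ordertime__c": "1692035115"}'
))  # set()

# ...while a mis-cased key (field names are case-sensitive in the payload)
# would be flagged, and could explain records landing in the DLQ.
print(unknown_fields('{"ItemId__c": "Test_AVI1", "orderid__c": "12345"}'))
```

If the keys check out, the next suspects on my list are the value converter settings (e.g. JSON with/without schema) rather than the field names themselves.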