Upserting into multiple tables from multiple topics using Kafka Connect

Hello, I’m building a new data pipeline that will move data from Postgres to Snowflake with Kafka Connect; I will have a Postgres source connector and a Snowflake sink connector. I want to insert data from around 50 tables in Postgres into 50 tables in Snowflake, deploying everything with a Dockerfile. Can you please give an example of how I can achieve this automatically?


Hi @Megna4,

Depending on which source connector you’re using, there will be an option to specify a list of tables to include, either by name or with a wildcard.

Similarly, with the sink connector you should be able to specify topics.regex or a comma-separated list of topics values.
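For example, if the source is the Debezium Postgres connector (an assumption, since the thread doesn’t name the connector), table.include.list accepts a comma-separated list of regexes matching schema.table names. A minimal sketch, where the hostname, credentials, database name, and topic prefix are all placeholders:

```shell
# Sketch: register a Debezium Postgres source connector that captures
# every table in the "public" schema. Host, port, credentials, and the
# "pg" topic prefix are placeholder assumptions -- adjust to your setup.
curl -X POST -H "Content-Type: application/json" \
  http://localhost:8083/connectors \
  --data '{
    "name": "pg-source",
    "config": {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "database.hostname": "postgres",
      "database.port": "5432",
      "database.user": "postgres",
      "database.password": "postgres",
      "database.dbname": "mydb",
      "topic.prefix": "pg",
      "table.include.list": "public\\..*"
    }
  }'
```

With this, each captured table produces its own topic named prefix.schema.table (e.g. pg.public.account), so 50 tables give 50 topics from one connector.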

Hello Robin, thanks for the reply. I have around 50 tables in the Postgres database. I am guessing I will map one topic to one table, but then how will I be able to pass all 50 tables to a single connector in a curl command?

curl -vvv -X POST -H "Content-Type: application/json" --data '{
                "": "sfkafkasftest.salesforce.account:ACCOUNT",

You should be able to do it all in one connector.

Looking at the documentation, it seems you can use topics or topics.regex as described above, and then optionally a list of topic-to-table mapping pairs.

You could also use the RegexRouter Single Message Transform as part of the connector configuration to rename the topics if they all match a particular wildcard.
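Putting those together, a sketch of a single Snowflake sink that consumes all matching topics and uses RegexRouter to strip the topic prefix so the remaining name lines up with the target table. The topic pattern, account URL, credentials, and database/schema names are illustrative assumptions:

```shell
# Sketch: one Snowflake sink for every topic matching pg.public.*,
# with RegexRouter renaming pg.public.<table> to just <table>.
# All Snowflake account and credential values are placeholders.
curl -X POST -H "Content-Type: application/json" \
  http://localhost:8083/connectors \
  --data '{
    "name": "sf-sink",
    "config": {
      "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
      "topics.regex": "pg\\.public\\..*",
      "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
      "snowflake.user.name": "KAFKA_USER",
      "snowflake.private.key": "<private-key>",
      "snowflake.database.name": "MYDB",
      "snowflake.schema.name": "PUBLIC",
      "transforms": "dropPrefix",
      "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
      "transforms.dropPrefix.regex": "pg\\.public\\.(.*)",
      "transforms.dropPrefix.replacement": "$1"
    }
  }'
```

Because topics.regex matches all 50 topics, one sink connector covers all 50 tables; the SMT only rewrites the topic name the sink sees, not the topic itself.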

Robin, thanks for the suggestion on using regex. I am facing a weird issue and can’t seem to find any information on it anywhere.
When I create a new topic and add a table, the existing records are not flowing; but when I make a change on the Postgres table after the topic is created, like an insert, update, or delete, that’s when the data starts flowing. Is there a setting I am missing? I want the data to flow as soon as the topic is created.