Hello, I’m building a new data pipeline that will move data from Postgres to Snowflake with Kafka Connect. I will have a Postgres source connector and a Snowflake sink connector. I want to insert data from around 50 tables in Postgres into 50 tables in Snowflake, running the stack from a Dockerfile. Can you please give an example of how I can achieve this automatically?
Hello Robin, thanks for the reply. I have around 50 tables in the Postgres database. I am guessing I will create one topic per table, but then how will I be able to pass all 50 tables to a single connector in a curl command?
Ex:
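A minimal sketch of what that single curl call could look like, assuming the Debezium Postgres source connector; the hostnames, credentials, and table names here are placeholders. table.include.list takes a comma-separated list, so all 50 tables can go into one connector:

```
# One source connector covering many tables; PUT is idempotent, so this can
# be re-run from a startup script. All names and credentials are placeholders;
# extend table.include.list with the remaining tables in the same way.
# (Newer Debezium versions use topic.prefix instead of database.server.name.)
curl -X PUT http://localhost:8083/connectors/pg-source/config \
  -H "Content-Type: application/json" \
  -d '{
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname": "mydb",
    "database.server.name": "pg",
    "table.include.list": "public.orders,public.customers,public.products"
  }'
```

This produces one topic per table, named pg.public.&lt;table&gt;.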
Looking at the documentation, it seems you can use topics or topics.regex as described above, and then, optionally, a list of mapping pairs in snowflake.topic2table.map.
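For instance, a hedged sketch of a Snowflake sink that picks up every topic from a source like the one above via topics.regex, with a couple of explicit table mappings; the account URL, credentials, and topic/table names are placeholders:

```
# One sink for all matching topics. snowflake.topic2table.map takes
# topic:table pairs, extended the same way for the remaining tables.
curl -X PUT http://localhost:8083/connectors/snowflake-sink/config \
  -H "Content-Type: application/json" \
  -d '{
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "topics.regex": "pg\\.public\\..*",
    "snowflake.topic2table.map": "pg.public.orders:ORDERS,pg.public.customers:CUSTOMERS",
    "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
    "snowflake.user.name": "KAFKA_CONNECTOR",
    "snowflake.private.key": "<private-key>",
    "snowflake.database.name": "MYDB",
    "snowflake.schema.name": "PUBLIC"
  }'
```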
You could also use the RegExRouter Single Message Transform as part of the connector configuration to rename the topics if they all match a particular pattern.
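As a sketch, a fragment like this merged into the sink configuration would strip the pg.public. prefix so each topic lands in a table named after the bare Postgres table; the regex assumes the topic naming from the earlier examples:

```
# SMT fragment to merge into the connector's JSON config. RegexRouter
# ships with Kafka Connect itself, so no extra plugin is needed.
{
  "transforms": "dropPrefix",
  "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
  "transforms.dropPrefix.regex": "pg\\.public\\.(.*)",
  "transforms.dropPrefix.replacement": "$1"
}
```

Since the sink then derives the table name from the rewritten topic, this can avoid maintaining a 50-entry snowflake.topic2table.map.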
Robin, thanks for the suggestion on using regex. I am facing a weird issue, and it seems I am unable to find any information on it anywhere.
When I create a new topic and add a table, the existing records are not flowing; data only starts flowing once I make a change to the Postgres table after the topic is created, such as an insert, update, or delete. Is there a setting I am missing? I want the data to flow as soon as the topic is created.
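One hedged guess, assuming the Debezium Postgres source: rows that already exist are only emitted during a snapshot, so a table added to the connector after its initial startup may never be snapshotted, and only new inserts, updates, and deletes flow. Recent Debezium versions can snapshot such a table on demand through a signal table; the table name and settings below are assumptions for illustration:

```
# Hedged sketch: trigger an ad-hoc incremental snapshot via Debezium signaling.
# Assumes the connector config also contains:
#   "signal.data.collection": "public.debezium_signal"
# and that public.debezium_signal itself is listed in table.include.list.
psql -h postgres -U postgres -d mydb <<'SQL'
CREATE TABLE IF NOT EXISTS debezium_signal (
  id   VARCHAR(42) PRIMARY KEY,
  type VARCHAR(32) NOT NULL,
  data VARCHAR(2048)
);
INSERT INTO debezium_signal (id, type, data)
VALUES ('adhoc-1', 'execute-snapshot',
        '{"data-collections": ["public.my_new_table"]}');
SQL
```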