Number of Records in a single Kafka Message

Hi,
I have a Kafka message whose JSON payload contains multiple rows (JSON documents) of the same structure, sometimes thousands of documents per message, to be ingested into a Snowflake table. I need each JSON document within the Kafka message to correspond to one row in the table.
I wanted to know whether there is any way to split the Kafka message so that each resulting message contains a single JSON document, which could then be ingested as a single row in the Snowflake table. Even better, if I could flatten the JSON document contained in the Kafka message into individual columns of the Snowflake table, it would save me from creating a Snowflake stored procedure to process the JSON document and store it.
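To make concrete what I mean by splitting: if the connector can't do this, I assume I'd need some pre-processing step in front of it. Here is a minimal sketch of that idea, assuming a small Kafka Streams app, hypothetical topic names raw-events and split-events, and that each message payload is a JSON array of documents:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MessageSplitter {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "json-splitter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("raw-events");

        // Fan each multi-document message out into one record per JSON document.
        source.flatMapValues(value -> {
            List<String> docs = new ArrayList<>();
            try {
                // Assumed payload shape: a JSON array of same-structured documents,
                // e.g. [{"id":1,"name":"a"},{"id":2,"name":"b"}, ...]
                JsonNode root = MAPPER.readTree(value);
                for (JsonNode doc : root) {
                    docs.add(doc.toString());
                }
            } catch (Exception e) {
                // In real code, route unparseable payloads to a dead-letter topic.
            }
            return docs;
        }).to("split-events"); // the topic the Snowflake sink connector would read

        new KafkaStreams(builder.build(), props).start();
    }
}
```

The sink connector would then land one row per message from split-events, though the flattening into individual columns would still remain to be done.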
I am currently using the Kafka Connect Snowflake Sink Connector for this. However, since I don't know whether splitting and flattening are supported by the sink connector, right now I have to land the whole Kafka message in an intermediate staging table in Snowflake and then run it through a stored procedure that flattens it and stores it into the target table, which is a lot of processing cost on Snowflake. So I was looking for a way to achieve this in Kafka Connect, if possible.

The other solution I thought of was building my own custom process that picks messages up from Kafka, flattens them, and uses JDBC to insert/update into the Snowflake table (a rough sketch of what I mean is below). However, that again means writing and maintaining my own logic, so if the sink connector can do this instead, I could drop the separate process and reuse the existing Snowflake Sink Connector.
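For reference, this is roughly the custom process I had in mind. It is only a minimal sketch: it assumes the Snowflake JDBC driver is on the classpath, uses placeholder connection details, and invents a target table EVENTS(ID, NAME) with fields matching the documents:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FlattenAndLoad {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        Properties kafkaProps = new Properties();
        kafkaProps.put("bootstrap.servers", "localhost:9092");
        kafkaProps.put("group.id", "snowflake-loader");
        kafkaProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        kafkaProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Properties sfProps = new Properties();
        sfProps.put("user", "<user>");          // placeholder credentials
        sfProps.put("password", "<password>");
        sfProps.put("db", "<database>");
        sfProps.put("schema", "<schema>");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(kafkaProps);
             Connection conn = DriverManager.getConnection(
                     "jdbc:snowflake://<account>.snowflakecomputing.com/", sfProps);
             PreparedStatement insert = conn.prepareStatement(
                     // Hypothetical table and columns; real ones would match my documents.
                     "INSERT INTO events (id, name) VALUES (?, ?)")) {

            consumer.subscribe(Collections.singletonList("raw-events"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // One Kafka message holds an array of documents; insert one row each,
                    // flattening the assumed "id" and "name" fields into columns.
                    for (JsonNode doc : mapper.readTree(record.value())) {
                        insert.setLong(1, doc.get("id").asLong());
                        insert.setString(2, doc.get("name").asText());
                        insert.addBatch();
                    }
                }
                insert.executeBatch();
                consumer.commitSync();
            }
        }
    }
}
```

Maintaining, scaling, and adding error handling to something like this is exactly what I'd rather avoid if the sink connector can split and flatten for me.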

Thanks,
Kevin
