Kafka to Elasticsearch

I am trying to set up an ingestion pipeline into an Elasticsearch cluster via the Kafka Elasticsearch sink connector. A question I have: if I have a file that has multiple JSON objects like this:

{"name":"abc", "company":"123","dept":"test"},
{"name":"def", "company":"456","dept":"dev"},
{"name":"ghi", "company":"654","dept":"qa"},
{"name":"jkl", "company":"567","dept":"test"},
{"name":"mno", "company":"786","dept":"qa"}

would the Kafka connector be able to parse each log line and ingest it as a separate document, or would it ingest the whole file as one doc?

The example I'm following only covers a single doc, so I'm curious what happens when a file has multiple JSON objects. If this is possible, what tweaks need to be made?
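For context on what I've tried: as far as I know, the sink connector maps one Kafka record to one Elasticsearch document, so a file like the one above would need to be split into one JSON object per record before (or while) producing to the topic. Here's a minimal stdlib-only sketch of that splitting step (the topic name and producer call in the comment are hypothetical, not from any docs):

```python
import json

# File content as in the post: one JSON object per line,
# with trailing commas between objects (so it is not valid
# JSON as a whole and must be split line by line).
raw = """{"name":"abc", "company":"123","dept":"test"},
{"name":"def", "company":"456","dept":"dev"},
{"name":"ghi", "company":"654","dept":"qa"},
{"name":"jkl", "company":"567","dept":"test"},
{"name":"mno", "company":"786","dept":"qa"}"""

records = []
for line in raw.splitlines():
    line = line.strip().rstrip(",")  # drop the trailing comma between objects
    if line:
        records.append(json.loads(line))

# Each parsed object would then be produced as its own Kafka record,
# e.g. (hypothetical topic name, producer not shown):
#   producer.send("employees", json.dumps(rec).encode("utf-8"))
print(len(records))  # 5 separate documents
```

With one record per object on the topic, the sink connector should index five separate documents rather than one.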

Please help!
