We have a requirement to publish 4 million records (a regular data migration, once every 2 months) to a Confluent Kafka topic. If we send the JSON messages one by one to the producer, we get about 100k per hour; at that rate, four million records would take far too long.
Is there a way to send JSON messages in bulk (say, 500 JSONs) in one request?
Any suggestions are appreciated.
This channel is for Kafka Streams (Kafka's stream processing library), but your question is about the producer client. The Clients channel would be a better place for it.
But to answer anyway:
producer.send() already batches multiple records internally and sends them in a single request. There are several configs you can tune to get larger batch sizes, and thus higher throughput.
Please check out the talk A Kafka Client's Request: There and Back Again with Danica Fine, which explains the relevant configs and can help you tune the producer for higher throughput.
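To make the batching point concrete, here is a rough sketch of producer properties that favor larger batches over per-record latency. The config names (`batch.size`, `linger.ms`, `compression.type`, `buffer.memory`) are standard Kafka producer configs; the concrete values are only illustrative starting points, not tuned optima, and `localhost:9092` is a placeholder broker address:

```java
import java.util.Properties;

public class ProducerTuning {

    // Build producer configs that trade a little latency for throughput.
    // Values here are illustrative starting points -- benchmark against
    // your own cluster and payload sizes.
    static Properties throughputConfigs(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Collect up to 256 KB of records per partition before sending
        // (the default is 16 KB).
        props.put("batch.size", Integer.toString(256 * 1024));
        // Wait up to 50 ms for a batch to fill instead of sending each
        // record immediately (the default is 0).
        props.put("linger.ms", "50");
        // Compress whole batches; lz4 usually gives a good
        // throughput/CPU trade-off for JSON payloads.
        props.put("compression.type", "lz4");
        // Total memory the producer may use to buffer unsent records.
        props.put("buffer.memory", Long.toString(64L * 1024 * 1024));
        return props;
    }

    public static void main(String[] args) {
        Properties props = throughputConfigs("localhost:9092");
        // With these props you would construct a KafkaProducer and simply
        // keep calling producer.send(record) in a loop -- the client
        // groups records into batches internally, so no manual grouping
        // of 500 JSONs per request is needed:
        //
        //   try (KafkaProducer<String, String> producer =
        //            new KafkaProducer<>(props)) {
        //       records.forEach(producer::send);
        //   }
        System.out.println(props.getProperty("batch.size"));
        System.out.println(props.getProperty("linger.ms"));
    }
}
```

The key idea: you keep calling send() record by record, and batching happens for you; raising `linger.ms` and `batch.size` is what turns thousands of tiny requests into a few large ones.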
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.