I am consuming Kafka events through a consumer service that implements the IConsumer interface. I receive a very large number of events (in the hundreds of thousands), but I only actually need to consume a few hundred of them. If I consume every event and read its message, it adds a lot of unnecessary processing effort. To avoid this, is there any option to filter events by reading the message headers and discard the ones I don't need while consuming?
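Something like the sketch below is what I have in mind, using confluent-kafka-dotnet's IConsumer (the broker address, group id, topic name, and the "event-type" header key and value are all placeholders I made up for illustration):

```csharp
using System;
using System.Text;
using Confluent.Kafka;

class HeaderFilterConsumer
{
    static void Main()
    {
        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092", // placeholder broker
            GroupId = "header-filter-group",     // placeholder group id
            AutoOffsetReset = AutoOffsetReset.Earliest
        };

        // Keep the value as raw bytes so messages we discard
        // are never deserialized.
        using var consumer = new ConsumerBuilder<Ignore, byte[]>(config).Build();
        consumer.Subscribe("events"); // placeholder topic name

        while (true)
        {
            var result = consumer.Consume(TimeSpan.FromSeconds(1));
            if (result == null) continue; // poll timeout, nothing fetched

            // Inspect the header before touching the payload.
            // "event-type" / "order-created" are hypothetical values.
            if (result.Message.Headers.TryGetLastBytes("event-type", out var raw)
                && Encoding.UTF8.GetString(raw) == "order-created")
            {
                ProcessMessage(result.Message.Value);
            }
            // Otherwise skip the record entirely.
        }
    }

    static void ProcessMessage(byte[] value)
    {
        Console.WriteLine($"Processing {value.Length} bytes");
    }
}
```

As far as I understand, Kafka has no broker-side filtering, so the consumer still receives every record over the network either way; the best I can do client-side seems to be checking the header and skipping the record before deserializing the payload. Is there a better option?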
Did you ever find a solution for this? I would like to know!
I think you should post this question in Clients - Confluent Community. This category is for, well, Kafka Streams, Kafka's Java stream processing library.