At the moment I have a Kafka cluster that uses Kafka Connect with a sink connector to write Avro messages to MongoDB. I want to use Kafka Streams to process these messages (for example, to check whether they contain certain metadata) before the sink connector writes them to MongoDB. What is the best way to do this with .NET?
I've never done this myself, but did you check the following examples? I think they might be helpful.
I need to revise my question. After some research I've understood that I don't need Kafka Streams for this. I just want to filter all incoming messages and only let through the ones that contain certain metadata. This should be possible within Kafka Connect. I saw that Confluent has a filter transform for this, but I haven't seen an equivalent for Apache Kafka… Do you know the best way to achieve this?
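Edit: after more digging, it looks like Apache Kafka itself may already cover this: since 2.6 (KIP-585) Connect ships a `Filter` SMT that can be combined with predicates. An untested sketch of the relevant part of the sink-connector config, assuming the metadata is carried in a record header named `metadata` (a hypothetical key; `dropWithoutMeta` and `hasMeta` are just labels I made up):

```json
{
  "transforms": "dropWithoutMeta",
  "transforms.dropWithoutMeta.type": "org.apache.kafka.connect.transforms.Filter",
  "transforms.dropWithoutMeta.predicate": "hasMeta",
  "transforms.dropWithoutMeta.negate": "true",
  "predicates": "hasMeta",
  "predicates.hasMeta.type": "org.apache.kafka.connect.transforms.predicates.HasHeaderKey",
  "predicates.hasMeta.name": "metadata"
}
```

With `negate` set to `true`, the filter applies to records where the predicate does *not* match, i.e. records without the `metadata` header get dropped before they reach the sink.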
OK, I see. Which source are the incoming messages coming from?
Kafka Streams is a Java library, so while it's a good fit for the problem, it's not available for .NET. Connect is a framework for importing/exporting data from/to external systems, so it might not be a good fit either.
If you want to use .NET, you could use a plain .NET consumer/producer. For this, I would recommend asking for help in the Clients category of the Confluent Community forum.
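The consume → filter → produce approach would boil down to a predicate on each record's headers. A minimal sketch of just the filtering logic (the Kafka plumbing itself would go through the Confluent .NET client, or the equivalent Python `confluent-kafka` client, which exposes headers as key/value pairs); the `metadata` header key and the record shapes are hypothetical:

```python
REQUIRED_HEADER = "metadata"  # hypothetical header key the messages must carry

def has_required_metadata(headers):
    """Return True if a record's header list contains the required key.

    Kafka clients typically expose headers as a list of (key, value)
    pairs; `headers` may also be None when a record has no headers.
    """
    return any(key == REQUIRED_HEADER for key, _ in (headers or []))

def filter_records(records):
    """Keep only records whose headers contain the required metadata."""
    return [r for r in records if has_required_metadata(r.get("headers"))]

# Fake records standing in for consumed Kafka messages:
records = [
    {"value": b"a", "headers": [("metadata", b"v1")]},
    {"value": b"b", "headers": None},
    {"value": b"c", "headers": [("other", b"x")]},
]
kept = filter_records(records)
print([r["value"] for r in kept])  # → [b'a']
```

In a real application you would run this predicate inside the consume loop and forward only the matching records to a second topic, which the sink connector then reads from.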
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.