Can we use generic Serdes for serialization and deserialization?

We have around 50 backend applications that will send data to Kafka topics. I need to build a generic Kafka Streams application to process the data.

Data:
Key - ID
Value - JSON data (each application's JSON data is different)

Processing:

  1. Encrypt the ID and hash the ID. After hashing and encrypting, add both results to the value.
  2. Convert the values into Avro format and send them to ADLS Gen2.

Question:
Is it possible to build a single Kafka Streams application to process the data from all 50 backend services?
Or do we have to use individual Kafka Streams applications? (In that case, is it possible to have one generic Kafka Streams application and parameterize the topics at deployment?)

It should be possible to build a single application that consumes all input topics, given that the data format is the same in all topics.

Overall, it sounds like a “simple” program:

builder.stream(/* list of input topics, or pattern */)
       .map(...)
       .to("output-topic");

assuming you have a single output topic? The map() step should allow you to transform the data the way you want in a single step.
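
For example, here is a minimal sketch assuming String keys, JSON string values, and hypothetical encrypt(), hash(), and addFields() helpers that you would implement yourself:

import java.util.List;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

StreamsBuilder builder = new StreamsBuilder();

// consume all input topics as one stream (a Pattern would work as well)
KStream<String, String> stream =
        builder.stream(List.of("sourceAtopic", "sourceBtopic" /* ... all 50 topics */),
                       Consumed.with(Serdes.String(), Serdes.String()))
               .map((id, json) -> {
                   String encryptedId = encrypt(id);  // hypothetical helper
                   String hashedId = hash(id);        // hypothetical helper
                   // addFields() is a placeholder for adding both results to the JSON value
                   return KeyValue.pair(encryptedId, addFields(json, encryptedId, hashedId));
               });

stream.to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

Converting the result to Avro and landing it in ADLS Gen2 would typically be handled by an Avro serde on the output topic (or a downstream Kafka Connect sink connector) rather than inside map() itself.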

For each source topic there is a corresponding sink topic, e.g.:
sourceAtopic —> Kafka Streams —> sinkAtopic
sourceBtopic —> Kafka Streams —> sinkBtopic

Kafka Streams operation: encrypting the user ID (the key of the stream)

If you have different sink topics, you can use to(TopicNameExtractor) to route records to different output topics, instead of passing in a single topic name.
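
A minimal sketch, applied to the transformed stream from the earlier snippet and assuming the sink topic name can be derived from the source topic name (the source/sink naming convention here is just an illustration):

import org.apache.kafka.streams.processor.TopicNameExtractor;

// derive "sinkAtopic" from "sourceAtopic", "sinkBtopic" from "sourceBtopic", etc.
TopicNameExtractor<String, String> sinkTopic =
        (key, value, recordContext) -> recordContext.topic().replace("source", "sink");

stream.to(sinkTopic);  // instead of stream.to("output-topic")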

If TopicNameExtractor does not work for some reason, you could also create a separate pipeline for each input topic (either hard-coded or in a loop):

for (/* step through all input/output topic pairs */) {
  builder.stream(<single-input-topic>)
         .map(...)
         .to(<corresponding-output-topic>);
}
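
Either way, the topic pairs do not have to be hard-coded. A rough sketch, assuming a hypothetical TOPIC_PAIRS environment variable of the form "sourceAtopic:sinkAtopic,sourceBtopic:sinkBtopic" (imports as in the earlier sketch, plus KeyValueMapper), so the same application can be deployed with different topics:

import org.apache.kafka.streams.kstream.KeyValueMapper;

// same encrypt/hash transformation as above, reused for every pipeline
KeyValueMapper<String, String, KeyValue<String, String>> encryptAndHash =
        (id, json) -> KeyValue.pair(encrypt(id), addFields(json, encrypt(id), hash(id)));

for (String pair : System.getenv("TOPIC_PAIRS").split(",")) {
    String[] topics = pair.split(":");  // [input topic, output topic]
    builder.stream(topics[0], Consumed.with(Serdes.String(), Serdes.String()))
           .map(encryptAndHash)
           .to(topics[1], Produced.with(Serdes.String(), Serdes.String()));
}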