We have around 50 backend applications that will each send data to a Kafka topic, and I need to build a generic Kafka Streams application to process that data.
Data:
Key - ID
Value - JSON data (each application's JSON payload has a different structure)
Processing:
We need to encrypt the ID and also hash the ID, then add both results as fields to the value. After that, we convert the value to Avro format and write it to ADLS Gen2.
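For the hash/encrypt step, here is a rough sketch of what I have in mind. The algorithms are not finalized; this assumes SHA-256 for the hash and AES-GCM for the encryption, and the class/method names (`IdProtection`, `hashId`, `encryptId`) are just placeholders. The two returned strings would be added to the JSON value before Avro conversion:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.SecureRandom;
import java.util.Arrays;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class IdProtection {

    // Deterministic SHA-256 hash of the ID, hex-encoded,
    // so the same ID always produces the same hashed field.
    public static String hashId(String id) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(id.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : digest) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // AES-GCM encryption of the ID. A random 12-byte IV is prepended to the
    // ciphertext and the whole payload is Base64-encoded so it can be carried
    // as a plain JSON string field.
    public static String encryptId(String id, SecretKeySpec key) throws Exception {
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ct = cipher.doFinal(id.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return Base64.getEncoder().encodeToString(out);
    }

    // Inverse of encryptId: split off the IV, decrypt, return the original ID.
    public static String decryptId(String encoded, SecretKeySpec key) throws Exception {
        byte[] in = Base64.getDecoder().decode(encoded);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(128, Arrays.copyOfRange(in, 0, 12)));
        byte[] pt = cipher.doFinal(Arrays.copyOfRange(in, 12, in.length));
        return new String(pt, StandardCharsets.UTF_8);
    }
}
```

Because this only touches the key and never parses the JSON body, the same transform should work for all 50 payload shapes; the per-application part is only the Avro schema used afterwards.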
Question:
Is it possible to build a single Kafka Streams application to process the data from all 50 backend services?
Or do we have to use individual Kafka Streams applications? (In that case, is it possible to keep one generic Kafka Streams codebase and parameterize the topics at deployment time?)
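To make the second option concrete, this is the kind of parameterization I am thinking of: the topic list comes in as an external setting at deploy time (the `INPUT_TOPICS` environment variable name here is just an example, not something we have already):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class TopicConfig {

    // Parse a comma-separated topic list supplied at deploy time,
    // e.g. INPUT_TOPICS="app1-events, app2-events, app3-events".
    // Blank entries and surrounding whitespace are ignored.
    public static List<String> parseTopics(String raw) {
        return Arrays.stream(raw.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .collect(Collectors.toList());
    }
}
```

The resulting list could then be fed to a single topology, since `StreamsBuilder#stream(Collection<String>)` in Kafka Streams accepts multiple source topics, e.g. `builder.stream(TopicConfig.parseTopics(System.getenv("INPUT_TOPICS")))`.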