Event stream from DBs with checks using ksqlDB

I have just come across Using Kafka to Discover Events Hidden in your Database, but I had already implemented the procedure suggested by @rmoff locally.

I decided to use the JDBC query-based approach in a system where the DBs are the primary storage and Kafka is an addendum. I hope it will work, but there are some aspects I'd like to discuss.

  1. I want to use Kafka Connect to sink into Elasticsearch, but there is also a consumer that sends the same message to a device. Should the Kafka Connect consumer group and the message sender’s group be the same?
  2. ksqlDB could help me create a layer in front of the two receivers mentioned above: I can join the messages in the topic with configuration data from a DB, and this creates a persistent topic. Is that right? I don’t want to use complicated SQL in the source connector to do this. Can I create this ksqlDB topic automatically when I deploy?
  3. This is related to question 1. Is there another pattern? Could I use a Spring Kafka Streams project to interject, post the message to a device, and let a copy sink into Elasticsearch?
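On question 1: as far as I understand, the two consumers must be in *different* consumer groups, since each group independently receives every record on the topic. Kafka Connect sink tasks consume under a Connect-managed group (by default named `connect-<connector-name>`), so the device-sender just needs its own distinct `group.id`. A minimal sketch of the device-sender's consumer config — all names here (`device-sender`, the broker address) are hypothetical:

```properties
# Hypothetical config for the device-sender consumer.
# Its group.id must differ from the Connect sink's group
# (Connect defaults to "connect-<connector-name>") so that
# BOTH consumers independently receive every record.
bootstrap.servers=localhost:9092
group.id=device-sender
auto.offset.reset=earliest
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```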
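The Elasticsearch side of question 1 would then be a standard sink connector, which consumes in its own Connect-managed group regardless of what other consumers do. A sketch of the connector config, assuming the Confluent Elasticsearch sink; the connector name, topic, and URL are hypothetical:

```json
{
  "name": "elastic-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "db-events",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```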
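On question 2, a stream-table join in ksqlDB does produce a persistent query whose output is written to a new Kafka topic, and the "create automatically when I deploy" part is what ksqlDB headless mode is for: point the server at a queries file (`ksql.queries.file`) and the statements run at startup. A sketch with entirely hypothetical stream, table, topic, and column names:

```sql
-- Hypothetical names throughout; run via ksqlDB headless mode
-- (ksql.queries.file) so the statements execute on deploy.

-- Table over the configuration data (e.g. loaded from the DB by a source connector)
CREATE TABLE device_config (device_id VARCHAR PRIMARY KEY, endpoint VARCHAR)
  WITH (KAFKA_TOPIC='device-config', VALUE_FORMAT='JSON');

-- Stream over the raw DB events
CREATE STREAM db_events (device_id VARCHAR KEY, payload VARCHAR)
  WITH (KAFKA_TOPIC='db-events', VALUE_FORMAT='JSON');

-- Persistent query: enriched output lands in its own topic
CREATE STREAM enriched_events WITH (KAFKA_TOPIC='enriched-events') AS
  SELECT e.device_id, e.payload, c.endpoint
  FROM db_events e
  JOIN device_config c ON e.device_id = c.device_id
  EMIT CHANGES;
```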

Does any of this need Kafka Streams? We don’t have an endless stream from IoT or the like, and we don’t aggregate anything.
One more question: do I need a Schema Registry? I have a simple schema.
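My understanding on the Schema Registry question: it is only required if the converters use Avro, Protobuf, or JSON Schema. With a plain JSON converter and embedded schemas disabled, Connect runs without a registry at all. A sketch of the relevant worker settings, assuming string keys and JSON values:

```properties
# Hypothetical Connect worker settings: plain JSON, no Schema Registry needed.
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Don't embed a schema envelope in every message
value.converter.schemas.enable=false
```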