Push multiple tables into one topic and use a sink connector to write back to multiple tables


I have a set of tables that represent a logical entity in my Postgres DB.
They’re related to each other via FK constraints.

I want to use the Debezium connector to push data change events for all the tables, and rebuild the same structure in a destination DB.

If I use the default approach, one topic per table will be created. Sink connectors would then consume each topic independently, and since Kafka only guarantees ordering within a single topic partition, the correct order of DB operations across tables cannot be guaranteed.

So, I thought I could push all the data change events to a single topic… but then I don’t know how to handle that situation with sink connectors.
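For context, one way to merge the per-table topics into a single topic is Debezium’s topic routing SMT. A minimal connector-config sketch — the server name, schema, and target topic name here are placeholders, not anything from a real setup:

```json
{
  "transforms": "route",
  "transforms.route.type": "io.debezium.transforms.ByLogicalTableRouter",
  "transforms.route.topic.regex": "dbserver1\\.public\\.(.*)",
  "transforms.route.topic.replacement": "dbserver1.public.all_tables"
}
```

With this in place, every captured table’s change events land on the one `all_tables` topic, which is what makes the sink-side question above tricky.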

Do you know of a sink connector that can determine which table to write to based on the payload of the message?

It sounds like the Outbox pattern would be useful for you here. You’d need a way to unpack the messages from the Kafka topic back out into separate tables for the destination, but you could achieve that with ksqlDB or Kafka Streams, I think.
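To illustrate the unpacking step, here’s a small Python sketch of the same logic (not ksqlDB or Kafka Streams, just what either would be doing for you). It assumes the default Debezium envelope, where the origin table sits under `payload.source.table` and the new row state under `payload.after`; the `writers` dict is a made-up stand-in for whatever performs the per-table writes.

```python
import json


def unpack_event(raw: bytes):
    """Extract the destination table and row data from a Debezium change
    event, assuming the default envelope: table name under
    payload.source.table, new row state under payload.after
    (payload.after is None for deletes)."""
    payload = json.loads(raw)["payload"]
    return payload["source"]["table"], payload["after"]


def dispatch(raw: bytes, writers: dict) -> None:
    """Hand each consumed event to a per-table writer; `writers` is a
    hypothetical mapping of table name -> callable doing the INSERT/UPSERT."""
    table, row = unpack_event(raw)
    writers[table](row)
```

The real work is then just wiring `dispatch` into your consumer loop, with one writer per destination table.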

Like @rmoff said, the outbox pattern is probably what you’re after.

If you still want to have all the database table messages in a single topic, you can route the messages to other topics after you’ve produced them to the initial topic. This would require writing a processor in something like Kafka Streams.
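The routing decision such a processor makes is tiny: pick an output topic based on the event’s source table. A Python sketch of just that decision, again assuming the default Debezium envelope (the `routed.` prefix is invented for this example — in Kafka Streams you’d put the equivalent logic in a `TopicNameExtractor`):

```python
import json


def route(raw: bytes) -> str:
    """Choose the per-table output topic for a change event consumed from
    the combined topic.  Assumes the Debezium envelope, where
    payload.source carries the origin schema and table names."""
    source = json.loads(raw)["payload"]["source"]
    return f"routed.{source['schema']}.{source['table']}"
```

Downstream, you’d then point one sink connector (or one table mapping) at each of the routed topics.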