I’m currently stuck on a problem and not sure how to solve it.
I have two tables in my old database that are related through a foreign key, and I want to stream both of them to Kafka via JDBC source connectors.
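For context, this is roughly what one of my source connector configs looks like (connection URL, table, and column names are placeholders, not my real setup):

```properties
name=old-db-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:postgresql://old-db:5432/legacy
table.whitelist=orders
mode=timestamp+incrementing
incrementing.column.name=id
timestamp.column.name=updated_at
topic.prefix=olddb-
```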
When consuming this data from a topic, I want to transform it into a new data structure/format and persist it into my new database table(s). To achieve this, I thought about using a plain Kafka consumer on the application layer instead of a JDBC sink connector.
But since the consumer needs the data from both tables in a single record to perform the transformation, I can’t simply use two separate source connectors, which would produce two separate topics.
A possible workaround would be to create a view in my old database that joins the two tables, and stream that view instead of the tables themselves. But that leads to another problem: since I’m using the “id + timestamp” mode, I would only notice changes in the view when the table that provides the id and updated_at columns changes; updates to the other table alone wouldn’t be picked up.
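To illustrate the workaround I mean, here is a sketch of such a view (table and column names are made up; the join mirrors the foreign-key relationship):

```sql
-- Hypothetical tables "orders" and "customers".
-- The connector would read id + updated_at from the "orders" side only,
-- so a change to "customers" alone does not advance the timestamp.
CREATE VIEW order_details AS
SELECT o.id,
       o.updated_at,
       o.customer_id,
       c.name AS customer_name
FROM   orders o
JOIN   customers c ON c.id = o.customer_id;
```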
Is there a good way to solve this problem? Is it maybe possible to stream the tables separately and join/merge them into another topic, and then consume that instead? Are multiple id + timestamp column pairs possible?
It’s really frustrating right now. Thank you very much!