JDBC sink using windowed topic with key


I am trying to stream the changes from a windowed topic to a MySQL table using Kafka Connect and the io.confluent.connect.jdbc.JdbcSinkConnector. The key is written to the topic with WindowedSerdes.timeWindowedSerdeFrom(Long.class, Duration.ofHours(1L)). My intention is to use the values from the key as a composite primary key for the generated table (the Long represents an id, and the startTime and endTime of the window should be long values as well). Since the JdbcSinkConnector requires a schema, what would the schema for a windowed key be? I tried using a SpecificAvroSerializer of type String, but it needs a type implementing SpecificRecord. Is there a way to do it from this topic, or do I need to create a new stream, extracting the key, startTime and endTime into the value of the record, and then just specify record_value as the pk.mode?
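For reference, the topology producing this topic looks roughly like the sketch below (topic names and the aggregation are illustrative, not my exact code; TimeWindows.ofSizeWithNoGrace is the newer-Kafka spelling):

```java
// Illustrative topology: hourly counts per Long id, written to a topic
// whose key is Windowed<Long>, serialized with the time-windowed serde.
StreamsBuilder builder = new StreamsBuilder();
builder.stream("events", Consumed.with(Serdes.Long(), Serdes.String()))
    .groupByKey()
    .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofHours(1)))
    .count()
    .toStream()
    .to("hourly-counts", Produced.with(
        WindowedSerdes.timeWindowedSerdeFrom(Long.class, Duration.ofHours(1).toMillis()),
        Serdes.Long()));
```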

Thank you.

Windowed keys used by Kafka Streams and ksqlDB are built with serdes as part of the Kafka Streams application (org.apache.kafka.streams.kstream.WindowedSerdes). You would need to add/replicate them into a stand-alone jar in which you build a custom Converter, and then reference that…
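To give an idea of what that custom Converter would have to do, here is a minimal stand-alone sketch of the unpacking step, assuming the layout used by TimeWindowedSerializer: the serialized inner key followed by an 8-byte big-endian window start timestamp. The window end is not stored on the wire at all; it has to be derived from the window size you configure. Class and constant names are illustrative, and you should verify the byte layout against your Kafka version.

```java
import java.nio.ByteBuffer;

public class WindowedKeyParser {
    // 1-hour windows, matching the serde in the question (assumption).
    static final long WINDOW_SIZE_MS = 3_600_000L;

    // Unpacks a Windowed<Long> key: the inner 8-byte Long key followed by
    // an 8-byte big-endian window start. Returns {id, windowStart, windowEnd};
    // the end is derived, since it is not present in the serialized key.
    static long[] parse(byte[] windowedKeyBytes) {
        ByteBuffer buf = ByteBuffer.wrap(windowedKeyBytes);
        long id = buf.getLong(0);                              // inner Long key
        long start = buf.getLong(windowedKeyBytes.length - 8); // trailing timestamp
        return new long[] {id, start, start + WINDOW_SIZE_MS};
    }
}
```

A real Converter would wrap this logic in toConnectData and emit a Struct with an id/start/end schema, which is exactly the schema the JDBC sink is asking for.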

Typically, when I need this, I do a rekey in my streams application and create a key that is not windowed (or create a JSON-structured key with the windowing defined). This way I can use standard components within the Connect ecosystem. If you are worried about the storage cost of another topic, the retention (TTL) of messages in this topic could be shorter.
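The rekey step could look something like this sketch (names are illustrative; here the windowed key is flattened into a plain String key, but a Struct or JSON key with id/start/end fields works the same way):

```java
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Windowed;

class Rekey {
    // Flatten Windowed<Long> into "id:windowStart:windowEnd" so stock
    // Connect converters can handle the key of the downstream topic.
    static KStream<String, Long> flattenKey(KStream<Windowed<Long>, Long> windowedCounts) {
        return windowedCounts.map((wk, count) -> KeyValue.pair(
                wk.key() + ":" + wk.window().start() + ":" + wk.window().end(),
                count));
    }
}
```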

Thank you for your reply. So I guess the short answer is that Kafka Streams and ksqlDB do not come with a pre-defined Converter for the windowed serdes.

Related to your solution, do you see a difference between it and what I proposed? (I suggested creating a new topic where the key properties from the windowed one are set as value fields, then using the existing infrastructure to extract JDBC primary keys from the value with record_value as the pk.mode; you suggested rekeying into a structured, non-windowed key and using the existing infrastructure to extract JDBC primary keys from the topic key.) Since this is already an “end” to the processing and no further topic operations will be done, do you see any problems in terms of performance and usage?
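For the record_value variant I described, I imagine the sink configuration would look roughly like this (topic, connection and field names are placeholders, assuming the flattened topic carries id, window_start and window_end as value fields):

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "topics": "hourly-counts-flat",
  "connection.url": "jdbc:mysql://localhost:3306/mydb",
  "auto.create": "true",
  "insert.mode": "upsert",
  "pk.mode": "record_value",
  "pk.fields": "id,window_start,window_end"
}
```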

Thank you.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.