However, we are seeing an error during serialization saying that the actual key is still a String. The error message is as follows:
org.apache.kafka.streams.errors.StreamsException: ClassCastException while producing data to topic eventOutputJoin-KSTREAM-TOTABLE-0000000004-repartition. A serializer (key: io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer / value: io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: io.confluent.connect.avro.Key). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters (for example if using the DSL, #to(String topic, Produced<K, V> produced) with Produced.keySerde(WindowedSerdes.timeWindowedSerdeFrom(String.class))).
How is this possible and how can we convert our String key to an Avro key?
There is a reason to use an Avro key with a single field: our client has defined the schema in the Schema Registry that way, and we are required to use it. We solved the issue, though, by using the following approach:
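One way such a conversion can be sketched (a minimal fragment, not necessarily the exact solution used; the schema, field name `id`, and serde wiring here are assumptions for illustration) is to re-key the stream with `selectKey`, wrapping the String key in a `GenericRecord` that matches the registered single-field key schema, and then to pass the Avro serde explicitly on the repartitioning step so the default serdes are not applied:

    // Hypothetical single-field key schema mirroring what the client registered.
    Schema keySchema = SchemaBuilder.record("Key")
            .namespace("io.confluent.connect.avro")
            .fields()
            .requiredString("id")   // assumed field name
            .endRecord();

    GenericAvroSerde keySerde = new GenericAvroSerde();
    GenericAvroSerde valueSerde = new GenericAvroSerde();
    // Both serdes need the schema.registry.url config; isKey = true for the key serde.
    keySerde.configure(serdeConfig, true);
    valueSerde.configure(serdeConfig, false);

    KStream<String, GenericRecord> stream = /* ... existing stream ... */;

    // Wrap the String key in a GenericRecord so the GenericAvroSerializer
    // receives the type it expects instead of java.lang.String.
    KStream<GenericRecord, GenericRecord> rekeyed =
            stream.selectKey((stringKey, value) -> {
                GenericRecord avroKey = new GenericData.Record(keySchema);
                avroKey.put("id", stringKey);
                return avroKey;
            });

    // Supply the serdes explicitly so the repartition topic written by
    // toTable() does not fall back to the default (String) key serde.
    rekeyed.toTable(Materialized.with(keySerde, valueSerde));

The key point is that `selectKey` alone is not enough: the internal `...-repartition` topic created by `toTable()` (or `groupByKey()`) serializes with the default serdes unless they are overridden at that step, which is exactly what the error message hints at.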