KTable - how to publish it to Kafka?

Hello friends,
what is the best way to publish a KTable to Kafka?
Assume that I’m streaming cryptocurrency trades and aggregating the prices into a mean in a KTable. Then I want to publish the KTable to Kafka. How do I do this?

Should I map it to stream?

myKTable.toStream().to(myTopic);

or maybe its better to materialize it in the aggregation?

stream.groupByKey()
                .windowedBy(TimeWindows.of(duration))
                .aggregate(
                        () -> null,
                        new MeanPriceAggregator(duration),
                        Materialized.<String, MeanPrice, WindowStore<Bytes, byte[]>>as(myTopic)
                                .withKeySerde(stringSerde)
                                .withValueSerde(meanPriceSerde))

If you want to publish data to Kafka, you should use myKTable.toStream().to(myTopic); to write the table’s changelog stream into a topic. Note that you cannot publish the “table result” (i.e., a snapshot) to Kafka, because the result of the table is continuously updated as new input arrives; you can “only” publish the changelog, because there is no such thing as a “final result”.
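To make this concrete: since the aggregation in the question is windowed, the table’s keys are Windowed&lt;String&gt;, so you typically flatten them to plain strings before writing to the output topic. Below is a minimal sketch of a full topology; the topic names ("trades", "mean-prices"), the store name, and the "sum,count" string encoding of the running aggregate are placeholders chosen so the example needs only built-in serdes, not the MeanPrice class from the question.

```java
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.Windowed;
import org.apache.kafka.streams.state.WindowStore;

public class MeanPriceTopology {

    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();

        KStream<String, Double> trades = builder.stream(
                "trades", Consumed.with(Serdes.String(), Serdes.Double()));

        // Running aggregate encoded as "sum,count" so the built-in String serde suffices.
        KTable<Windowed<String>, String> sumAndCount = trades
                .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
                .aggregate(
                        () -> "0.0,0",
                        (symbol, price, agg) -> {
                            String[] parts = agg.split(",");
                            double sum = Double.parseDouble(parts[0]) + price;
                            long count = Long.parseLong(parts[1]) + 1;
                            return sum + "," + count;
                        },
                        // Note: "mean-price-store" is a state-store name, NOT a topic name.
                        Materialized.<String, String, WindowStore<Bytes, byte[]>>as("mean-price-store")
                                .withKeySerde(Serdes.String())
                                .withValueSerde(Serdes.String()));

        // Publish the changelog: flatten the windowed key and emit the current mean.
        sumAndCount.toStream()
                .map((windowedKey, agg) -> {
                    String[] parts = agg.split(",");
                    double mean = Double.parseDouble(parts[0]) / Long.parseLong(parts[1]);
                    return KeyValue.pair(
                            windowedKey.key() + "@" + windowedKey.window().start(), mean);
                })
                .to("mean-prices", Produced.with(Serdes.String(), Serdes.Double()));

        return builder.build();
    }
}
```

Each input trade produces an updated mean on "mean-prices"; downstream consumers see the stream of updates, not a single final value per window.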

The term “materialize” refers to client-side state stores, and for a stream aggregation the result is materialized into a state store anyway (even without passing in a Materialized parameter). Also note that Materialized.as(...) does not take a topic name, but a state store name as parameter. This store name allows you to query the state store via the “interactive queries” feature.
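To illustrate the difference, here is a sketch of what that store name is actually for. It assumes a running KafkaStreams instance called streams and a windowed store registered under the (placeholder) name "mean-price-store"; this fragment only works inside a live application, so it is not runnable on its own.

```java
// Look up the store by the name passed to Materialized.as(...) -- not a topic name.
ReadOnlyWindowStore<String, Double> store = streams.store(
        StoreQueryParameters.fromNameAndType(
                "mean-price-store", QueryableStoreTypes.windowStore()));

// Fetch all window aggregates for one key over the last hour.
Instant now = Instant.now();
try (WindowStoreIterator<Double> it =
             store.fetch("BTCUSD", now.minus(Duration.ofHours(1)), now)) {
    while (it.hasNext()) {
        KeyValue<Long, Double> entry = it.next(); // window start timestamp -> mean price
        System.out.println(entry.key + " -> " + entry.value);
    }
}
```

So the store name gives you local, point-in-time reads of the current aggregates, while toStream().to(...) is what actually publishes the updates to a Kafka topic.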
