Kafka Connect and nested JSON

Do Kafka connectors (specifically the S3 sink) support nested JSON?

So you’ve got JSON on your Kafka topic, and you want to write JSON to S3? Nested or not, this should work just fine with format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat.

Ref: https://docs.confluent.io/kafka-connect-s3-sink/current/index.html#s3-object-formats
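
A minimal sketch of such a sink configuration, assuming the standard Confluent S3 sink connector; the topic, bucket, and region values are placeholders:

```properties
# Pass the raw JSON bytes through to S3 untouched, nested or not
connector.class=io.confluent.connect.s3.S3SinkConnector
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat
# ByteArrayFormat needs the matching converter so the value is never re-parsed
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
# Placeholder values
topics=my-topic
s3.bucket.name=my-bucket
s3.region=us-east-1
flush.size=1000
```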

My question may not fully correlate with the original message, but I think it fits the headline.

What if we’re talking about the JDBC Sink Connector and we need to write our nested JSON messages to a database table?

@rmoff, in one of your blog posts (linked here) you already covered the Flatten SMT functionality.
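
For reference, a minimal sketch of how that Flatten SMT is configured on a sink; the delimiter choice is just illustrative, and note that Flatten handles nested STRUCTs only:

```properties
transforms=flatten
transforms.flatten.type=org.apache.kafka.connect.transforms.Flatten$Value
# A nested field a.b becomes a_b in the flattened record
transforms.flatten.delimiter=_
```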

But what if we don’t have a nested object (a STRUCT) but an array (would that be a MAP)? Is it possible to somehow flatten this data and write it to a database table?

I would assume that for such arrays we would need a separate, related table in our database. I just wonder whether this can be done via JDBC Sink Connector configuration alone.

You’d need to do this in Kafka Streams or ksqlDB beforehand. For example, in ksqlDB you can use the EXPLODE function.
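
A minimal sketch of that approach, with hypothetical stream, topic, and column names:

```sql
-- Declare a stream over the source topic (schema is hypothetical)
CREATE STREAM ORDERS (ID VARCHAR, ITEMS ARRAY<VARCHAR>)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- EXPLODE emits one row per array element; the backing topic of
-- ORDER_ITEMS is flat, so the JDBC sink can write it to a child table
CREATE STREAM ORDER_ITEMS AS
  SELECT ID, EXPLODE(ITEMS) AS ITEM
  FROM ORDERS;
```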


Thanks, @rmoff.

I will definitely give it a try, but unfortunately I first need to convince our management to start using ksqlDB; we don’t have it at the moment :frowning:

We did find this fork of the JDBC connector, though: JDBC Sink with Flatten Feature | Confluent Hub (by the way, the “Source Code” link on that page is incorrect).

Do you happen to have any experience with it?

I’ve not tried that one, no. Thanks for reporting the broken link; I’ll try and get it fixed.

If not ksqlDB, then you can use Kafka Streams instead :slight_smile:
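
A minimal Kafka Streams sketch of the same idea, assuming string-serialized JSON values; the topic names and the “items” array field are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;

public class ExplodeItems {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "explode-items");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        ObjectMapper mapper = new ObjectMapper();
        StreamsBuilder builder = new StreamsBuilder();

        // Emit one output record per element of the "items" array
        builder.<String, String>stream("orders")
               .flatMapValues(value -> {
                   List<String> out = new ArrayList<>();
                   try {
                       JsonNode items = mapper.readTree(value).get("items");
                       if (items != null && items.isArray()) {
                           items.forEach(item -> out.add(item.toString()));
                       }
                   } catch (Exception e) {
                       // skip records that aren't valid JSON
                   }
                   return out;
               })
               .to("order-items");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

The flattened order-items topic can then be fed to the JDBC sink as the separate, related table you described.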

So it’s not really possible to map such array structures manually, e.g. to a CLOB column in the database table for further post-processing there, or is it?
