PostgreSQL Sink Connector Issues

I am trying to get a PostgreSQL Sink connector set up. However, the input data format options do not include JSON or BYTES; the only choices are AVRO, JSON_SR, and PROTOBUF.

Does anybody know if there is a roadmap for this preview connector, and when the other formats will be available? Are there any workarounds to get this working in the meantime?

Or even, possibly, an example of producing some of these data formats using a C++ Producer.

What format is your data in? If it’s BYTES or JSON, where is the schema that should be used for building the target Postgres table?

I noticed there is a note saying that Schema Registry needs to be “enabled”, but do we use the topic’s Schema tab or Schema Registry itself? If we use Schema Registry, is there a way to say a schema applies to a specific topic, or do we have to use schema references?

How are you producing data to the topic?

When you write the data, whether from a Producer API or with a Kafka Connect source connector, you can serialise it as Avro, Protobuf, or JSON Schema - and this puts the schema into the Schema Registry, with each message carrying the ID of the registered schema.
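For context on why the registry matters here: the Confluent serialisers don’t write the raw Avro/Protobuf/JSON bytes on their own - they prefix each message with a magic byte and the 4-byte Schema Registry ID, which is how the sink connector looks up the schema to build the target table from. A minimal sketch of that framing in C++ (`frame_payload` is a hypothetical helper and the schema ID is a placeholder - in practice the serdes library does this for you):

```cpp
#include <cstdint>
#include <vector>

// Confluent wire format: [0x00 magic byte][4-byte big-endian schema ID][body].
// schema_id would be whatever ID Schema Registry assigned when the schema
// was registered; avro_body is the Avro-serialised record.
std::vector<uint8_t> frame_payload(int32_t schema_id,
                                   const std::vector<uint8_t> &avro_body) {
  std::vector<uint8_t> out;
  out.push_back(0x00);                       // magic byte (wire format v0)
  out.push_back((schema_id >> 24) & 0xFF);   // schema ID, big-endian int32
  out.push_back((schema_id >> 16) & 0xFF);
  out.push_back((schema_id >> 8) & 0xFF);
  out.push_back(schema_id & 0xFF);
  out.insert(out.end(), avro_body.begin(), avro_body.end());  // payload body
  return out;
}
```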

This blog might help: Kafka Connect Deep Dive – Converters and Serialization Explained | Confluent

We are using the C++ Producer API

It looks like there are serdes provided - Confluent’s libserdes library - that you can use to write the data as Avro.
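Putting it together, something along these lines, loosely based on the Avro console producer example that ships with libserdes. The broker address, registry URL, topic name, and schema below are all placeholders, and error handling is mostly elided - check the libserdes headers for the exact signatures:

```cpp
#include <iostream>
#include <string>
#include <vector>

#include <avro/Compiler.hh>
#include <avro/Generic.hh>
#include <librdkafka/rdkafkacpp.h>
#include <libserdes/serdescpp-avro.h>

int main() {
  std::string errstr;

  // Point libserdes at Schema Registry (URL is a placeholder).
  Serdes::Conf *sconf = Serdes::Conf::create();
  sconf->set("schema.registry.url", "http://localhost:8081", errstr);

  Serdes::Avro *serdes = Serdes::Avro::create(sconf, errstr);

  // Register the value schema under the topic's subject name.
  const std::string schema_json =
      "{\"type\":\"record\",\"name\":\"User\","
      "\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}";
  Serdes::Schema *schema =
      Serdes::Schema::add(serdes, "test-topic-value", schema_json, errstr);

  // Build an Avro datum that matches the schema.
  avro::ValidSchema avsc = avro::compileJsonSchemaFromString(schema_json);
  avro::GenericDatum datum(avsc);
  datum.value<avro::GenericRecord>().field("name") =
      avro::GenericDatum(std::string("alice"));

  // Serialise: libserdes prepends the magic byte + schema ID framing.
  std::vector<char> payload;
  if (serdes->serialize(schema, &datum, payload, errstr) == -1) {
    std::cerr << "Serialisation failed: " << errstr << std::endl;
    return 1;
  }

  // Produce the framed payload with the plain librdkafka C++ producer.
  RdKafka::Conf *kconf = RdKafka::Conf::create(RdKafka::Conf::CONF_GLOBAL);
  kconf->set("bootstrap.servers", "localhost:9092", errstr);
  RdKafka::Producer *producer = RdKafka::Producer::create(kconf, errstr);

  producer->produce("test-topic", RdKafka::Topic::PARTITION_UA,
                    RdKafka::Producer::RK_MSG_COPY,
                    payload.data(), payload.size(),
                    /*key*/ NULL, 0, /*timestamp*/ 0, /*opaque*/ NULL);
  producer->flush(10 * 1000);

  delete producer;
  delete serdes;
  return 0;
}
```

Messages produced this way should be readable by the PostgreSQL Sink connector with the AVRO input format, since the schema ID framing matches what the connector’s deserialiser expects.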