Storing Avro fields as XML in a database

Hi,

This is regarding reading data from the Kafka Communication Layer in Avro format, where the payload contains several different fields.
This data currently gets stored in a database table through the JDBC sink connector in the same way, i.e. in different columns.
Could you please let me know if there is a way to store these different fields in a single XML value in the database through the JDBC sink connector?

@imk I’m not aware of an XML converter. The use case may require writing a custom converter. Here is documentation on converters: Kafka Connect Concepts | Confluent Documentation. And here is a nice post about creating custom converters: Custom Converters :: Debezium Documentation.
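For a rough idea of what that would involve on the sink side, here is a minimal sketch, not an existing library: the class name, the <record> element layout, and the XML_PAYLOAD field are all illustrative. It delegates the Avro decoding to the regular AvroConverter and then wraps the decoded fields into a single-field Struct holding the XML string, which the JDBC sink can write as one column:

import java.util.Map;

import io.confluent.connect.avro.AvroConverter;
import org.apache.kafka.connect.data.Field;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.storage.Converter;

// Hypothetical sink-side converter: Avro bytes -> Struct with one XML string field.
public class AvroToXmlConverter implements Converter {

    private static final Schema XML_SCHEMA =
            SchemaBuilder.struct().field("XML_PAYLOAD", Schema.STRING_SCHEMA).build();

    private final AvroConverter avro = new AvroConverter();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // the wrapped AvroConverter still needs schema.registry.url etc.
        avro.configure(configs, isKey);
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        SchemaAndValue decoded = avro.toConnectData(topic, value);
        if (decoded.value() == null) {
            return decoded; // tombstones pass through unchanged
        }
        Struct struct = (Struct) decoded.value(); // assumes a flat record schema
        StringBuilder xml = new StringBuilder("<record>");
        for (Field field : decoded.schema().fields()) {
            xml.append('<').append(field.name()).append('>')
               .append(struct.get(field)) // naive: no XML escaping, no nesting
               .append("</").append(field.name()).append('>');
        }
        xml.append("</record>");
        return new SchemaAndValue(XML_SCHEMA,
                new Struct(XML_SCHEMA).put("XML_PAYLOAD", xml.toString()));
    }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        // only the sink (deserialization) direction is implemented in this sketch
        throw new UnsupportedOperationException("serialization not implemented");
    }
}

You would then point value.converter at this class in the sink configuration instead of the plain AvroConverter.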

Hi, @imk!

There is this very simple SMT, which can convert the whole Kafka record payload to a single JSON or XML string: GitHub - an0r0c/kafka-connect-transform-tojsonstring. It's an unofficial community project that turns a given Connect Record into a single JSON string, with an option to post-process that into XML.

To have the payload in XML, you only need to add something like this to your JDBC Sink Connector configuration:

...
"transforms": "ValueToJson",
"transforms.ValueToJson.type": "com.github.cedelsb.kafka.connect.smt.Record2JsonStringConverter$Value",
"transforms.ValueToJson.post.processing.to.xml": "true",
"transforms.ValueToJson.json.string.field.name": "XML_PAYLOAD",
...
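For context, a complete sink configuration using that SMT could look roughly like this. The connector name, topic, connection details, and Schema Registry URL below are placeholders to replace with your own values:

{
  "name": "jdbc-sink-xml",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "my-avro-topic",
    "connection.url": "jdbc:postgresql://localhost:5432/mydb",
    "connection.user": "my-user",
    "connection.password": "my-password",
    "auto.create": "true",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "transforms": "ValueToJson",
    "transforms.ValueToJson.type": "com.github.cedelsb.kafka.connect.smt.Record2JsonStringConverter$Value",
    "transforms.ValueToJson.post.processing.to.xml": "true",
    "transforms.ValueToJson.json.string.field.name": "XML_PAYLOAD"
  }
}

The SMT collapses all the Avro fields into one string field named XML_PAYLOAD, so the JDBC sink writes a single column of that name instead of one column per field.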

Hope this helps.


Thanks for the response.
The SMT part is giving an error: invalid value for configuration transforms.ValueToJson.type.
Invalid value null for configuration.

Did you install the SMT itself in your cluster?
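If not, that would explain the error: an "Invalid value ... for configuration transforms.*.type" message usually means Connect cannot find the SMT class. The jar has to sit in a directory listed in the worker's plugin.path, and the workers need a restart afterwards. Something like this in the worker properties (the /opt/connect-plugins path is just an example):

# connect-distributed.properties (or connect-standalone.properties)
plugin.path=/usr/share/java,/opt/connect-plugins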

Yes, it is there on the Kafka Communication Layer, where messages are stored in Avro format.

Can you at least share your connector configuration and the exact error message that you see in the logs?