Hello,
We use a custom SMT with the IBM MQ sink connector to transform an Avro message to XML. Since we want to audit-log the XML generated by the SMT, the SMT also produces the XML payload string, along with other key fields, as an Avro message to an audit logging topic. We did this to avoid customising the Confluent-provided IBM MQ sink connector to perform this additional produce step.
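Roughly, the SMT has this shape (a simplified sketch only; the class, config, and helper names below are placeholders, not our actual code):

```java
import java.util.Map;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.transforms.Transformation;

// Placeholder sketch of the SMT: transforms the record value to XML and
// side-produces an audit event. Names are illustrative, not the real code.
public class AvroToXmlTransform<R extends ConnectRecord<R>> implements Transformation<R> {

    private KafkaProducer<String, Object> auditProducer; // value serializer: KafkaAvroSerializer
    private String auditTopic;

    @Override
    public R apply(R record) {
        // The SMT's actual job: turn the incoming (Struct) value into an XML string.
        String xml = convertToXml(record.value());

        // The extra side effect: publish an audit event for the same record.
        Object auditEvent = buildAuditEvent(xml, record);
        auditProducer.send(new ProducerRecord<>(auditTopic, auditEvent));

        // Hand the connector a String-valued record.
        return record.newRecord(record.topic(), record.kafkaPartition(),
                record.keySchema(), record.key(),
                Schema.STRING_SCHEMA, xml, record.timestamp());
    }

    @Override
    public void configure(Map<String, ?> configs) {
        auditTopic = (String) configs.get("audit.topic"); // placeholder config key
        // auditProducer construction omitted for brevity
    }

    @Override
    public ConfigDef config() { return new ConfigDef(); }

    @Override
    public void close() {
        if (auditProducer != null) auditProducer.close();
    }

    // Placeholders for the real conversion and audit-event construction.
    private String convertToXml(Object value) { return "<todo/>"; }
    private Object buildAuditEvent(String xml, R record) { return xml; }
}
```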
When the SMT code was tested in a local IDE (Eclipse), there were no errors. However, once the SMT was deployed to the Connect cluster, we get the error below. Is there any limitation on SMTs, or any setting we need to take care of once deployed to Kafka Connect? We run a self-managed Confluent Kafka Connect cluster.
```
IllegalArgumentException occurred while publishing the data into audit Topic : (com.xxx.audit.event.writer.AuditTopicWriter:150)
java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord
	at io.confluent.kafka.schemaregistry.avro.AvroSchemaUtils.getSchema(AvroSchemaUtils.java:141)
	at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:59)
	at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
	at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:952)
…
```
So, you convert between Avro and XML twice? You consume Avro (but inside Connect the value is just a Struct, not Avro anymore). Then you want an audit log of XML strings, yet you're serializing the payload back to Avro to send it? The error suggests you should be using a StringSerializer for XML strings…
You'll need to share the code that generates the Avro record.
SMTs do not have any limitation on producing to Kafka. It's just not recommended, since that's not really a "transform"; that said, it is how the dead letter queue feature works.
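For reference, KafkaAvroSerializer can only infer a schema for the types listed in the exception, so handing it a Connect Struct (or any other POJO) fails exactly like this. If the audit payload is just the XML string, a plain StringSerializer avoids the problem entirely; a minimal sketch, with broker address, topic, and key invented for illustration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class XmlAuditStringProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // A raw XML string needs no schema: StringSerializer accepts any String,
        // so there is no "Unsupported Avro type" to hit.
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>(
                    "audit-topic",                  // placeholder topic
                    "order-123",                    // placeholder key
                    "<order id=\"123\">...</order>"));
        }
    }
}
```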
No, the generated XML is included in an audit log event (the XML string is the event's payload, i.e. "payload": "<>") and produced to an audit log topic as a generic record using the Avro serialiser, along with other details of source, target, key columns, etc., per the audit log event schema.
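For illustration, building that generic record might look like the sketch below; the schema and field names here are invented placeholders, not our actual audit schema. The key point is that the value handed to KafkaAvroSerializer must be an org.apache.avro.generic.GenericRecord (an IndexedRecord), not a Connect Struct or a plain POJO:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.connect.data.Struct;

public class AuditEventBuilder {

    // Invented audit-event schema based on the fields described above;
    // the real schema and field names will differ.
    private static final Schema AUDIT_SCHEMA = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"AuditEvent\",\"fields\":["
      + "{\"name\":\"source\",\"type\":\"string\"},"
      + "{\"name\":\"target\",\"type\":\"string\"},"
      + "{\"name\":\"key\",\"type\":\"string\"},"
      + "{\"name\":\"payload\",\"type\":\"string\"}]}");

    // Copy plain values out of the Connect Struct so the serializer only
    // ever sees Strings and an IndexedRecord, never the Struct itself.
    public static GenericRecord build(Struct sinkValue, String xml) {
        GenericRecord event = new GenericData.Record(AUDIT_SCHEMA);
        event.put("source", sinkValue.getString("sourceSystem")); // assumed field name
        event.put("target", "IBM MQ");                            // assumed constant
        event.put("key", sinkValue.getString("businessKey"));     // assumed field name
        event.put("payload", xml);
        return event;
    }
}
```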
You'll need to share the code that generates the Avro record.
Will do on Monday; I don't have access to it right now.
Thanks for confirming that SMTs do not have any such limitation. I understand it is not recommended, as the purpose of an SMT is to transform the record before it is produced.
The dead letter queue feature is for failed messages, as I understand it? In our case, we want to audit-log every source event and every transformed event processed by the pipeline.