Sending Avro-encoded messages to SQS as JSON

Hello!

We have a topic that contains Avro-encoded data, and we're sending those messages to SQS using Nordstrom's SQS sink connector (GitHub - Nordstrom/kafka-connect-sqs: The SQS connector plugin provides the ability to use AWS SQS queues as both a source (from an SQS queue into a Kafka topic) or sink (out of a Kafka topic into an SQS queue).).

The connector config is pretty straightforward:

{
  "connector.class": "com.nordstrom.kafka.connect.sqs.SqsSinkConnector",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "sqs.wait.time.seconds": "5",
  "topics": "uachieve-etl.etl_completed",
  "name": "sqs-sink-test",
  "sqs.max.messages": "5",
  "sqs.queue.url": "redacted",
  "value.converter.schema.registry.url": "http://schema-registry:8080"
}

Once decoded, the data in the topic is just a small JSON document:

{
  "record_id": "5702000",
  "record_type": "student",
  "job": "1072968000226BST",
  "transaction": "db0c611d-8167-45ef-b23b-60078d691d4d",
  "institution": "004069",
  "queue": "what_if",
  "event": "etl.completed",
  "successful": true,
  "created_at": 1651073035119812
}

But it's ending up in SQS as a stringified Connect Struct:

Struct{
record_id=1862076,
record_type=student,
job=0959634000244BST,
transaction=1285cef9-4e78-4c77-b921-b6952ed09111,
institution=003969,
queue=what_if,
event=etl.completed,
successful=true,
created_at=1650959648825711
}

Is there any way I can send SQS the JSON string version instead of this Struct?

IMO, you'd be better off asking that developer directly - Issues · Nordstrom/kafka-connect-sqs · GitHub

But last I checked, this connector only supports string or byte-array converters, because it always calls toString() on the Connect record value - which is exactly the Struct{...} output you're seeing. You would therefore need to write or use a single message transform (SMT) that converts the Avro/Struct value to a JSON string before it reaches the sink.
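For example, if I remember right, the jcustenborder kafka-connect-transform-common package ships a ToJSON transform that re-serializes the Struct as a JSON string. Treat the class name and option below as assumptions to verify against that project's docs - this is just a sketch of how it would slot into your existing config:

```json
{
  "connector.class": "com.nordstrom.kafka.connect.sqs.SqsSinkConnector",
  "topics": "uachieve-etl.etl_completed",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://schema-registry:8080",

  "transforms": "toJson",
  "transforms.toJson.type": "com.github.jcustenborder.kafka.connect.transform.common.ToJSON$Value",
  "transforms.toJson.schemas.enable": "false"
}
```

With the transform in place, the connector's toString() call lands on a plain JSON string rather than a Struct, so the SQS message body comes out as JSON.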

You could also use Kafka Streams or ksqlDB to continuously copy your Avro topic into a JSON topic, then have the SQS connector read from that instead.
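A ksqlDB version of that re-serialization might look something like this - the stream names and the JSON topic name are made up, and the schema is inferred from your Schema Registry:

```sql
-- Register the existing Avro topic as a stream
-- (column definitions come from Schema Registry)
CREATE STREAM etl_completed_avro WITH (
  KAFKA_TOPIC = 'uachieve-etl.etl_completed',
  VALUE_FORMAT = 'AVRO'
);

-- Continuously re-serialize every record into a JSON-valued topic
CREATE STREAM etl_completed_json WITH (
  KAFKA_TOPIC = 'uachieve-etl.etl_completed.json',
  VALUE_FORMAT = 'JSON'
) AS SELECT * FROM etl_completed_avro;
```

Then point the SQS sink's "topics" at the JSON topic and switch value.converter to org.apache.kafka.connect.storage.StringConverter, so the value passes through to SQS as the plain JSON string.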