Amazon SQS source connector ValueToKey;extractKey

Hi All,
I am looking for a way to extract a key from the SQS source connector's value record, whose schema is a string. Can I use JSONPath to achieve that?

Should be possible, yes, but there's no built-in transform for JSONPath.

You could also use ksqlDB, for example, to parse your string and do the same.
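Here's a minimal sketch of that approach, assuming the connector writes the raw message body as a plain string to a topic called sqs-source-topic and that the body is JSON containing an order_id field (all names here are placeholders):

```sql
-- Declare a stream over the raw topic; the whole value is a single string column.
CREATE STREAM sqs_raw (body VARCHAR)
  WITH (KAFKA_TOPIC = 'sqs-source-topic', VALUE_FORMAT = 'KAFKA');

-- EXTRACTJSONFIELD takes a JSONPath-style expression to pull a field out of the string.
SELECT EXTRACTJSONFIELD(body, '$.order_id') AS order_id, body
FROM sqs_raw
EMIT CHANGES;
```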

Thank you for your response. I am working with the Confluent team to get an example of using ksqlDB.

You can find some examples at Working with nested JSON using ksqlDB

Thank you for the response. Does this mean I need to first load SQS into a single-partition topic (to maintain order) and then use ksqlDB to create a key for another topic with multiple partitions (for scaling) to consume from?

ksqlDB is just one option, but yes, you'll need to consume, parse, and then write to another topic with a new key in order to scale consumption.
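Continuing the hypothetical sketch from earlier, the re-keying step could be a persistent query that writes to a new multi-partition topic with the extracted field as the key:

```sql
-- Re-key into a new, multi-partition topic so consumption can scale out.
-- Topic name and partition count are placeholders.
CREATE STREAM sqs_rekeyed
  WITH (KAFKA_TOPIC = 'sqs-rekeyed', PARTITIONS = 6) AS
SELECT EXTRACTJSONFIELD(body, '$.order_id') AS order_id, body
FROM sqs_raw
PARTITION BY EXTRACTJSONFIELD(body, '$.order_id');
```

Records with the same key still land in the same partition in order; you only give up total ordering across the topic.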

Another option would be a Kafka producer inside a Lambda function triggered by SQS.

So I think I can't horizontally scale the SQS source connector even using ksqlDB.

The connector gives great advantages like at-least-once delivery and keeping order using the key. If I use Lambda and write my own producer, can I achieve at-least-once delivery at scale?

can't horizontally scale… even using ksqlDB

You'd use ksqlDB (or any other consumer, as mentioned) to write to a new, more distributed Kafka topic. Of course, you'd then lose total ordering across the topic, though records with the same key still arrive in order within their partition. The source connector doesn't really matter here.

connector gives great advantages like at-least-once delivery

I'm not completely familiar with SQS, but I suspect you'd get at-most-once delivery.

For example, if the Lambda or connector consumed the event (removing it from the queue) but your producer then failed to forward the record to Kafka, that data is gone.

But yes, the full event details are parsable, so you can set the key and/or partition to whatever you want. You could even send events to multiple topics at once.

Actually, the connector guarantees at-least-once delivery and supports multiple tasks. Please see here: https://docs.confluent.io/kafka-connect-sqs/current/overview.html

This is hard to build if I use SQS Lambda triggers.

Before I jump into trying Lambda, do we have a Header To Key SMT, or custom SMT support in Confluent Cloud? Also, can I use multiple SQS source connector tasks producing records to a single-partition topic?

Yes, my bad, I see the SQS connector documentation says it natively offers at-least-once delivery…

custom SMT support in Confluent Cloud?

Not that I know of.

Header To Key SMT

https://jcustenborder.github.io/kafka-connect-documentation/projects/kafka-connect-transform-common/transformations/HeaderToField.html
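As a rough, untested sketch, wiring it into a connector config would look something like this; the header name and target field are hypothetical, and the exact header.mappings syntax (`<header>:<type>:<field>`) should be checked against that page:

```
transforms=headerToKey
transforms.headerToKey.type=com.github.jcustenborder.kafka.connect.transform.common.HeaderToField$Key
transforms.headerToKey.header.mappings=MessageDeduplicationId:STRING:messageKey
```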

use multiple SQS source connector tasks producing records to a single-partition topic

I cannot think of why that wouldn’t be possible. The connector just acts as a producer and can write from as many queues as needed into one or many Kafka topics.

hard to build if I use SQS Lambda triggers.

I don't think so. You'd just import a relevant Kafka producer client and call its send method, just like you would from any other app outside of Lambda. Choose the language you're most familiar with.
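Here's a minimal sketch in Java, assuming the aws-lambda-java-events and kafka-clients dependencies and a hypothetical topic name. The part that matters for at-least-once is blocking until each send is acknowledged and letting failures throw, so SQS redelivers the batch instead of silently dropping it:

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class SqsToKafkaHandler implements RequestHandler<SQSEvent, Void> {

    // Created once per container so warm invocations reuse the connection.
    private static final KafkaProducer<String, String> producer = createProducer();

    private static KafkaProducer<String, String> createProducer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", System.getenv("BOOTSTRAP_SERVERS"));
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("acks", "all"); // only count a send as successful once fully acknowledged
        return new KafkaProducer<>(props);
    }

    @Override
    public Void handleRequest(SQSEvent event, Context context) {
        for (SQSEvent.SQSMessage msg : event.getRecords()) {
            try {
                // Key choice is up to you; the SQS messageId here is just an illustration.
                producer.send(new ProducerRecord<>("sqs-events", msg.getMessageId(), msg.getBody()))
                        .get(); // block until the broker acknowledges the write
            } catch (Exception e) {
                // Failing the invocation leaves the batch in SQS for redelivery:
                // at-least-once delivery, at the cost of possible duplicates.
                throw new RuntimeException("Kafka write failed", e);
            }
        }
        return null;
    }
}
```

Blocking per record keeps the failure semantics obvious; collecting the futures and checking them all before returning would be faster.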

You mean to say I can use multiple tasks, but do you also mean I can use them to source from a single SQS queue into a single-partition topic?

The HeaderToField$Key transform looks promising. Is it offered in Confluent Cloud?

Thank you for your help.

The documentation says the connector supports multiple tasks.

That SMT is a custom one. I don't use Confluent Cloud, so I don't know what is or isn't available there.

Thank you for all your help!
