I'm new to Kafka Connect. I'm trying to set up a Redis Sink Connector that reads an Avro Kafka topic, converts the records into JSON values, and stores them in Redis.
The connector runs without any errors, but the data it writes is unusable.
The data stored in Redis is \x00\x00\x00\x00\x02\x18testUser, while the expected value would be "{\"COL\":\"testUser\"}".
As per the documentation (Redis Sink storage of bytes and strings), if the data is not in string format it is stored as a byte array, and that data should be deserialized by the applications consuming it. A node.js application consumes the data from Redis, and I am unable to deserialize the byte data there either.
I am looking into the feasibility of storing the data in Redis via the sink connector and consuming it directly from other applications.
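To illustrate the consuming side, this is roughly what the node application does today (a simplified sketch only; ioredis is shown as an example client, and the host and key names are placeholders):

```js
// Simplified sketch of the current consumer. Client choice, host and key are placeholders.
const Redis = require("ioredis");

async function readValue(key) {
  const redis = new Redis("redis://localhost:6379");
  const value = await redis.get(key); // comes back as a mangled string, not JSON
  redis.disconnect();
  return JSON.parse(value);           // throws, because the stored bytes are not a JSON string
}
```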
@Vibun, can you post the configuration for the sink connector (without any secret data), and can you clarify your goal? Are you trying to figure out how to modify the Node.js application to read the Avro-formatted bytes from Redis, or are you wanting to write the data into Redis as a string? The documentation you cite provides some details on using Single Message Transforms to convert the data to a string prior to writing to Redis; does that help with your goal?
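For what it's worth, the leading \x00\x00\x00\x00\x02 in the value you posted looks like the Confluent Schema Registry wire format (a zero magic byte, a 4-byte schema id, then the Avro binary), which would mean the raw topic bytes are being copied into Redis unchanged. If the goal ends up being to decode those bytes in Node.js, a rough sketch is below. It assumes the wire format above, that you know the value schema, and that the ioredis and avsc packages are available, so treat it as a starting point rather than a drop-in answer:

```js
// Rough sketch only: assumes Confluent wire format, a known Avro value schema,
// and the ioredis + avsc packages. Record/field names and the host are placeholders.
const Redis = require("ioredis");
const avro = require("avsc");

// Writer schema for the topic value; replace with your real schema, or look it
// up in Schema Registry using the id embedded in the bytes.
const valueType = avro.Type.forSchema({
  type: "record",
  name: "User",
  fields: [{ name: "COL", type: "string" }],
});

async function readAvroValue(key) {
  const redis = new Redis("redis://localhost:6379"); // placeholder host
  const raw = await redis.getBuffer(key);            // raw bytes, not a utf-8 string
  redis.disconnect();

  // Confluent wire format: 0x00 magic byte, 4-byte big-endian schema id, Avro payload.
  if (raw.readUInt8(0) !== 0) {
    throw new Error("Value is not in Confluent wire format");
  }
  const schemaId = raw.readInt32BE(1); // useful if you fetch the schema dynamically
  const record = valueType.fromBuffer(raw.subarray(5));
  return JSON.stringify(record);       // e.g. {"COL":"testUser"}
}
```

There are also packages such as @kafkajs/confluent-schema-registry that can resolve the embedded schema id against the registry for you, if hard-coding the schema is not an option.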
I tried with the ByteArrayConverter as well, but got the same result: \x00\x00\x00\x00\x02\x18testUser
My goal is either of the following:
1. Write the data into Redis as a string from the Avro Kafka topic. I tried using an SMT to convert the columns into strings, but that did not help and I got the same result.
2. Alternatively, if you could let me know how to modify the node.js app to read the Avro-formatted bytes from Redis (beyond the sketch you posted above), that would also be helpful.
I prefer solution 1, as consuming the data from Redis directly as a string would be straightforward on the node application side.
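For reference, my configuration is essentially the stock Redis sink setup with a ByteArrayConverter, along these lines (property names are reproduced from memory of the jcustenborder kafka-connect-redis docs and may differ by version; the topic and host values are placeholders, and secrets are removed):

```properties
# Rough sketch of the current sink config (ByteArrayConverter variant).
# Property names from the connector docs, may differ by version;
# topic and host values are placeholders.
name=redis-sink-avro
connector.class=com.github.jcustenborder.kafka.connect.redis.RedisSinkConnector
topics=USERS_AVRO
redis.hosts=redis:6379
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
```

With this setup the connector appears to copy the serialized topic bytes into Redis as-is, which matches the \x00\x00... values I am seeing, so I suspect the fix for solution 1 lies in the converter/SMT settings rather than in the connector itself.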