Special characters received from Kafka Confluent topic

Hi all,

I am receiving special characters at the beginning of a JSON message from a Kafka Confluent topic while consuming it through @KafkaListener in a Spring Boot microservice.

Because of those special characters at the beginning of the message, it does not auto-deserialize/map to my Java POJO class. As a workaround I am receiving the message as a String, removing the special characters in Java code, and then mapping it into the POJO class.

Below are the different error messages I hit while trying to solve this issue:

1> Unexpected character ('%' (code 37)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')

2> exception while type casting message to Request : Illegal character ((CTRL-CHAR, code 0)): only regular white space (\r, \n, \t) is allowed between tokens

3> "\u0000\u0000\u0000\u0000l{
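For what it's worth, the prefix in error 3 matches Confluent's wire format: schema-registry-aware serializers prepend a magic byte (0) and a 4-byte big-endian schema ID before the serialized payload, which is exactly what a StringDeserializer then surfaces as `\u0000` garbage. A minimal sketch that splits such a record apart (the byte values mirror the error message above; the class name is just for illustration):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Illustrative only: splits a Confluent wire-format record into its
// 5-byte header (magic byte + schema ID) and the remaining payload.
public class WireFormat {

    public static int schemaId(byte[] record) {
        ByteBuffer buf = ByteBuffer.wrap(record);
        byte magic = buf.get();   // always 0 in the current wire format
        if (magic != 0) {
            throw new IllegalArgumentException("Not Confluent wire format");
        }
        return buf.getInt();      // 4-byte big-endian schema ID
    }

    public static String payload(byte[] record) {
        // Everything after the 5-byte header is the serialized value.
        return new String(record, 5, record.length - 5, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Mirrors the bytes from error 3: \u0000\u0000\u0000\u0000l{ ...
        byte[] record = {0, 0, 0, 0, 'l', '{', '}'};
        System.out.println("schema id: " + schemaId(record)); // 'l' == 0x6C == 108
        System.out.println("payload:   " + payload(record));
    }
}
```

So the "special characters" are not corruption; they are a header the matching deserializer is expected to consume, which is why stripping them by hand in Java code only papers over the mismatch.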

For deserialization I am using the below configuration:

props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

props.put("schema.registry.url", "https://xxxxxxxxxx-central-1.aws.confluent.cloud");

props.put(SaslConfigs.SASL_JAAS_CONFIG, "org.apache.kafka.common.security.plain.PlainLoginModule required username=xxxxxxxxx password='xxxxxxxxx';");

props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");

I have uploaded a picture of the special characters, highlighted in blue.

Any comments and suggestions will be appreciated. Thanks!


Your data isn't a String. It appears to have been written with a different serializer, perhaps Avro, and therefore is not plain JSON. Check the serializer properties of your producer code before writing your consumer.
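If the producer is using one of Confluent's schema-registry serializers, the matching deserializer will consume that 5-byte header and do the POJO mapping for you, so no manual string cleanup is needed. A sketch of the consumer-side change, assuming standard Confluent class names; which variant applies depends on what the producer actually configured, and `Request` here stands in for your own POJO:

```java
// Assuming the producer uses Confluent's Avro serializer:
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
        "io.confluent.kafka.serializers.KafkaAvroDeserializer");
props.put("specific.avro.reader", "true"); // deserialize into generated classes

// Or, if the producer uses the JSON Schema serializer (the '{' right
// after the 5-byte header suggests a JSON payload):
// props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
//         "io.confluent.kafka.serializers.json.KafkaJsonSchemaDeserializer");
// props.put("json.value.type", Request.class.getName());

// Either way, the schema.registry.url property from your existing
// config is still required so the deserializer can look up the schema
// by the ID embedded in the record header.
```

With this in place the @KafkaListener method can take the POJO type directly instead of a String.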
