Considerations for Manual Construction of a Serializer/Deserializer

If you manually construct a Confluent serializer, deserializer, or Serde that leverages Schema Registry, don't do this:

// Only the Schema Registry URL and the specific-reader flag are set here;
// any authentication settings from the client configuration are lost.
Map<String, Object> map = new HashMap<>();
map.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
map.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, properties.getSchemaRegistryUrl());
Serde<User> serde = new SpecificAvroSerde<>();
serde.configure(map, false); // false = configure as a value serde, not a key serde

This works fine in your local, non-secured (Docker) environment. However, suppose you promote this code to a higher-level environment that uses basic authentication to lock down Schema Registry. It won't work, even if you correctly added the authentication properties to your Kafka client properties.

The following properties must be provided to the serializer and deserializer:

basic.auth.credentials.source=USER_INFO
basic.auth.user.info=user:password
schema.registry.ssl.truststore.location=/secrets/truststore.jks
schema.registry.ssl.truststore.password=myAwesomePassword
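If you can't reuse your client configuration wholesale, a minimal sketch of supplying those same settings to the serde directly might look like the following; the URL, credentials, and truststore values are placeholders, and the keys are passed as plain strings to match the properties above:

Map<String, Object> map = new HashMap<>();
map.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "https://schema-registry:8081"); // placeholder
// the same keys as the properties listed above
map.put("basic.auth.credentials.source", "USER_INFO");
map.put("basic.auth.user.info", "user:password");
map.put("schema.registry.ssl.truststore.location", "/secrets/truststore.jks");
map.put("schema.registry.ssl.truststore.password", "myAwesomePassword");
map.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
Serde<User> serde = new SpecificAvroSerde<>();
serde.configure(map, false);

That said, duplicating settings like this invites drift between the client and serde configuration, which is exactly what the approach below avoids.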

So do this instead: start from your typical client configuration, and if a serde needs a setting that differs from it, override just that one setting, as in:

// Start from the full client configuration (Schema Registry URL, auth, SSL, ...)
Map<String, Object> map = new HashMap<>(kafkaConfig.properties());
// ...and override only the setting that differs for this serde
map.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
Serde<User> serde = new SpecificAvroSerde<>();
serde.configure(map, false);
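For context, a configured serde like this would typically be wired into a Kafka Streams topology; a minimal sketch (the topic name "users" is illustrative):

StreamsBuilder builder = new StreamsBuilder();
// the serde configured above deserializes User records from the topic
KStream<String, User> users = builder.stream("users", Consumed.with(Serdes.String(), serde));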

Where is the schema registry URL coming from in the second example? But true, they might need some additional config from the client config.

Also, specific Avro only works with the correct generated classes on the classpath; at first I thought the post was about that.

In my example, the schema registry URL is part of kafkaConfig.properties(). Kafka clients will ignore the extra parameters (though they do log them). And yes, specific Avro requires a generated POJO for each of the objects being serialized/deserialized.
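To make that concrete, here is a hypothetical sketch of what a kafkaConfig.properties() helper might return; the method, keys, and values shown are assumptions for illustration, not the actual code behind the post:

// Hypothetical helper, not the poster's actual code: returns the full
// client configuration, including the Schema Registry settings listed
// earlier, so serdes built from it inherit them.
public Map<String, Object> properties() {
    Map<String, Object> props = new HashMap<>();
    props.put("bootstrap.servers", "broker:9092");                     // placeholder
    props.put("schema.registry.url", "https://schema-registry:8081");  // placeholder
    props.put("basic.auth.credentials.source", "USER_INFO");
    props.put("basic.auth.user.info", "user:password");
    props.put("schema.registry.ssl.truststore.location", "/secrets/truststore.jks");
    props.put("schema.registry.ssl.truststore.password", "myAwesomePassword");
    return props;
}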