If you manually construct a Confluent Serializer, Deserializer, or Serde that leverages Schema Registry, don’t create it like this:
HashMap<String, Object> map = new HashMap<>();
map.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
map.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, properties.getSchemaRegistryUrl());
Serde<User> serde = new SpecificAvroSerde<>();
serde.configure(map, false);
This works fine in your local, non-secured (Docker) environment. However, let’s say you promote this code to a higher-level environment that uses basic authentication to lock down the schema registry. It won’t work, even if you added your authentication properties correctly to your Kafka client properties — the serde never sees them.
The following properties must be provided to the serializer and deserializer:
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=user:password
schema.registry.ssl.truststore.location=/secrets/truststore.jks
schema.registry.ssl.truststore.password=myAwesomePassword
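These same settings can be supplied programmatically when you configure the serde. A minimal sketch using the plain string keys (the Confluent config constants resolve to these strings); the URL, credentials, and truststore path are placeholder values, not ones from the original:

```java
import java.util.HashMap;
import java.util.Map;

public class SerdeAuthConfig {
    // Builds the configuration map a Confluent serde needs to reach a
    // Schema Registry protected by basic auth and TLS. The keys match the
    // properties listed above; all values here are placeholders.
    static Map<String, Object> schemaRegistryConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put("schema.registry.url", "https://schema-registry:8081");
        config.put("basic.auth.credentials.source", "USER_INFO");
        config.put("basic.auth.user.info", "user:password");
        config.put("schema.registry.ssl.truststore.location", "/secrets/truststore.jks");
        config.put("schema.registry.ssl.truststore.password", "myAwesomePassword");
        return config;
    }

    public static void main(String[] args) {
        // The whole map — auth entries included — is what serde.configure() should receive.
        System.out.println(schemaRegistryConfig().get("basic.auth.credentials.source"));
    }
}
```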
So do this instead: start from your typical client configuration, and if a serde needs a setting that differs from it, override just that one setting:
Map<String, Object> map = new HashMap<>(kafkaConfig.properties());
map.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
Serde<User> serde = new SpecificAvroSerde<>();
serde.configure(map, false);
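The key point is the copy-then-override pattern: the serde inherits everything from the shared client configuration — including the authentication properties — and changes only what it needs. A plain-Java sketch of that pattern, where the base map stands in for the (not shown) kafkaConfig.properties() and its values are placeholders:

```java
import java.util.HashMap;
import java.util.Map;

public class SerdeOverride {
    public static void main(String[] args) {
        // Base client configuration, as it might arrive from kafkaConfig.properties().
        // The auth entries are placeholders standing in for real secured values.
        Map<String, Object> base = new HashMap<>();
        base.put("schema.registry.url", "https://schema-registry:8081");
        base.put("basic.auth.credentials.source", "USER_INFO");
        base.put("basic.auth.user.info", "user:password");
        base.put("specific.avro.reader", "false");

        // Copy the base, then override only the one setting this serde needs.
        Map<String, Object> serdeConfig = new HashMap<>(base);
        serdeConfig.put("specific.avro.reader", "true");

        // The auth settings are inherited; only the override differs,
        // and the base map is left untouched.
        System.out.println(serdeConfig.get("basic.auth.user.info")); // user:password
        System.out.println(serdeConfig.get("specific.avro.reader")); // true
        System.out.println(base.get("specific.avro.reader"));        // false
    }
}
```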