I get the following error at runtime, every time a Kafka event is produced in my production environment. Please note that the same code works in my local setup without any issues, and all Jenkins builds succeed. The error is the following:
Caused by: o.a.k.c.c.ConfigException: Invalid value io.confluent.kafka.serializers.context.NullContextNameStrategy for configuration context.name.strategy: Class io.confluent.kafka.serializers.context.NullContextNameStrategy could not be found.
This is my Kafka config class:
@Bean
@Qualifier("json-schema")
public KafkaTemplate<String, MyEvent<?>> jsonSchemaKafkaTemplate() throws ClassNotFoundException {
    return new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(
            setCommonProducerProperties(),
            new StringSerializer(),
            new KafkaJsonSchemaSerializer<>()));
}
private Map<String, Object> setCommonProducerProperties() {
    // Setting common properties
    Map<String, Object> configProps = setBrokerProperties();
    configProps.put(ProducerConfig.RETRIES_CONFIG, KafkaPropertiesConfiguration.JsonSchemaProperties.RETRIES);
    configProps.put(ProducerConfig.CLIENT_ID_CONFIG,
            KafkaPropertiesConfiguration.JsonSchemaProperties.CLIENT_ID); // value assumed; the original line was truncated
    return configProps;
}
private Map<String, Object> setBrokerProperties() {
    Map<String, Object> brokerProperties = new HashMap<>();
    brokerProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaPropertiesConfiguration.getBroker().getBootstrapServers());
    brokerProperties.put(SCHEMA_REGISTRY_URL_CONFIG, kafkaPropertiesConfiguration.getBroker().getSchemaRegistryUrl());
    KafkaPropertiesConfiguration.SecurityProperties securityProperties = kafkaPropertiesConfiguration.getBroker().getSecurity();
    // (... Keystore and Truststore credentials)
    return brokerProperties;
}
It is worth mentioning that some Kafka classes could not be found and loaded by the context class loader at runtime, such as StringSerializer and KafkaJsonSchemaSerializer; for this reason I have instantiated both classes via their constructors in my Kafka template. Otherwise I get the same error message that these classes could not be found.
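One caveat worth noting with this workaround: when serializer instances are passed to the producer like this, the Kafka client does not call configure() on them, so a schema-registry-aware serializer such as KafkaJsonSchemaSerializer generally needs to be configured explicitly or it won't know the registry URL. A minimal sketch, assuming the same property map as in the bean above:

```java
// Sketch: serializer instances supplied to the producer are NOT configured
// by the Kafka client (it only configures serializers it instantiates itself
// from the serializer class config), so call configure() by hand with the
// same properties, including schema.registry.url.
KafkaJsonSchemaSerializer<MyEvent<?>> valueSerializer = new KafkaJsonSchemaSerializer<>();
valueSerializer.configure(setCommonProducerProperties(), /* isKey = */ false);

KafkaTemplate<String, MyEvent<?>> template = new KafkaTemplate<>(
        new DefaultKafkaProducerFactory<>(setCommonProducerProperties(),
                new StringSerializer(), valueSerializer));
```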
The NullContextNameStrategy class should come from kafka-schema-serializer-7.6.0.jar, so double-check that this jar is on the classpath in your production environment.
I suspect that it will be there, and that the real problem is the classloader picking up a conflicting older version of that dependency. A brute-force way to check for a conflict is to look for jars containing a class that has been around for a long time, like RecordNameStrategy. For each directory on the classpath:
for jar in *.jar; do jar tf "$jar" | grep -q 'io/confluent/kafka/serializers/subject/RecordNameStrategy.class' && echo "$jar"; done
See if a search like this turns up more than one result, and, if so, try to figure out where they’re each coming from (e.g., via mvn dependency:tree).
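Another way to see which copy actually wins at runtime is to ask the JVM directly where it loads the class from. A small diagnostic sketch (the class and its output format are my own, not part of any library):

```java
import java.security.CodeSource;

// Diagnostic sketch: print which jar (code source) the JVM loads a class
// from, to spot a conflicting/duplicate jar on the production classpath.
public class WhichJar {
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // JDK bootstrap classes have no code source
            return src == null ? "(bootstrap classloader)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not found on classpath)";
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0 ? args[0]
                : "io.confluent.kafka.serializers.context.NullContextNameStrategy";
        System.out.println(name + " -> " + locate(name));
    }
}
```

Run it with your production classpath; if the printed jar is not kafka-schema-serializer-7.6.0.jar (or nothing is found), you have located the conflict.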
Finally, one thing that might be related, since this feels like a transitive dependency version conflict: I see here that spring-kafka 3.1.x is compatible with kafka-clients 3.6.0. I'm not sure whether the version 3.6.1 in your POM is problematic or "close enough", but it's worth looking into.
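If the mismatch does turn out to matter, one way to align the versions (a sketch assuming a Maven build; adjust the coordinates to your actual POM) is to pin kafka-clients explicitly:

```xml
<!-- Sketch: pin kafka-clients to the version spring-kafka 3.1.x was built against -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>3.6.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```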
Hello,
I have checked my jars, and indeed kafka-schema-serializer-7.6.0.jar is present on the classpath along with all the other related jars downloaded from https://packages.confluent.io/maven.
The command line you provided doesn't return any results. I also downgraded my kafka-clients to 3.6.0. Unfortunately, I still get the same ConfigException.
This shouldn't be the case; it should return exactly one result. For example, jar tf kafka-schema-serializer-7.6.0.jar should list RecordNameStrategy.class (and NullContextNameStrategy.class). Can you double-check this?
The for loop is intended to detect whether there is more than one result, which would also be problematic. If you're not getting any result at all, maybe that jar is corrupted or empty in your production environment.
I'm facing the same problem as Salah. My code runs completely fine locally, but when I package it and run it in Docker, it always throws the error below when creating consumers:
Caused by: o.a.k.c.c.ConfigException: Invalid value io.confluent.kafka.serializers.context.NullContextNameStrategy for configuration context.name.strategy: Class io.confluent.kafka.serializers.context.NullContextNameStrategy could not be found.
The confusing thing is that I can find and import that NullContextNameStrategy class, yet the ConfigException still occurs :<
Hello,
I could not figure out why the Kafka dependencies could not be found when running in Docker, but I managed to find a workaround so that the Kafka producer starts successfully with the schema registry: instantiating the serializers via their constructors will most likely solve the class-not-found problem at runtime.
You will find below a snippet of my Kafka Config File: