Error Registering Avro Schema - Serialization Exception

While researching ways to load test our Kafka topics, we stumbled upon Voluble, which has worked beautifully for our initial testing. Although we are able to generate messages into the topics, we have not had the same luck when trying to use the AvroConverter available on Confluent Hub.

Here’s what I’ve read and looked into, so far:

Error using AvroConverter · Issue #5 · MichaelDrogalis/voluble · GitHub - this issue was specifically about using the AvroConverter, and it appears to have been fixed.

Error registering Avro schema: Register schema operation failed while writing to the Kafka store; error code: 50001 · Issue #1176 · confluentinc/schema-registry · GitHub - when Googling the current error we are facing, I found this issue, which is similar to ours (error code 50001 vs. our 50005).

So all in all, the error I’m currently getting looks something like this:

Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: ["null","string"]
Caused by: Error; error code: 50005
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(
at io.confluent.connect.avro.AvroConverter$Serializer.serialize(
at io.confluent.connect.avro.AvroConverter.fromConnectData(
at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(
at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(
at org.apache.kafka.connect.runtime.WorkerTask.doRun(
at java.base/java.util.concurrent.Executors$
at java.base/
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(
at java.base/java.util.concurrent.ThreadPoolExecutor$
at java.base/

Right now, I’ve modified the file to set both the key and value converter to io.confluent.connect.avro.AvroConverter, and the schema registry URL points to our internal schema registry. The first area we think could be causing issues is the schema registry URL endpoint: does it have to be reachable via curl, for instance?
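For reference, here is a minimal sketch of the converter-related settings we have in mind for the worker/connector properties file. The registry hostname below is a placeholder, not our actual URL:

```properties
# Hypothetical converter settings for Avro with Schema Registry.
# Replace the registry URL with your internal endpoint.
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry.internal:8081
value.converter.schema.registry.url=http://schema-registry.internal:8081
```

As far as I understand, the registry does need to be reachable over HTTP from the Connect worker itself; as a sanity check, something like `curl http://schema-registry.internal:8081/subjects` (with the placeholder swapped for the real URL) should return a JSON array if the registry is up.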

Additionally, it seems as if the values we are setting in the properties file that describes the messages we expect to generate are not being converted. Within that properties file, we are just adding lines that look something like this:

Genv.LegacyGpsV6.latitude.with = 0

From my understanding, we do not have to send this in as a JSON file or anything; we are just using a properties file to pass in these values. Before we tried the AvroConverter, this properties file was publishing messages to Kafka just fine; after switching to the AvroConverter, it no longer works. Is our inability to use the AvroConverter a result of incorrect formatting, or is there something else we are missing?
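In case it helps pinpoint a formatting problem, here is a rough sketch of how Voluble generator directives are usually written in the connector config. The topic and field names are ours; the key directive and the Faker expressions are illustrative, and I note that the examples I have seen use a lowercase genv prefix rather than Genv:

```properties
# Hypothetical Voluble directives (placed in the connector config).
# genkp/genv syntax per Voluble's examples; expressions are illustrative.
genkp.LegacyGpsV6.with=#{Internet.uuid}
genv.LegacyGpsV6.latitude.with=#{Address.latitude}
```

Whether a bare literal like 0 is accepted where a Faker expression is expected is part of what we are unsure about.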

Any advice is appreciated!