Error Registering Avro Schema - Serialization Exception

While researching ways to load test our Kafka topics, we stumbled upon Voluble, which has worked beautifully for our initial testing. Although we are able to generate messages into the topics, we have not had the same luck trying to use the AvroConverter available on Confluent Hub.

Here’s what I’ve read and looked into so far:

Error using AvroConverter · Issue #5 · MichaelDrogalis/voluble · GitHub - this issue covers the same AvroConverter usage, but it appears to have been fixed.

Error registering Avro schema: Register schema operation failed while writing to the Kafka store; error code: 50001 · Issue #1176 · confluentinc/schema-registry · GitHub - when Googling the error we are currently facing, I found this issue, which looks similar to ours, though the error code differs (50001 there vs. our 50005).

So all in all, the error I’m currently getting looks something like this:

Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: ["null","string"]
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Error; error code: 50005
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:295)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:365)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:508)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:499)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:472)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:213)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:275)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:251)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:103)
    at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:153)
    at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:86)
    at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$2(WorkerSourceTask.java:312)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:312)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:341)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:261)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:237)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)

Right now, I’ve modified the connect-standalone.properties file to set both the key and value converters to io.confluent.connect.avro.AvroConverter, with the schema registry URL pointing at our internal Schema Registry. The first area we think could be causing issues is the schema registry URL endpoint: does it have to be reachable from the Connect host, via a curl for instance?
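For reference, the converter settings in our connect-standalone.properties currently look roughly like this (the registry hostname below is a placeholder, not our real URL):

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry.internal:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry.internal:8081

And if the registry does need to be reachable from the Connect worker, I assume a quick check like the following, run from that host, should return the list of registered subjects:

curl http://schema-registry.internal:8081/subjects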

Additionally, it seems as if the values we set in the connector properties file (the one describing the data we expect Voluble to generate) are not being converted. That file looks something like this:

name=AVROTEST
connector.class=io.mdrogalis.voluble.VolubleSourceConnector
tasks.max=20
genkp.LegacyGpsV6.with=#{Internet.uuid}
genv.LegacyGpsV6.latitude.with=0

From my understanding, we do not have to send the config in as a JSON file; in standalone mode it seems we just pass these values in a properties file. Before switching to the AvroConverter, this properties file was publishing messages to Kafka just fine. After switching, it no longer works. Is our inability to use the AvroConverter a result of incorrect formatting on our part, or is there something else we are missing?
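For comparison, my understanding is that the properties file only applies in standalone mode; if we were running Connect in distributed mode, we’d post the equivalent config as JSON to the Connect REST API, along these lines (the Connect hostname here is a placeholder):

curl -X POST -H "Content-Type: application/json" http://connect.internal:8083/connectors -d '{
  "name": "AVROTEST",
  "config": {
    "connector.class": "io.mdrogalis.voluble.VolubleSourceConnector",
    "tasks.max": "20",
    "genkp.LegacyGpsV6.with": "#{Internet.uuid}",
    "genv.LegacyGpsV6.latitude.with": "0"
  }
}'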

Any advice is appreciated!