Schema(Avro) config question

Hey there,

I need some support on this.

I'm running Connect in standalone mode with a mysql-sink connector and a worker.properties like this:

bootstrap.servers=some.url:9093
security.protocol=SSL
ssl.protocol=TLSv1.2
ssl.truststore.location=/opt/kafka/ssl/myTruststore1.jks
ssl.truststore.password=xxx
ssl.keystore.location=ssl/myKeystore.jks
ssl.keystore.password=xxx
ssl.key.password=xxx
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=https://schema-reg.some.url/subjects/versions/latest/schema
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=https://schema-reg.some.url/subjects/versions/latest/schema
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,

When I start the Kafka Connect worker with mysql-sink, it neither receives data nor writes anything into the DB:

[2021-12-23 11:30:12,694] INFO [mysql-sink|task-2] AvroConverterConfig values:
        auto.register.schemas = true
        basic.auth.credentials.source = URL
        basic.auth.user.info = [hidden]
        bearer.auth.credentials.source = STATIC_TOKEN
        bearer.auth.token = [hidden]
        context.name.strategy = class io.confluent.kafka.serializers.context.NullContextNameStrategy
        id.compatibility.strict = true
        key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
        latest.compatibility.strict = true
        max.schemas.per.subject = 1000
        normalize.schemas = false
        proxy.host =
        proxy.port = -1
        schema.reflection = false
        schema.registry.basic.auth.user.info = [hidden]
        schema.registry.ssl.cipher.suites = null
        schema.registry.ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
        schema.registry.ssl.endpoint.identification.algorithm = https
        schema.registry.ssl.engine.factory.class = null
        schema.registry.ssl.key.password = null
        schema.registry.ssl.keymanager.algorithm = SunX509
        schema.registry.ssl.keystore.certificate.chain = null
        schema.registry.ssl.keystore.key = null
        schema.registry.ssl.keystore.location = null
        schema.registry.ssl.keystore.password = null
        schema.registry.ssl.keystore.type = JKS
        schema.registry.ssl.protocol = TLSv1.3
        schema.registry.ssl.provider = null
        schema.registry.ssl.secure.random.implementation = null
        schema.registry.ssl.trustmanager.algorithm = PKIX
        schema.registry.ssl.truststore.certificates = null
        schema.registry.ssl.truststore.location = null
        schema.registry.ssl.truststore.password = null
        schema.registry.ssl.truststore.type = JKS
        schema.registry.url = [https://schema-reg.some.url/subjects/versions/latest]
        use.latest.version = false
        use.schema.id = -1
        value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
 (io.confluent.connect.avro.AvroConverterConfig:376)
[2021-12-23 11:30:12,695] ERROR [mysql-sink|task-2] Failed to start task mysql-sink-2 (org.apache.kafka.connect.runtime.Worker:554)
java.lang.NoClassDefFoundError: com/google/common/base/Ticker
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.<init>(CachedSchemaRegistryClient.java:170)
        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.<init>(CachedSchemaRegistryClient.java:153)
        at io.confluent.connect.avro.AvroConverter.configure(AvroConverter.java:73)
        at org.apache.kafka.connect.runtime.isolation.Plugins.newConverter(Plugins.java:277)
        at org.apache.kafka.connect.runtime.Worker.startTask(Worker.java:530)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.createConnectorTasks(StandaloneHerder.java:379)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.createConnectorTasks(StandaloneHerder.java:371)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.updateConnectorTasks(StandaloneHerder.java:405)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.lambda$null$2(StandaloneHerder.java:232)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:829)
[2021-12-23 11:30:12,696] INFO Created connector mysql-sink (org.apache.kafka.connect.cli.ConnectStandalone:109)

And if I test with the console consumer, I do receive data, but it looks like this:
4868fa8▒▒▒▒_9601FLHBK053A5T▒z+B▒▒▒▒▒▒
What could be wrong with my Avro settings here?

Happy Holidays to everyone

Best
Francois

java.lang.NoClassDefFoundError: com/google/common/base/Ticker
at io.confluent.kafka.schemaregistry.client

This means the connector is picking up an incompatible version of the Guava library, or Guava is missing from the classpath entirely. Try whether a different version of the Avro converter works; otherwise you'll have to manually replace the Guava JAR file on disk.
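One way to narrow this down is to look for Guava JARs under your plugin.path entries. A minimal, runnable sketch (the /tmp/plugin-demo directory and JAR names below are placeholders; in your setup you'd point find at /opt/connectors and the other plugin.path directories):

```shell
# Placeholder plugin directory with dummy JARs so the command is runnable;
# with a real install, run find against the plugin.path entries instead.
mkdir -p /tmp/plugin-demo
touch /tmp/plugin-demo/guava-30.1.1-jre.jar
touch /tmp/plugin-demo/kafka-connect-avro-converter.jar

# List Guava JARs: zero hits (Guava missing) or several different versions
# (classpath conflict) can both produce NoClassDefFoundError.
find /tmp/plugin-demo -name 'guava-*.jar'
```

If two different Guava versions show up for the same plugin, keeping only the one the converter was built against is usually the fix.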

For the console test, make sure you are using kafka-avro-console-consumer. The plain kafka-console-consumer dumps the raw Avro bytes, which is why your output looks garbled.
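Roughly like this (topic name, URLs, and the SSL properties file are placeholders for your setup; the schema-registry-aware consumer decodes the Avro payload instead of printing raw bytes):

```shell
# Hypothetical topic and URLs; substitute your own. --consumer.config points
# at a properties file carrying the same SSL settings as the worker.
kafka-avro-console-consumer \
  --bootstrap-server some.url:9093 \
  --topic my-topic \
  --from-beginning \
  --property schema.registry.url=https://schema-reg.some.url \
  --consumer.config client-ssl.properties
```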


Thanks, I got it working in the meantime, also thanks to your help on Stack Overflow.
Cheers
