I am new to Kafka and am currently going through training on Schema Registry.
I am at step #12 of "Hands On: Integrate Schema Registry with Clients". When I ran the commands "gradlew runProducer" and "gradlew runConsumer" on my Windows laptop, I got the following error messages. Any help is greatly appreciated.
Task :runProducer FAILED
Producer now configured for using SchemaRegistry
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Invalid value StringSerializer.class for configuration key.serializer: Class StringSerializer.class could not be found.
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:744)
at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:490)
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:483)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:113)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:133)
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:513)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:292)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:275)
at io.confluent.developer.ProducerApp.producePurchaseEvents(ProducerApp.java:50)
at io.confluent.developer.ProducerApp.main(ProducerApp.java:91)
FAILURE: Build failed with an exception.
What went wrong:
Execution failed for task ':runProducer'.
Process 'command 'java'' finished with non-zero exit value 1
Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
Task :runConsumer FAILED
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Invalid value StringDeserializer.class for configuration key.deserializer: Class StringDeserializer.class could not be found.
Task :runConsumer
Exception in thread "main" org.apache.kafka.common.errors.RecordDeserializationException: Error deserializing key/value for partition proto-purchase-4 at offset 0. If needed, please seek past the record to continue consumption.
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1448)
at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:135)
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1671)
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1900(Fetcher.java:1507)
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:733)
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:684)
at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1277)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1238)
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1211)
at io.confluent.developer.ConsumerApp.consumePurchaseEvents(ConsumerApp.java:43)
at io.confluent.developer.ConsumerApp.main(ConsumerApp.java:69)
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro unknown schema for id 100002
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.schemaFromRegistry(AbstractKafkaAvroDeserializer.java:333)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:113)
at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55)
at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60)
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1439)
… 10 more
Caused by: java.net.MalformedURLException: no protocol: <https://psrc-68gz8.us-east-2.aws.confluent.cloud/schemas/ids/100002?fetchMaxId=false&subject=
at java.base/java.net.URL.<init>(URL.java:645)
at java.base/java.net.URL.<init>(URL.java:541)
at java.base/java.net.URL.<init>(URL.java:488)
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:262)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:367)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:836)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:809)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:277)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaBySubjectAndId(CachedSchemaRegistryClient.java:409)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer$DeserializationContext.schemaFromRegistry(AbstractKafkaAvroDeserializer.java:330)
… 14 more
Task :runConsumer FAILED
FAILURE: Build failed with an exception.
What went wrong:
Execution failed for task ':runConsumer'.
Process 'command 'java'' finished with non-zero exit value 1
Are you using Windows Subsystem for Linux (WSL)? On a Windows laptop, you should run Apache Kafka either under WSL or in Docker Linux containers.
If you are already using WSL or Docker: I have not been able to reproduce your error when following the exercise steps. If you can provide additional details, perhaps they will help me resolve your issue.
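One thing worth checking, based on the first ConfigException: Kafka resolves class-typed config values (like key.serializer) via Class.forName, so the value must be either the Class object itself (e.g. StringSerializer.class in Java code) or a fully qualified class name as a string. A message like "Class StringSerializer.class could not be found" is what you would see if the literal text "StringSerializer.class" reached the config as a string. Here is a minimal JDK-only sketch of that mechanism (no Kafka classes involved; java.lang.String stands in for the serializer class):

```java
public class ClassConfigSketch {
    public static void main(String[] args) {
        // A fully qualified class name resolves, which is what Kafka's
        // ConfigDef expects for a class-typed configuration value.
        try {
            Class<?> ok = Class.forName("java.lang.String");
            System.out.println("resolved: " + ok.getName());
        } catch (ClassNotFoundException e) {
            System.out.println("unexpected failure: " + e.getMessage());
        }

        // The literal text "StringSerializer.class" is not a class name,
        // so resolution fails -- mirroring "Class StringSerializer.class
        // could not be found" in the ConfigException above.
        try {
            Class.forName("StringSerializer.class");
            System.out.println("unexpectedly resolved");
        } catch (ClassNotFoundException e) {
            System.out.println("not found: StringSerializer.class");
        }
    }
}
```

Separately, the later MalformedURLException ("no protocol: <https://…") suggests the schema.registry.url value in your properties contains a literal '<' character, e.g. from copying the URL together with surrounding angle brackets. If that is the case, removing the brackets should let the URL parse.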
Thanks for the reply. As mentioned, I am at step 13, "Run command ./gradlew runConsumer", of the course Schema Registry 101 - Hands On: Integrate Schema Registry with Clients.
I followed the instructions to update the file "ConsumerApp.java" as shown below:
consumerConfigs.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerConfigs.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaProtobufDeserializer.class);
consumerConfigs.put(KafkaProtobufDeserializerConfig.SPECIFIC_PROTOBUF_VALUE_TYPE, Purchase.class);
I am getting errors, as shown in the attached image.
I have been stuck on this error since yesterday. Again, any help is greatly appreciated.