Hello, I'm pretty new to Confluent, but I do have an Elasticsearch connector set up that works great. I'm looking to create my own schema for the JSON that is being pushed to Kafka.
My first question: can my devs set a schema before they push the data to Kafka?
If this isn't an option, is there a good guide for setting up a schema or mappings?
You may have already figured this out, but yes, you can set a schema before sending data to Kafka. Using the Confluent Schema Registry you can create schemas in JSON Schema, Avro, or Protobuf.
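As a minimal sketch of that registration step: the Schema Registry REST API lets you register a JSON Schema for a subject before any data is produced, so the serializer can validate records against it. The topic name `orders` and the field names below are hypothetical placeholders; the Schema Registry URL assumes a default local install on port 8081.

```python
import json

# Hypothetical topic; with the default subject naming strategy,
# the value schema for a topic is registered under "<topic>-value".
topic = "orders"
subject = f"{topic}-value"

# A JSON Schema describing the records the devs will produce.
order_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Order",
    "type": "object",
    "properties": {
        "id": {"type": "string"},
        "amount": {"type": "number"},
    },
    "required": ["id", "amount"],
    "additionalProperties": False,
}

# The Schema Registry REST API takes the schema as an escaped string
# inside a JSON body, with "schemaType": "JSON" (the default is Avro).
payload = json.dumps({"schemaType": "JSON", "schema": json.dumps(order_schema)})

# POST the payload to the subject's versions endpoint, e.g.:
#   curl -X POST \
#     -H "Content-Type: application/vnd.schemaregistry.v1+json" \
#     --data "$PAYLOAD" \
#     http://localhost:8081/subjects/orders-value/versions
print(subject)
print(payload)
```

Once the schema is registered, producers using the JSON Schema serializer with the same subject will have their records validated against it, and incompatible schema changes will be rejected according to the subject's compatibility setting.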
Hi @daveklein, I have a similar question around this. I am trying to send some JSON records to a Confluent Kafka topic and I want my schema to get registered. I am using kafka-json-schema-serializer, but the JSON record field names are not getting registered properly.