Sparkplug Schema

Not sure if this is the correct topic to post this to as it involves streams, schema registry, and connect…

I have a topic that is currently configured with the default Avro schema that treats incoming messages as bytes. The compatibility check for the schema is set to “None”.

I’ve also configured an io.confluent.connect.mqtt.MqttSourceConnector that receives MQTT messages that follow the Sparkplug B protocol. With the AVRO bytes schema in place, I can see a steady stream of messages being written to the topic.
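For context, the connector config looks roughly like this (the connector name, broker URI, and topic names are placeholders; I'm using the ByteArrayConverter since the payload arrives as raw bytes):

```json
{
  "name": "sparkplug-mqtt-source",
  "config": {
    "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
    "mqtt.server.uri": "tcp://mqtt-broker:1883",
    "mqtt.topics": "spBv1.0/#",
    "kafka.topic": "sparkplug-raw",
    "value.converter": "org.apache.kafka.connect.converters.ByteArrayConverter"
  }
}
```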

The payload itself is an MQTT message that follows the Sparkplug B protocol, for which there is a .proto schema defined here. When I put this Protobuf schema in place on the topic, messages are no longer written to it. Note that I deleted and recreated the topic, so the first schema registered on it is the Protobuf one.

I’m not sure what’s going on. One clue, I think, is that when I have the Avro bytes schema in place, there are consistently a few extra bytes prepended to the payload. My current theory is that these are the magic byte, schema ID, and message indexes, which are described here in the Wire Format section.
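To check that theory, here is a small sketch (my own, not from any Confluent library) that splits off the framing the Wire Format section describes: byte 0 is the magic byte (0x00), bytes 1–4 are the schema ID as a big-endian 32-bit integer, and for Protobuf a varint list of message indexes follows before the message itself. This only parses the first five bytes, which is enough to see whether the leading bytes match:

```python
import struct

def parse_wire_format(payload: bytes):
    """Split a Confluent wire-format record into (schema_id, remainder).

    Layout per the Schema Registry Wire Format docs:
      byte 0     magic byte, always 0x00
      bytes 1-4  schema ID, big-endian 32-bit int
      then (Protobuf only) varint-encoded message indexes,
      then the serialized message. The indexes are left in `remainder`.
    """
    if len(payload) < 5 or payload[0] != 0x00:
        raise ValueError("not Confluent wire format (missing magic byte)")
    schema_id = struct.unpack(">I", payload[1:5])[0]
    return schema_id, payload[5:]

# Example with a fabricated record: schema ID 42 wrapping b"hello"
record = b"\x00" + struct.pack(">I", 42) + b"hello"
schema_id, body = parse_wire_format(record)
```

If the extra bytes on my messages start with 0x00 and decode to a schema ID that exists in the registry, that would confirm they are wire-format framing rather than part of the Sparkplug payload.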

I must be missing something. Does anyone have advice on how to make the schema registry work in this scenario?


there is a .proto schema defined

Can you clarify why you mention Avro, then? The schema registry supports Protobuf as well.
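If you want the connector to register and frame the records as Protobuf, you could point it at the ProtobufConverter instead of treating values as raw bytes. A sketch of the relevant connector properties (the registry URL is a placeholder for your environment):

```properties
value.converter=io.confluent.connect.protobuf.ProtobufConverter
value.converter.schema.registry.url=http://schema-registry:8081
```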