Can you produce Avro data with Kafkacat?

I can see from the docs how to consume records in Avro format with kafkacat, but I don’t see any info on how to produce them. I love using kafkacat to quickly throw some test records onto a topic, but I can’t figure out how to do so with Avro. :frowning:

Don’t think this is supported yet with kafkacat: https://github.com/edenhill/kafkacat/issues/226


While kafkacat doesn’t support publishing Avro, you can roll your own.

You can leverage Avro’s own tools jar (avro-tools) to take JSON and write it to Avro binary, and then publish it with the magic byte and Schema Registry ID prepended yourself.

The script here assumes jq is installed, to quickly obtain the Schema Registry ID. It also assumes you have Java installed and the avro-tools jar downloaded.

There are also hard-coded assumptions in this example, but I hope it gives you ideas if you want to publish Avro. Personally, I like that it helps build an understanding of how Avro is marshaled (i.e. what the Confluent serializer does in addition to the Avro serialization itself).

I also used Michael Noll’s 2013 blog post, Reading and Writing Avro Files from the Command Line, for the content of this example and for how to run the tools command. It is a great article and worth the read if you are using Avro.
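As an aside: the sed/tr escaping of the schema in the script below can also be delegated to jq (which the script already assumes is installed). A minimal sketch, using a made-up one-field schema and a temp-file path of my choosing:

```shell
# Write a tiny illustrative Avro schema to a temp file (made up for this demo)
cat > /tmp/demo.avsc <<'EOF'
{"type":"record","name":"Tweet","fields":[{"name":"text","type":"string"}]}
EOF

# jq -R reads raw text, -s slurps it into a single string;
# jq then handles all the JSON string escaping for us
SR=$(jq -Rs '{schema: .}' < /tmp/demo.avsc)
echo "$SR"
```

This produces the same `{"schema": "..."}` body the script builds by hand, with the quoting guaranteed correct by jq.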

#!/bin/sh

TOPIC=foo2
AVRO_SCHEMA_FILE=twitter.avsc
PAYLOAD=twitter.json

#
# Take the Avro Schema and convert it into the format that SR accepts
#
SCHEMA=$(cat ${AVRO_SCHEMA_FILE} | sed 's/"/\\\"/g' | tr -d '\n')

SR=$( cat <<EOF
{
  "schema": "${SCHEMA}"
}
EOF
)


#
# Publish Schema with Schema Registry
#
ID=$(curl -s -X POST -H "Content-Type: application/json" --data "$SR" http://localhost:8081/subjects/${TOPIC}-value/versions | jq .id)

echo "Schema Registered As : $ID"

rm -f data_file

# Magic Byte 0x00
printf '\x00' > data_file

# Schema Registry ID as a 32-bit big-endian word
printf "0: %.8x" "$ID" | xxd -r -g0 >> data_file

# Convert the JSON payload into Avro using Avro's own tooling; append only the Avro binary data to the file
java -jar ./avro-tools-1.10.0.jar jsontofrag --schema-file ${AVRO_SCHEMA_FILE} ${PAYLOAD} >> data_file

# Publish the file using kafkacat; adding /dev/null as a second file ensures data_file is treated as a single message
kafkacat -P -e -b localhost:19092 -t ${TOPIC} data_file /dev/null
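To see what the magic-byte and ID steps actually write, here is just that 5-byte header rebuilt in isolation, using an arbitrary example schema ID of 7 and a temp-file path of my choosing:

```shell
# Illustrative schema ID (arbitrary; in the script it comes from Schema Registry)
ID=7

# Magic byte 0x00
printf '\x00' > /tmp/header_demo

# Schema Registry ID as a 32-bit big-endian word
printf "0: %.8x" "$ID" | xxd -r -g0 >> /tmp/header_demo

# Hex dump of the 5 header bytes: magic byte, then the ID big-endian
xxd -p /tmp/header_demo
```

The Avro binary payload then follows these five bytes, which is exactly the framing the Confluent Avro serializer produces.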

Wow, nicely done! Turns out you can teach an old dog (me) new kafkacat tricks :smiley:


Wow! Very cool! Thanks!

It looks complicated; it should be easy to make this simpler by building a CLI from https://crates.io/crates/schema_registry_converter. There is already something like that, but it only supports Protobuf: https://crates.io/crates/ksrt.

Wow, I wish I’d found this post earlier.
I’d been struggling with how to post a tombstone message with an Avro key, via command-line tools for a POC.

My first go-to was “kafka-avro-console-producer”, but that doesn’t support producing tombstones.
I then tried “kafkacat”, but that doesn’t support producing Avro.
Eventually I found success using Kafka’s REST Proxy. Posting a message to the proxy and specifying only a key_schema (or key_schema_id), and an array of records which only have a key (no value), succeeds in producing a tombstone with an Avro key.

curl -s -X POST -H "Content-Type: application/vnd.kafka.avro.v2+json" \
        -H "Accept: application/vnd.kafka.v2+json" \
        --data '{ "key_schema_id": '"$key_schema_id"', "records": [{"key": {"unit_number":"unit1", "bed_number":"A", "renewal": false, "start_date": 1610634044000}}]}' \
        "http://localhost:8082/topics/TEST"

Kudos for the creative solution with kafkacat! I’m sharing the above in case others stumble across this post while looking (as I was) for “how to produce an Avro-keyed tombstone via command line tools”.
