Note: I created a question for this on StackOverflow but, after not getting much response there, I thought it might be better to ask here: java - Creating test data from Confluent Control Center JSON representation - Stack Overflow
I am starting out writing Kafka Streams apps that manipulate records in Avro format.
As I don't really want to create a lot of complicated objects manually in code, I'd prefer to be able to pick a message out of Control Center, save it to a file and instantiate an object from it in my unit tests.
I'm using SpecificRecords in my tests, and obviously the Control Center output is Avro-flavoured JSON, so I'd like to combine that JSON with the schema file from Control Center to seed my input topic.
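For context, this is roughly how I intend to feed the resulting object into the topology under test. The topic name, the key type, the buildTopology() helper, the priceAssessment variable and the mock:// schema registry URL are all just placeholders for illustration:

import java.util.Map;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test-app");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

// Serde for the generated SpecificRecord class; the mock:// URL keeps the test away from a real Schema Registry
SpecificAvroSerde<price_assessment> valueSerde = new SpecificAvroSerde<>();
valueSerde.configure(Map.of("schema.registry.url", "mock://test"), false);

try (TopologyTestDriver testDriver = new TopologyTestDriver(buildTopology(), props)) {
    TestInputTopic<String, price_assessment> inputTopic = testDriver.createInputTopic(
            "price-assessments",               // placeholder topic name
            Serdes.String().serializer(),
            valueSerde.serializer());

    // priceAssessment is the object I want to build from the saved Control Center JSON
    inputTopic.pipeInput("some-key", priceAssessment);
}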
What I've tried so far for the JSON-to-SpecificRecord part is the following, which I attempted to adapt from a GenericRecord example:
var testAvroString = "{JSON copied from Control Center topic}";
Schema schema = price_assessment.getClassSchema();
DecoderFactory decoderFactory = new DecoderFactory();
Decoder decoder = null;
try {
    DatumReader<price_assessment> reader = new SpecificDatumReader<price_assessment>();
    decoder = decoderFactory.get().jsonDecoder(schema, testAvroString);
    return reader.read(null, decoder);
} catch (Exception e) {
    return null;
}
However, running this fails with Cannot invoke "org.apache.avro.Schema.equals(Object)" because "writer" is null at the reader.read(...) step.
I'm running Java 17 with Kafka Streams 3.1.0.
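My current guess is that the problem is the no-argument SpecificDatumReader constructor, which leaves the reader without a writer schema. Something along these lines (untested, and inside the same try/catch as above) might be closer to what's needed:

Schema schema = price_assessment.getClassSchema();
// give the reader both writer and reader schemas instead of using the no-arg constructor
DatumReader<price_assessment> reader = new SpecificDatumReader<>(schema, schema);
Decoder decoder = DecoderFactory.get().jsonDecoder(schema, testAvroString);
price_assessment result = reader.read(null, decoder);   // throws IOException

Is passing the schema explicitly like that the right approach, or is there a better way to turn Control Center JSON into SpecificRecord test data?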