How to "apply" a schema to an existing Kafka topic (raw JSON string data)?

I have a Kafka topic containing JSON data serialized as plain strings, like the example below:
"""
{
  "request_host": "xx.xx.xx.xx",
  "request_uri": "/admin/customer_groups.json?page=1&limit=200&modified_on_min=2025-06-20T09:00:00.000Z",
  "upstream_response_time": 0.008,
  "upstream_addr": "xx.xx.xx.xx",
  "forwarded_ip": "-",
  "@timestamp": "2025-06-20T09:47:09.000Z",
  "request_referer": "https://xx.xx.xx.xx/offline-pos-v2/web-worker.js?v=202105311042",
  "response_body_size": 100,
  "request_useragent": "xx.xx.xx.xx",
  "response_status": 200,
  "@version": "1",
  "request_method": "GET",
  "client_ip": "xx.xx.xx.xx",
  "sapo_client": "-",
  "request_time": 0.011
}
"""
How can I register a schema for that topic, or create a new topic with a schema derived from it, so that I can query the data as a table in KSQL?
Please note that the original topic contains raw binary strings (valid JSON, but with no schema attached), and it's managed by another dev. I can only consume data from it; I can't change anything at the source.
As far as I understand, to have a schema I need to convert the data to Avro or a similar format. How do I do that?
I'm thinking about using some Kafka source connector to read data from the original topic, apply a schema, convert to Avro, then write to a destination topic with the schema.
The problem is that there aren't any connectors with Kafka as a source; my only choices are Replicator or MirrorMaker. Replicator is not open source, while MirrorMaker only supports copying data 1:1.
Any advice?

I would write a service, perhaps a stream-processing job, that reads the original topic and writes the data into an Avro or JSON Schema output topic, registering the schema with Schema Registry as it produces. You could use whatever language or framework you're comfortable with to do that.
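One wrinkle such a service has to handle: Avro field names must match `[A-Za-z_][A-Za-z0-9_]*`, so the `@timestamp` and `@version` keys in your sample cannot be used as-is in an Avro schema and need renaming. A minimal sketch of the transformation step, using only the standard library (the function name, the rename map, and the topic name in the comments are hypothetical, taken from the sample message in the question):

```python
import json

# Avro field names may not contain "@", so the two Logstash-style keys in the
# sample payload have to be renamed in the target schema. Hypothetical mapping.
FIELD_RENAMES = {"@timestamp": "timestamp", "@version": "version"}


def to_record(raw: bytes) -> dict:
    """Parse one raw JSON message and reshape it to match the target Avro schema.

    Hypothetical helper: adjust the renames and any type coercion to your
    real payload before using it.
    """
    doc = json.loads(raw.decode("utf-8"))
    return {FIELD_RENAMES.get(key, key): value for key, value in doc.items()}


# In the real service you would wrap this in a consume/produce loop, e.g. with
# the confluent-kafka Python client, where an AvroSerializer pointed at your
# Schema Registry serializes the dict and registers the schema on first use.
# Sketch only (requires a broker and registry, so not runnable here):
#
#   value = avro_serializer(to_record(msg.value()), ctx)
#   producer.produce(topic="access-logs-avro", value=value)
```

Once the service is producing Avro with the schema in Schema Registry, `CREATE STREAM ... WITH (VALUE_FORMAT='AVRO')` in KSQL will pick the columns up automatically.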