However, I want to build the schema programmatically using something like your org.apache.kafka.connect.data.SchemaBuilder, but I can’t figure out how to then get the JSON string from the SchemaBuilder/Schema objects to pass into the connector config as shown above.
Any idea how I can do that, or what’s the best way to programmatically build a schema and then get the required Kafka Connect JSON string to pass to the spooldir connector as the value.schema property?
You can POST connector configs to the Connect REST API (or PUT to update an existing connector), so you could use an HTTP client in the application that is creating the schema and update the connector config with it. You can find more info on the REST API here: Connect REST Interface | Confluent Documentation
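For example, something along these lines from Java — the host/port, connector name, topic, and config values below are just placeholders for whatever you’re actually using, and the other required spooldir settings (input/finished/error paths, file pattern, etc.) are omitted for brevity:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ConnectorConfigClient {

    public static void main(String[] args) throws Exception {
        // Placeholder values: adjust the Connect URL and connector name for your environment.
        String connectUrl = "http://localhost:8083";
        String connectorName = "csv-spooldir-source";

        // The JSON schema string you build elsewhere would be embedded in the config here.
        String configJson = """
            {
              "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
              "topic": "csv-data",
              "value.schema": "<schema JSON string goes here>"
            }
            """;

        // PUT /connectors/{name}/config creates the connector if it doesn't exist,
        // or updates its configuration if it does.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(connectUrl + "/connectors/" + connectorName + "/config"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(configJson))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```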
Thanks for getting back to me. Yes, I’m already using the Connect REST API to configure my connectors; the question is how do I create the schema string that I add to the config when creating the spooldir connector?
I don’t want to create that string by hand. I was hoping to use a Java object like your SchemaBuilder to build my CSV schema programmatically, get the ‘string’ version of it, add that to the config, and then use the REST API to create the connector. However, SchemaBuilder.toString() doesn’t produce the desired output.
Is there something that I can use to generate that JSON string from a Schema object?
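For reference, this is the kind of thing I mean by building the schema programmatically — the column names here (id, name, amount) are just examples standing in for my real CSV columns:

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;

public class CsvSchemaExample {

    // Hypothetical CSV columns; the real schema would mirror the file's actual columns.
    static Schema buildCsvSchema() {
        return SchemaBuilder.struct()
                .name("com.example.CsvRecord")                  // placeholder schema name
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .field("amount", Schema.OPTIONAL_FLOAT64_SCHEMA)
                .build();
    }
}
```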
I haven’t had time to try this out, but you could try using the JsonConverter’s .asJsonSchema method to convert the schema provided by the SchemaBuilder to a JSON ObjectNode and then get the string from that. Not sure it would provide a valid JSON string, but it’s worth a try.
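Roughly something like this — untested, the schema fields are just placeholders, and asJsonSchema needs the converter to be configured first. Whether the resulting string is exactly what the spooldir connector expects for value.schema is something you’d have to verify:

```java
import java.util.Map;

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.json.JsonConverter;

import com.fasterxml.jackson.databind.node.ObjectNode;

public class SchemaToJson {

    public static void main(String[] args) {
        // The same kind of programmatically built schema as above (field names are placeholders).
        Schema csvSchema = SchemaBuilder.struct()
                .field("id", Schema.INT64_SCHEMA)
                .field("name", Schema.STRING_SCHEMA)
                .build();

        // configure() must be called before using the converter; isKey=false since this is
        // meant for the value schema.
        JsonConverter converter = new JsonConverter();
        converter.configure(Map.of("schemas.enable", "true"), false);

        // Convert the Connect Schema to a Jackson ObjectNode, then serialize it to a string.
        ObjectNode schemaNode = converter.asJsonSchema(csvSchema);
        String schemaJson = schemaNode.toString();
        System.out.println(schemaJson);
    }
}
```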