How do I ingest a CSV (with or without a schema) into Postgres?

Hello,

I am using kafka-connect-spooldir and I'm having trouble configuring it. I want to send the connector output to a Postgres database. I have used `org.apache.kafka.connect.storage.StringConverter`. Now I don't know how to define the input file path, failed path, etc., or how to use that third file. After installing the Spool Dir connector I got a `CSVExample.properties` file, which I'm guessing I can make use of.
Also, I have copied all the JARs to `/usr/local/share/kafka/plugins`.

I'm assuming I can use them like this:
`bin/connect-standalone connect-standalone.properties CSVExample.properties`
Is this the right way to use it?

Attaching my config files.


(attachment: screenshot of `CSVExample.properties`)

Adding the second attachment here.

Hi @Antrikshhii17, welcome to the forum! :slight_smile:

If you want to stream the data to a database, it’ll need a schema added to it at some point.
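To make that concrete, here's a minimal sketch of the two property files involved, assuming the jcustenborder Spool Dir CSV source connector and the Confluent JDBC sink connector. The file names, paths, topic name, and database credentials below are all placeholders to adapt, not tested values:

```properties
# csv-source.properties (hypothetical name) -- Spool Dir CSV source
name=csv-spooldir-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
topic=csv-data
# Directory the connector watches for new CSV files
input.path=/data/input
# Files are moved to one of these after successful or failed processing
finished.path=/data/finished
error.path=/data/error
input.file.pattern=.*\.csv
csv.first.row.as.header=true
# Infer a schema from the CSV header row instead of hand-writing one
schema.generation.enabled=true
```

```properties
# postgres-sink.properties (hypothetical name) -- JDBC sink into Postgres
name=postgres-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=csv-data
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=postgres
connection.password=postgres
# Create the target table from the record schema if it doesn't exist
auto.create=true
insert.mode=insert
```

Note that the JDBC sink needs records that carry a schema, so in `connect-standalone.properties` you'd want a schema-aware converter (e.g. `org.apache.kafka.connect.json.JsonConverter` with `value.converter.schemas.enable=true`) rather than `StringConverter`. You can then run both connectors in one standalone worker:
`bin/connect-standalone connect-standalone.properties csv-source.properties postgres-sink.properties`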

Here are a few references to get you going:

