I am running a Kafka JDBC sink connector with the following properties:
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "table.name.format": "events",
  "connection.password": "******",
  "tasks.max": "1",
  "topics": "events",
  "value.converter.schema.registry.url": "http://IP:PORT",
  "db.buffer.size": "8000000",
  "connection.user": "postgres",
  "name": "cp-sink-events",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "connection.url": "jdbc:postgresql://IP:PORT/postgres?stringtype=unspecified",
  "insert.mode": "upsert",
  "pk.mode": "record_value",
  "pk.fields": "source,timestamp,event,event_type,value"
}
It was working fine before, but since this week I have been getting the following errors while trying to sink my data to Postgres:
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro value schema for id 4
Caused by: java.net.SocketTimeoutException: connect timed out
It appears my Kafka Connect worker can no longer access my Schema Registry server. I couldn't figure out why or how. I have tried multiple things but have yet to find the solution. I did install NGINX on this VM last week, and killed apache2, which was running on port 80, but I haven't found any indication that this would cause problems.
When I curl the Schema Registry address from the VM to retrieve the schema for the mentioned ID, it works fine (http://IP:PORT/schemas/ids/4). Any clue how to proceed?
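For reference, this is roughly the check I ran, plus the extra checks I can think of. IP and PORT are placeholders for my actual Schema Registry address, and the container name below is hypothetical (only relevant if Connect runs in Docker rather than directly on the VM):

```shell
# Works fine when run directly on the VM:
curl -s http://IP:PORT/schemas/ids/4

# If Kafka Connect runs in a container, the same request from inside it
# may resolve or route differently than from the host ("connect" is a
# hypothetical container name):
docker exec connect curl -s http://IP:PORT/schemas/ids/4

# Check what is actually listening on the registry port now, after the
# NGINX install and apache2 removal:
sudo ss -ltnp | grep PORT
```

If the curl from inside the Connect environment times out while the one from the VM succeeds, that would point at a networking or proxy change introduced alongside NGINX rather than at the connector config itself.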