Hello,
I’m using a Postgres sink connector to send data from a topic to my Timescale database.
When I send messages from my IoT device through MQTT, the message arrives at my topic and goes through the Postgres sink connector, but it never reaches the database. This is the status of my connector when I send data: all the messages get stuck in the DLQ.
Thank you in advance.
hey @Oussgasmi
could you share your config?
what does confluent connect list say?
did you check status via api?
Yes, I double-checked and it says that my Postgres sink connector status is RUNNING and its type is sink.
I just found the error message that I get on the records in the connector’s DLQ topic:
"Converting byte[] to Kafka Connect data failed due to serialization error of topic temperature:"
That’s the error message I get.
ok and what does your config look like?
This is my connector config:
{
  "name": "PostgresSinkConnector_0",
  "config": {
    "topics": "temperature",
    "input.data.format": "JSON_SR",
    "connector.class": "PostgresSink",
    "name": "PostgresSinkConnector_0",
    "kafka.auth.mode": "KAFKA_API_KEY",
    "connection.host": "",
    "connection.port": "",
    "connection.user": "",
    "db.name": "",
    "ssl.mode": "require",
    "insert.mode": "INSERT",
    "table.name.format": "",
    "table.types": "",
    "db.timezone": "Europe/Paris",
    "auto.create": "true",
    "auto.evolve": "true",
    "quote.sql.identifiers": "ALWAYS",
    "tasks.max": "1"
  }
}
And this is my topic schema:
{
  "$id": "http://example.com/myURI.schema.json",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "additionalProperties": false,
  "description": "Sample schema to help you get started.",
  "properties": {
    "etat_cabine": {
      "description": "The number type is used for any numeric type, either integers or floating point numbers.",
      "type": "number"
    },
    "sitename": {
      "description": "The integer type is used for integral numbers.",
      "type": "string"
    },
    "temperature_prise_d_air": {
      "description": "The string type is used for strings of text.",
      "type": "number"
    }
  },
  "title": "Cabine",
  "type": "object"
}
ok I see
any errors on Postgres side?
best,
michael
No, because the data never reaches the Postgres database. It always stops at the DLQ.
ok I see
is a test data set available?
could try it locally
or if you have a local dev env available I would recommend to try it there to see the error messages
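locally you can also poll the worker’s status endpoint directly and see any task stack trace; a quick sketch (standard Kafka Connect REST API, stdlib only; the connector name and port 8083 are assumptions matching a default dev setup):

import json
import urllib.request

# ask the local Connect worker for the connector and task states
# (connector name and port are assumptions; adjust to your dev setup)
url = "http://localhost:8083/connectors/PostgresSinkConnector_0/status"
with urllib.request.urlopen(url) as resp:
    status = json.load(resp)

print(status["connector"]["state"])  # e.g. RUNNING or FAILED
for task in status["tasks"]:
    # a FAILED task carries its full stack trace in the "trace" field
    print(task["id"], task["state"], task.get("trace", ""))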
This is the error that is keeping the data from reaching the database:
"Converting byte[] to Kafka Connect data failed due to serialization error of topic temperature:"
Maybe it’s related to the JSON schema that I created for the topic.
ok I see
might be the case.
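one thing worth double-checking: with input.data.format set to JSON_SR the connector expects the value bytes to be Schema-Registry-framed JSON (a magic byte and the schema id in front of the payload), not a plain JSON string. if the mqtt proxy writes the raw JSON straight to the topic, the converter fails with exactly that kind of serialization error. for comparison, a producer that writes what the connector expects would look roughly like this (just a sketch: confluent-kafka with the schema-registry extras assumed, all endpoints and keys are placeholders):

from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.json_schema import JSONSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

# the JSON Schema registered for the topic (the one pasted above)
schema_str = open("cabine.schema.json").read()

sr = SchemaRegistryClient({
    "url": "https://<schema-registry-endpoint>",
    "basic.auth.user.info": "<sr-key>:<sr-secret>",
})
serializer = JSONSerializer(schema_str, sr)

producer = Producer({
    "bootstrap.servers": "<bootstrap-endpoint>",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

value = {
    "etat_cabine": 65678,
    "sitename": "AF123456-Test",
    "temperature_prise_d_air": -645,
}

# JSONSerializer validates against the schema and prepends the
# Schema Registry framing that the JSON_SR converter looks for
producer.produce(
    "temperature",
    value=serializer(value, SerializationContext("temperature", MessageField.VALUE)),
)
producer.flush()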
any test data available for testing?
I don’t have a test data set, but I have a Python script that sends JSON strings over MQTT, through the MQTT proxy, to the cluster in the cloud. From the Python script to the Kafka cluster on Confluent Cloud everything works fine, and I receive the message in the cloud. But when I try to send the message from the cluster to my TimescaleDB database through the Postgres sink connector, it never arrives at the database; it gets stuck in the DLQ.
ok
I’m asking for test data because I’d like to reproduce it locally
will check my local env, created a similar setup some time ago
best,
michael
Oh thank you for this effort.
What I send to the Kafka cluster through the MQTT proxy is a JSON string. I send it in a loop with a Python script or using mosquitto.
Our real data will have the same form as this test data: our routers will send a JSON string to the cluster every 2 minutes, and this is what the string looks like:
{
  "SiteName": "AF123456-Test",
  "timestamp": 12654585,
  "Etat_cabine": 65678,
  "temperature_prise_d_air": -645,
  "temperature_cabine": 4096
}
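The sending loop of the script is roughly this (a simplified sketch: the real host, port, credentials and TLS setup are left out, and paho-mqtt is assumed):

import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
# credentials and TLS for the cloud MQTT proxy go here (omitted)
client.connect("<mqtt-proxy-host>", 8883)
client.loop_start()

while True:
    payload = {
        "SiteName": "AF123456-Test",
        "timestamp": int(time.time()),
        "Etat_cabine": 65678,
        "temperature_prise_d_air": -645,
        "temperature_cabine": 4096,
    }
    # the proxy forwards this payload to the Kafka topic as raw bytes
    client.publish("temperature", json.dumps(payload))
    time.sleep(120)  # one message every 2 minutes, like the real routers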
Any updates on your tests?
Thank you.
not yet, didn’t get to it
will check later on
best,
michael
Okay, thank you for your time.
no worries
did you already check what’s in the dlq?
it should provide an error message as well, maybe a more detailed one?
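if the error context headers are enabled on the dlq, the exception class, message and stack trace are attached to every record as __connect.errors.* headers; a quick sketch to dump them (confluent-kafka assumed, cluster credentials and the dlq topic name are placeholders to fill in):

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<bootstrap-endpoint>",
    "group.id": "dlq-inspector",
    "auto.offset.reset": "earliest",
    # sasl settings for the cloud cluster go here (omitted)
})
consumer.subscribe(["<dlq-topic-name>"])

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    # the __connect.errors.* headers carry the full failure context
    for key, value in (msg.headers() or []):
        print(key, value)
    print(msg.value())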
best,
michael
Yes, and this is the error message that I get in the DLQ:
"Converting byte[] to Kafka Connect data failed due to serialization error of topic temperature:"
ok I see
at least I’m getting the same error locally
doing some further reading to check
Oh great! And did you manage to solve it?