S3 sink connector not pushing data to S3

I have set up a multi-node Apache Kafka cluster on three EC2 instances, added the Confluent S3 sink plugin, and applied the connector config below. The connector reports as running, but when I publish data to the topic, nothing shows up in S3; the bucket stays empty.

Is there anything I need to check for this?

My connector config:

{
  "name": "s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "s3_topic",
    "s3.region": "us-west-2",
    "s3.bucket.name": "confluent-kafka-connect-s3-testing",
    "s3.part.size": "5242880",
    "flush.size": "3",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.avro.AvroFormat",
    "schema.generator.class": "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
    "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
    "schema.compatibility": "NONE",
    "name": "s3-sink"
  }
}
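When a sink connector accepts its config but writes nothing, a first check is whether the connector and its task are actually in RUNNING state rather than FAILED. A minimal sketch using the Kafka Connect REST API, assuming the worker is reachable on the default port 8083 on localhost (adjust the host to your worker):

```shell
# Query the status of the connector and its tasks via the Connect REST API.
# "s3-sink" is the connector name from the config above; the host/port are assumptions.
curl -s http://localhost:8083/connectors/s3-sink/status

# If a task shows "state": "FAILED", its "trace" field contains the stack trace
# explaining why no data is being written.
```

A task can fail after the connector itself is accepted, so "the connector is running" in the UI or on creation does not guarantee the task writing to S3 is healthy.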

The data I am publishing looks like this:

{"f1": "value1"}
{"f1": "value2"}
{"f1": "value3"}
{"f1": "value4"}
{"f1": "value5"}
{"f1": "value6"}
{"f1": "value7"}
{"f1": "value8"}
{"f1": "value9"}
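Since the config uses AvroFormat, the records need to be produced as Avro with a schema for the sink to decode them. A sketch of producing the records above with the Avro console producer, assuming a local broker and Schema Registry on their default ports (the record/field names match the sample data, but the exact hosts are assumptions):

```shell
# Produce Avro records matching the {"f1": "..."} sample data.
# Broker and Schema Registry addresses below are assumed defaults.
kafka-avro-console-producer --broker-list localhost:9092 --topic s3_topic \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
# Then type each JSON record on its own line, e.g. {"f1": "value1"}
```

If the records were instead produced as plain JSON with the regular console producer, the Avro converter on the sink side will fail to deserialize them and the task will error out rather than write files.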

I even tried the JSON converter without any schema, still no luck.

I am seeing the error below in the connector logs:

Cancelled in-flight METADATA request with correlation id 439 due to node -2 being disconnected

That error means your Connect worker is unable to connect to the Kafka broker; negative node IDs refer to bootstrap servers, so node -2 is the second entry in your bootstrap.servers list. It is unrelated to the connector config.
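A quick way to confirm this is to test TCP reachability from the Connect worker host to each broker. A minimal sketch; the broker1/broker2/broker3 host names are placeholders for whatever is in your worker's bootstrap.servers:

```shell
# Check TCP reachability from this host to each broker in bootstrap.servers.
# Replace the host:port pairs below with your actual broker addresses (placeholders here).
for hp in broker1:9092 broker2:9092 broker3:9092; do
  host=${hp%:*}
  port=${hp#*:}
  # /dev/tcp is a bash built-in redirection; timeout caps the attempt at 2 seconds.
  if timeout 2 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "$hp reachable"
  else
    echo "$hp UNREACHABLE"
  fi
done
```

On EC2, an unreachable broker usually comes down to security group rules, or brokers advertising a listener address (advertised.listeners) that the worker host cannot resolve or route to.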
