Apache Kafka Connect with Amazon S3

Use case

  1. Using a Windows machine.

  2. I am connecting Apache Kafka Connect with Amazon S3.
    Kafka version: kafka_2.13-2.8.0
    Connector version: confluentinc-kafka-connect-s3-10.0.2
    The properties files are modified as shown below.

  3. While starting the connector:

connect-standalone.bat config\connect-standalone.properties config\quickstart-s3.properties

Error log:
[2021-09-15 15:33:28,056] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:126)
java.lang.NoClassDefFoundError: io/confluent/connect/storage/StorageSinkConnectorConfig

File: connect-standalone.properties

bootstrap.servers=localhost:9092

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=true
value.converter.schemas.enable=true

offset.storage.file.filename=/tmp/connect.offsets
#offset.storage.file.filename=E:/kafka_2.13-2.8.0/test.txt
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

plugin.path=E:/opt/connectors/kafka-connect-s3-4/lib

File: quickstart-s3.properties

name=s3-sink
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=1
s3.credentials.provider.class=com.amazonaws.auth.DefaultAWSCredentialsProviderChain
topics=s3_bucket
aws.access.key.id=<redacted>
aws.secret.access.key=<redacted>
s3.region=ap-south-1
s3.bucket.name=kafkabucket3
s3.part.size=5242880
flush.size=3

storage.class=io.confluent.connect.s3.storage.S3Storage
#format.class=io.confluent.connect.s3.format.avro.AvroFormat
format.class=io.confluent.connect.s3.format.json.JsonFormat
schema.generator.class=io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator
partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner

schema.compatibility=NONE

Please help.

How did you install the connector? This kind of error usually comes about when the connector has not been installed correctly. ref: https://www.youtube.com/watch?v=18gDPSOH3wU
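
For reference, the missing class (io.confluent.connect.storage.StorageSinkConnectorConfig) ships in the kafka-connect-storage-common jars that are bundled inside the connector zip, so the whole extracted folder needs to be visible to Connect. A minimal sketch of what a working layout typically looks like on Windows, assuming the zip from Confluent Hub was extracted under E:/opt/connectors (the directory names here are illustrative, not your actual layout):

E:/opt/connectors/
    confluentinc-kafka-connect-s3-10.0.2/
        lib/            <- connector jar plus all bundled dependencies, including kafka-connect-storage-common
        etc/
        manifest.json

# in connect-standalone.properties, pointing plugin.path at the parent directory
# lets Connect discover the plugin and every jar bundled with it
plugin.path=E:/opt/connectors

The standalone worker has to be restarted after changing plugin.path for it to be picked up.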

Also, you should mask your AWS keys - I’ve removed them from your post. 🙂

Thanks a lot, that issue got resolved. I downloaded the connector from the Confluent site (Amazon S3 Sink), extracted the zip file, and pointed plugin.path in connect-standalone.properties to the connector's lib directory.

I am passing input from the producer as plain JSON.

Now I am facing another issue, kindly suggest:

INFO Errant record reporter not configured. (io.confluent.connect.s3.S3SinkTask:123)
Error 2
INFO [Consumer clientId=connector-consumer-s3-sink-0, groupId=connect-s3-sink] Found no committed offset for partition kafkabucket3-2 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1354)

That’s an INFO not an error message – is there a particular failure that you’re seeing?

No, I am not seeing any failure in the logs apart from this. But when I check the Amazon console, nothing is written inside my S3 bucket (no objects, it is empty).
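
Two things worth double-checking (these are assumptions based on the configs you posted, not something visible in your logs): with flush.size=3 the connector only writes an object to S3 after it has buffered three records for a given topic partition, and with value.converter.schemas.enable=true the JsonConverter expects every record to be wrapped in a schema/payload envelope rather than plain JSON. A quick test is to send a few enveloped records from the console producer (the "name" field is just an example):

kafka-console-producer.bat --bootstrap-server localhost:9092 --topic s3_bucket

{"schema":{"type":"struct","fields":[{"type":"string","field":"name","optional":false}],"optional":false},"payload":{"name":"test1"}}
{"schema":{"type":"struct","fields":[{"type":"string","field":"name","optional":false}],"optional":false},"payload":{"name":"test2"}}
{"schema":{"type":"struct","fields":[{"type":"string","field":"name","optional":false}],"optional":false},"payload":{"name":"test3"}}

Alternatively, if you want to keep sending plain JSON, setting key.converter.schemas.enable=false and value.converter.schemas.enable=false in connect-standalone.properties (and restarting the worker) removes the envelope requirement.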

A post was split to a new topic: Kafka Connect deserialization error
