Kafka Connect deserialization error

Hi

I am trying the same thing with Avro data, from a topic to an S3 bucket.

I am getting a data deserialization error.

Were you able to do this with JSON data?

Thanks
Rama
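
For background on the original error: a deserialization failure on an Avro topic usually means a converter mismatch, i.e. the records were produced with the Avro serializer but the connector is reading them with a different converter. A minimal sketch of the sink-side settings for Avro, where the Schema Registry address is an assumption:

# Sketch: Avro records need the AvroConverter plus a reachable Schema Registry.
format.class=io.confluent.connect.s3.format.avro.AvroFormat
value.converter=io.confluent.connect.avro.AvroConverter
# Assumed Schema Registry address:
value.converter.schema.registry.url=http://localhost:8081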

Hi there, could you please reply with your entire connector config? I suspect something is not quite right there. Thanks!

Hello Danicafine,

First of all, thanks for reaching out to help me resolve this issue.

I am trying to move streaming JSON data from a Kafka topic to an AWS S3 bucket.

Kafka server config file: server.properties

Kafka Connect config file: connect-distributed.properties

S3 sink connector config file: quickstart-s3.properties

Issue: the data is not landing in the AWS S3 bucket as .json objects (i.e. connect-test.json).

When I query the .bin file that does appear, I can see the topic data.

S3 sink config (quickstart-s3.properties):

name=s3-sink
connector.class=io.confluent.connect.s3.S3SinkConnector
tasks.max=1
topics=connect-test
s3.credentials.provider.class=com.amazonaws.auth.DefaultAWSCredentialsProviderChain
s3.region=us-east-2
s3.bucket.name=snowflakelab2
s3.part.size=5242880
flush.size=3
#s3.credentials.provider.class=AwsAssumeRoleCredentialsProvider
s3.credentials.provider.access_key_id=
s3.credentials.provider.secret_access_key=
sts.role.arn=
sts.role.session.name=mysnowflakerole
format.class=io.confluent.connect.s3.format.bytearray.ByteArrayFormat
value.converter=org.apache.kafka.connect.converters.ByteArrayConverter
storage.class=io.confluent.connect.s3.storage.S3Storage
#format.class=io.confluent.connect.s3.format.avro.AvroFormat
#value.converter.schema.registry.url = http://localhost:8081
#format.class=io.confluent.connect.s3.format.json.JsonFormat
partitioner.class=io.confluent.connect.storage.partitioner.DefaultPartitioner
#partitioner.class=io.confluent.connect.storage.partitioner.TimeBasedPartitioner
schema.generator.class=io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator
schema.compatibility=NONE
#partition.field.name=
#partition.duration.ms=
#path.format=
#locale=
#timezone=
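
A note on why the objects come out as .bin: the S3 sink derives the file extension from format.class, and ByteArrayFormat writes the raw record bytes as *.bin. To get *.json objects, both the format and the value converter need to handle JSON. A minimal sketch of the lines to change in the sink config above, assuming the topic holds schemaless JSON (so schemas stay disabled):

# Write records as line-delimited JSON into *.json objects:
format.class=io.confluent.connect.s3.format.json.JsonFormat
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

A connector-level value.converter overrides the worker default, so the ByteArrayConverter line above should be removed at the same time as ByteArrayFormat.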

Kafka Connect config file: connect-distributed.properties

# A list of host/port pairs to use for establishing the initial connection to the Kafka cluster.
bootstrap.servers=localhost:9092

# unique name for the cluster, used in forming the Connect cluster group. Note that this must not conflict with consumer group IDs
group.id=connect-cluster

# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

value.converter.schemas.enable=false
key.converter.schemas.enable=false

offset.storage.topic=connect-offsets
offset.storage.replication.factor=1
#offset.storage.partitions=25

config.storage.topic=connect-configs
config.storage.replication.factor=1

status.storage.topic=connect-status
status.storage.replication.factor=1
#status.storage.partitions=5

# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000



plugin.path=/usr/share/java,/home/bhargav/confluent-6.2.0/share/confluent-hub-components,/opt/connectors
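
One thing worth flagging: starting the worker with connect-distributed.properties runs Connect in distributed mode, and in distributed mode connector configs are submitted to the Connect REST API as JSON rather than loaded from a .properties file. The quickstart-s3.properties keys map one-to-one; a sketch of the equivalent request body, assuming the worker's REST port is the default 8083 (POST it to http://localhost:8083/connectors):

{
  "name": "s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "connect-test",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "s3.region": "us-east-2",
    "s3.bucket.name": "snowflakelab2",
    "s3.part.size": "5242880",
    "flush.size": "3"
  }
}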

Thanks

[Attached screenshots: Kafka Connect and S3 sink config]