Kafka Connect not seeing S3SinkConnector

Hi

I have downloaded the S3 connector libs from Confluent and added them into the folder
/usr/share/java/kafka/s3-connect

I can see all the necessary jars under the above location.
I also added the following to
/config/connect-standalone.properties:

plugin.path=/usr/share/java/kafka/s3-connect

But when I run this command

curl -i -X PUT -H "Accept:application/json" \
    -H  "Content-Type:application/json" http://localhost:8083/connectors/sink/config \
    -d '
 {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "key.converter":"org.apache.kafka.connect.storage.StringConverter",
        "tasks.max": "1",
        "topics": "cats",
        "s3.region": "eu-west-1",
        "s3.bucket.name": "snowplow-kafka-s3-sink-test",
        "flush.size": "65536",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.avro.AvroFormat",
        "schema.generator.class": "io.confluent.connect.storage.hive.schema.DefaultSchemaGenerator",
        "schema.compatibility": "NONE",
        "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
        "transforms": "AddMetadata",
        "transforms.AddMetadata.type": "org.apache.kafka.connect.transforms.InsertField$Value",
        "transforms.AddMetadata.offset.field": "_offset",
        "transforms.AddMetadata.partition.field": "_partition"
    }
'

I get this error:
HTTP/1.1 500 Internal Server Error
{"error_code":500, "message": "Failed to find any class that implements connector and which name matches io.confluent.connect.s3.S3SinkConnector"}

Really struggling. I copied "kafka-connect-s3-10.5.9.jar" into different locations and added that location to CLASSPATH (and to the normal PATH as well), but the code still can't see the class.
Can somebody help with what I'm doing wrong?


How did you get the connector? Download and unzip from the instructions here?
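For reference, here's a sketch of the on-disk layout I'd expect after unzipping the Confluent Hub archive (paths are the ones from this thread, with placeholder names; the lib/ subfolder mirrors the zip). Note that plugin.path is usually set to the *parent* directory, so the worker treats s3-connect as a single plugin and keeps all of its jars on one classloader:

```shell
# Mock of the expected plugin layout, using a temp dir as a stand-in for
# /usr/share/java/kafka. The jar here is an empty placeholder.
root=$(mktemp -d)
mkdir -p "$root/s3-connect/lib"
touch "$root/s3-connect/lib/kafka-connect-s3-10.5.9.jar"
# in connect-standalone.properties you would then set: plugin.path=$root
find "$root" -name '*.jar'
```

With that layout, every jar the connector depends on stays inside one plugin directory instead of each jar being picked up as a separate plugin.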

What does the worker log say when you run it? You should see output like this as it scans for plugins and registers connector classes (my plugin.path is /tmp/plugins):

...
[2024-05-23 13:55:14,152] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.AbstractConnectCli:127)
[2024-05-23 13:55:14,216] INFO Loading plugin from: /tmp/plugins/confluentinc-kafka-connect-s3-10.5.12 (org.apache.kafka.connect.runtime.isolation.PluginScanner:75)
[2024-05-23 13:55:14,505] INFO Registered loader: PluginClassLoader{pluginLocation=file:/tmp/plugins/confluentinc-kafka-connect-s3-10.5.12/} (org.apache.kafka.connect.runtime.isolation.PluginScanner:80)
...
[2024-05-23 13:55:22,167] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:105)
...
[2024-05-23 13:55:22,179] INFO Added alias 'S3Sink' to plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:109)
...
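If the worker log is long, you can filter for just those plugin-scan lines (the file name worker.log is an assumption here; pipe or redirect your worker's output into it):

```shell
# Filter a worker log for the plugin-scan messages. The sample line below is
# copied from the output quoted above; worker.log is an assumed file name.
printf '%s\n' \
  "[2024-05-23 13:55:22,167] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:105)" \
  > worker.log
grep -E "Loading plugin from|Registered loader|Added plugin|Added alias" worker.log
```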

Here are the logs from my pod (retyped, as I work in an AWS workspace and can't copy-paste from it):

Loading plugin from: /usr/share/java/kafka/s3-connect (org.apache.kafka.connect.runtime.isolation.PluginScanner:75)
Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/kafka/s3-connect/}

WARN One or more plugins are missing ServiceLoader manifests may not be usable with plugin.discovery=service_load: [
file:/usr/share/java/kafka/avro/ io.confluent.connect.avro.AvroConverter converter undefined
file:/usr/share/java/kafka/s3-connect/ io.confluent.connect.s3.S3SinkConnector sink 10.5.9
file:/usr/share/java/kafka/s3-connect/ io.confluent.connect.storage.tools.SchemaSourceConnector source 3.6.0
...
and further down:
INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector'
...
INFO Added alias 'S3SinkConnector' to plugin 'io.confluent.connect.s3.S3SinkConnector'

Then in the log I can see it reading s3.properties when creating the connector, and it gets to "Finished starting connectors and tasks".
Then I see:

ERROR error forwarding REST request
org.apache.kafka.connect.runtime.rest.errors.ConnectRestException: Failed to find any class that implements Connector and which name matches io.confluent.connect.s3.S3SinkConnector

But when I run this in the pod:

curl http://localhost:8083/connector-plugins/

[{"class":"io.confluent.connect.s3.S3SinkConnector","type":"sink","version":"10.5.9"}, ...] (and other plugins)

whereas

curl http://localhost:8083/connectors/

doesn't output my connector name.

Yes, the files are downloaded from Amazon S3 Sink Connector | Confluent Hub
and unzipped into the plugin.path folder, as mentioned above.

Grasping at straws a bit, but try running a distributed worker rather than standalone and see if you get the same error.
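To be concrete: start the worker with bin/connect-distributed.sh and a properties file along these lines (a minimal sketch; all values are illustrative except plugin.path, which is copied from your setup above):

```properties
bootstrap.servers=localhost:9092
group.id=connect-cluster
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-status
config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1
plugin.path=/usr/share/java/kafka/s3-connect
```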

What do you mean by a distributed worker?

You mean to use connect-distributed.properties?

I deploy my Docker image in k8s. The pod has an IAM role attached, which can authenticate against MSK.

Hi
I have changed to the distributed properties, but I still get the same error.
Any advice please?