Confluent failed to find any class that implements connector

Good day,

I went to https://www.confluent.io/installation/ and downloaded the Confluent Platform ZIP file under “Local”. After unzipping it, I started the Confluent Platform with the following command:

confluent local services start

I can see that all the services started correctly, and I am able to view Control Center through the browser.

This is the folder structure of my Confluent Platform:

[screenshot of the folder structure]

After that, I referred to https://docs.confluent.io/kafka-connect-sftp/current/source-connector/index.html#configuration-properties to install the SFTP connector.

I installed the connector with the following command:

confluent-hub install confluentinc/kafka-connect-sftp:latest

The command ran successfully, and I can see the folder created in /share/confluent-hub-components. [screenshot of the installed connector folder]

Based on the guide on the Confluent website, I need to create a file called sftp.json and then load the connector with the following command:

confluent local services connect connector load CsvSFTP --config sftp.json
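
For reference, my sftp.json follows the example from the SFTP source connector documentation, roughly like the sketch below. The host, credentials, paths and topic are placeholder values, and the exact property names should be double-checked against that page:

{
  "name": "CsvSFTP",
  "config": {
    "connector.class": "io.confluent.connect.sftp.SftpCsvSourceConnector",
    "tasks.max": "1",
    "kafka.topic": "sftp-testing-topic",
    "sftp.host": "localhost",
    "sftp.port": "22",
    "sftp.username": "user",
    "sftp.password": "password",
    "input.path": "/data/input",
    "finished.path": "/data/finished",
    "error.path": "/data/error",
    "input.file.pattern": "csv-sftp-source.csv",
    "schema.generation.enabled": "true"
  }
}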

However, I am hitting the following error:

[meow@localhost confluent-7.0.1]$ confluent local services connect connector load CsvSFTP --config sftp.json
The local commands are intended for a single-node development environment only,
NOT for production usage. https://docs.confluent.io/current/cli/index.html

{
  "error_code": 500,
  "message": "Failed to find any class that implements Connector and which name matches io.confluent.connect.sftp.SftpCsvSourceConnector, available connectors are: PluginDesc{klass=class io.confluent.connect.replicator.ReplicatorSourceConnector, name='io.confluent.connect.replicator.ReplicatorSourceConnector', version='7.0.1', encodedVersion=7.0.1, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/kafka-connect-replicator/'}, PluginDesc{klass=class org.apache.kafka.connect.mirror.MirrorCheckpointConnector, name='org.apache.kafka.connect.mirror.MirrorCheckpointConnector', version='1', encodedVersion=1, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/kafka/'}, PluginDesc{klass=class org.apache.kafka.connect.mirror.MirrorHeartbeatConnector, name='org.apache.kafka.connect.mirror.MirrorHeartbeatConnector', version='1', encodedVersion=1, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/kafka/'}, PluginDesc{klass=class org.apache.kafka.connect.mirror.MirrorSourceConnector, name='org.apache.kafka.connect.mirror.MirrorSourceConnector', version='1', encodedVersion=1, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/kafka/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=connector, typeName='connector', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=sink, typeName='sink', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='7.0.1-ce', encodedVersion=7.0.1-ce, type=source, typeName='source', location='file:/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/java/acl/'}"

It looks like my Confluent Platform did not load the connector that I installed, so I went to edit the plugin.path value in connect-standalone.properties and added the newly installed connector path:

plugin.path=/usr/share/java,/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/confluent-hub-components

I then restarted my Confluent Platform and tried again, but the result is still the same and the error message is still the same. Please advise what mistake I have made.

Also, how can I know whether connect-standalone.properties is being loaded when starting the Confluent Platform? I checked the logs in /home/meow/Workspace/confluentPlatform/confluent-7.0.1/logs, and nothing is appended there except schema-registry.log.

Kindly advise.

After running this you need to restart the Kafka Connect worker. You shouldn’t need to edit connect-standalone.properties if the Confluent Hub client install worked correctly.
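
If you want to see what the Connect worker itself logged when it started (including which plugin paths it scanned), the CLI can show you the worker log. The subcommands below assume a recent Confluent CLI, so check confluent local services connect --help if yours differs:

confluent local services connect log

confluent local current also prints the temporary data directory the local services run from, which is where their logs and generated config should live rather than the logs folder under your installation directory.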

Good day,
May I know whether the following commands are correct?

confluent local services connect stop
confluent local services connect start

Yes, that seems right, see confluent local services connect stop | Confluent Documentation
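
Once it is back up, checking the worker status should confirm it is running again (assuming your CLI version has the per-service status subcommand):

confluent local services connect status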

Good day,
Actually I did restart, but it was still the same.
I just got a solution from Stack Overflow: I edited my plugin.path in etc/schema-registry/connect-avro-distributed.properties, restarted, and then it worked (the exact line I added is shown below).
What I don’t understand is why this is related to Schema Registry. I thought it was supposed to be the standalone one?
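
For reference, the line I added to etc/schema-registry/connect-avro-distributed.properties is the same plugin.path entry as before:

plugin.path=/usr/share/java,/home/meow/Workspace/confluentPlatform/confluent-7.0.1/share/confluent-hub-components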

When you launch the Connect worker via the Confluent CLI it uses that config file, which is why it worked for you when you changed it. I’m still puzzled why the Confluent Hub client didn’t correctly update it for you.

The reason it’s that file is kinda convoluted. It goes something like this:

  • Apache Kafka doesn’t ship with Avro support
  • Confluent CLI launches Kafka Connect worker in distributed mode with Avro support
  • Schema Registry provides the Avro support
  • Therefore the config file for Kafka Connect has to reside in the Schema Registry folder
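
If you want to confirm what the running worker actually loaded, you can query the Connect worker’s REST API (the default port is 8083):

curl -s http://localhost:8083/connector-plugins

io.confluent.connect.sftp.SftpCsvSourceConnector should show up in that list once the plugin.path change has taken effect.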
