Local Confluent Kafka env -> Mongo Atlas Sink Connect job add failing

Hi all

I have a local Confluent environment and I'm trying to configure a MongoDB sink from a topic to MongoDB Atlas using the code below. It's failing with very little error explanation.

The error I'm getting back:

{"error_code":400,"message":"Connector configuration is invalid and contains the following 1 error(s):\nUnable to connect to the server.\nYou can also find the above list of errors at the endpoint /connector-plugins/{connectorType}/config/validate"}

A similar job writing to a local MongoDB database works.
I can also connect from this machine to the MongoDB Atlas instance using mongosh, so it's not name resolution or ports.

Nothing is reported via confluent local services connect log.

curl -X POST \
  -H "Content-Type: application/json" \
  --data '
      {"name": "mongo-cloud-creator-payments-sink",
        "config": {
          "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
          "topics": "<topic>",
          "connection.uri": "mongodb+srv://<user>:<password>@<cluster>.mongodb.net",
          "database": "<database>",
          "collection": "<collection>",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "value.converter": "org.apache.kafka.connect.json.JsonConverter",
          "value.converter.schemas.enable": false
        }
      }' \
  http://localhost:8083/connectors -w "\n"
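For what it's worth, the same request can be sketched from Python, which also catches malformed JSON locally before the Connect REST API rejects it. The connector class and the angle-bracket placeholders here are assumptions, not values from the original post:

```python
import json
import urllib.request

# Sink connector config; the <...> placeholders must be replaced with real values.
connector = {
    "name": "mongo-cloud-creator-payments-sink",
    "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
        "topics": "<topic>",
        "connection.uri": "mongodb+srv://<user>:<password>@<cluster>.mongodb.net",
        "database": "<database>",
        "collection": "<collection>",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": False,
    },
}

# Serialising here surfaces any structural JSON mistake before the worker does.
payload = json.dumps(connector).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:8083/connectors",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send against a running Connect worker
```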

Nothing jumps out at me. Here are a few things to try…

From the MongoDB connection string docs:

If both authSource and defaultauthdb are unspecified, the client will attempt to authenticate the specified user to the admin database.

So try being explicit with authSource, for example:

      mongodb+srv://<user>:<password>@<cluster>.mongodb.net/?authSource=admin

Also double-check whether this applies to your username or password:

If the username or password includes the following characters:

$ : / ? # [ ] @

those characters must be converted using percent encoding.
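Percent-encoding the credentials can be sketched in Python; the password, user, and host below are made up for illustration:

```python
from urllib.parse import quote_plus

# '@' and ':' are structural characters in a MongoDB URI, so a password
# containing them must be percent-encoded before being spliced in.
password = quote_plus("p@ss:word")  # hypothetical password, not from this thread
print(password)  # p%40ss%3Aword

# hypothetical user/host, shown only to illustrate where the encoded value goes
uri = f"mongodb+srv://myuser:{password}@cluster0.example.mongodb.net"
```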

You could also increase the connection timeout (adding w and retryWrites too, just because I see some examples with them, though based on the docs they shouldn't be necessary), for example:

      mongodb+srv://<user>:<password>@<cluster>.mongodb.net/?connectTimeoutMS=30000&w=majority&retryWrites=true


Thanks for all the ideas; none worked, however. I'm still getting the same error_code=400.


Any chance anyone can post an example of their working sink-to-MongoDB-Atlas connector add code?



Any chance I can ask someone to look over my shoulder at my MongoDB Atlas settings and the settings being used on the connector add, in case I'm using a wrong value somewhere?


Got it working, with some help.
Friendly guy Barry Evans :wink: pinged me via Slack and we had a look. He was replicating what I was doing on his side and got his working; the difference was that his CP stack was deployed inside Docker via the docker-compose.yaml file, while mine was, as mentioned above, the downloaded tar.gz distribution started with the confluent services command.
When I tried his modified JSON (which worked on his environment) on mine, it failed. I redeployed my environment, and it still failed.
So I stopped and deleted everything, deployed the Docker version, installed the MongoDB connector plugin following Robin Moffatt's video, retried the JSON, and it worked.

Seems something is not 100% in the CP stack that's downloaded as the tar.gz file.
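For anyone hitting the same thing: a quick sanity check on a Connect worker is listing /connector-plugins and making sure the MongoDB sink class shows up at all. A sketch below; the worker URL and sample response are illustrative, not captured from this thread:

```python
import json
import urllib.request

def mongo_sink_installed(plugins):
    """True if the MongoDB sink connector class appears in a /connector-plugins response."""
    return any("MongoSinkConnector" in p.get("class", "") for p in plugins)

def fetch_plugins(connect_url="http://localhost:8083"):
    # Queries the Kafka Connect REST API for the installed connector plugins.
    with urllib.request.urlopen(f"{connect_url}/connector-plugins") as resp:
        return json.load(resp)

# Illustrative response shape (class/type/version), not from a real worker:
sample = [{"class": "com.mongodb.kafka.connect.MongoSinkConnector",
           "type": "sink", "version": "1.11.0"}]
print(mongo_sink_installed(sample))  # True
```

If the class is missing from the list, the plugin isn't on the worker's plugin path, which would explain a validation failure regardless of the connection URI.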


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.