Topic: protobuf encoded, sinked to Mongo

Hi all…

I’m trying to sink a topic that’s protobuf based using the following curl command, and it’s failing. Other than having installed the MongoDB source/sink connector in my Connect image, must I install anything else?

=> ‘io.confluent.connect.protobuf.ProtobufConverter’
From some reading I found I need the converter above. I’m wondering if it is not part of the Connect image by default. Is it installed by adding it to the connector config like this:

curl -X POST \
  -H "Content-Type: application/json" \
  --data '
      {"name": "mongo-local-salespayments-sink-pb",
        "config": {
          "key.converter": "",
          "value.converter.schemas.enable": true
        }
      }' \
  http://localhost:8083/connectors -w "\n"

It is installed by default in the connect image, specifically at /usr/share/java/kafka-serde-tools/kafka-connect-protobuf-converter-<VERSION>.jar. Is this JAR not present in your image? Or, is that path not included in the worker classpath? What error message are you seeing?
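To double-check both of those, here is a minimal sketch, assuming the standard confluentinc Connect image layout and a container named `connect` (both assumptions, adjust to your setup):

```shell
# Check whether the protobuf converter JAR ships inside the image
# (path assumes the stock Confluent Connect image layout):
docker exec connect ls /usr/share/java/kafka-serde-tools/ | grep -i protobuf

# Ask the worker's REST API what plugins it actually loaded. On newer
# Connect versions, connectorsOnly=false also lists converters:
curl -s "http://localhost:8083/connector-plugins?connectorsOnly=false" | grep -i protobuf
```

If the JAR is present but the second command shows nothing, the path is likely missing from the worker's plugin path or classpath.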

Might be a couple of things… I did the install and still had problems. I figured out that for the schema registry I needed to point to the service name and not localhost:8081.


Should have known: anything running in a container doesn’t reach other services via localhost…
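For context, in a typical docker-compose setup (service names here are illustrative, not from the thread), each container reaches the others by service name, while localhost inside a container refers to that container itself:

```yaml
# Illustrative compose fragment: service names are the hostnames
# containers use to reach each other.
services:
  schema-registry:
    image: confluentinc/cp-schema-registry
    ports:
      - "8081:8081"   # the host can use localhost:8081; other containers cannot
  connect:
    image: confluentinc/cp-kafka-connect
    environment:
      # from inside the connect container, use the service name:
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
```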

I’ll give it a go with the old container, without having manually installed the lib, to see if this was all me, or something about the path, or just the service name.


… can confirm the problem was caused by me. I switched back to my connector image without the proto lib manually installed…

It still works, so the problem was my use of localhost:8081 as the schema registry URL, which was fixed by swapping localhost for the service name.
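For anyone hitting the same thing, the relevant converter settings in the sink config end up looking roughly like this (a sketch; the `schema-registry` hostname is whatever your service is named):

```json
{
  "value.converter": "io.confluent.connect.protobuf.ProtobufConverter",
  "value.converter.schema.registry.url": "http://schema-registry:8081"
}
```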



This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.