Heroku Kafka Connect to Snowflake using a Dockerfile is not working

Hello, I have created a Dockerfile to install Kafka Connect and the Snowflake Kafka connector on Heroku and set up the environment. This is the Dockerfile:

FROM confluentinc/cp-kafka-connect:5.5.3
ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components,/etc/kafka-connect,/etc/kafka-connect/jar"

RUN confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:1.5.1 \
 && confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.0.1 \
 && update-ca-certificates

# Create plugin directory
RUN mkdir -p /usr/share/java/plugins \
&& mkdir -p /usr/share/java/kafka-connect-jdbc \
&& mkdir -p /etc/kafka-connect/kafka-logs 

#install vim and update 
# Install unzip, zip, apt-utils and vim
RUN sed -i 's;http://archive.debian.org/debian/;http://deb.debian.org/debian/;' /etc/apt/sources.list \
   && apt-get update \
   && apt-get install -y --no-install-recommends apt-utils unzip zip vim

# Copy config and certs
COPY app/log4j.properties /etc/kafka/log4j.properties
COPY app/connect-log4j.properties /etc/kafka/connect-log4j.properties
COPY .build/certs/*.crt /usr/local/share/ca-certificates/
# Register the copied certs. This must run after the COPY above;
# the earlier invocation runs before the certs exist in the image.
RUN update-ca-certificates
COPY app/connect-distributed.properties /etc/kafka-connect/connect-distributed.properties
COPY app/start.sh /etc/kafka-connect/start.sh
COPY app/start_test.sh /etc/kafka-connect/start_test.sh
COPY app/setup-certs.sh /etc/kafka-connect/setup-certs.sh

# Only the shell scripts need to be executable; the .properties files are read, not run
RUN chmod +x /etc/kafka-connect/start.sh \
&& chmod +x /etc/kafka-connect/start_test.sh \
&& chmod +x /etc/kafka-connect/setup-certs.sh


CMD ["/etc/kafka-connect/start.sh"]
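One thing worth checking on Heroku: a web dyno must bind the port Heroku passes in $PORT, and the dyno is reported as crashed (H10) if the process exits. The start.sh referenced above isn't shown in the post, so purely as an illustration, here is a hypothetical sketch that points the Connect REST listener at $PORT before launching the worker (the property name rest.port, the paths, and the fallback port are assumptions, not taken from the original post):

```shell
#!/usr/bin/env bash
# Hypothetical start.sh sketch: rewire the Connect REST listener to
# Heroku's $PORT, then run the worker in the foreground so the dyno
# stays alive. Paths and property names are assumptions.

PROPS="${PROPS:-/etc/kafka-connect/connect-distributed.properties}"
REST_PORT="${PORT:-8083}"   # Heroku injects PORT; fall back to 8083 locally

# Rewrite (or append) the REST listener port in the worker config.
if grep -q '^rest.port=' "$PROPS" 2>/dev/null; then
  sed -i "s/^rest.port=.*/rest.port=${REST_PORT}/" "$PROPS"
else
  echo "rest.port=${REST_PORT}" >> "$PROPS"
fi

# Launch the worker in the foreground (skipped here if the binary
# isn't on the PATH, e.g. outside the container).
if command -v connect-distributed >/dev/null; then
  exec connect-distributed "$PROPS"
fi
```

If the worker runs in the background or the script exits, Heroku sees the dyno die and every subsequent request gets the H10/503 shown below.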

This is the connector configuration that gets posted to the /connectors endpoint (the connect-distributed.properties worker file referenced in the Dockerfile is a separate file in Java-properties format):

{
  "name": "KafkaSinkConnectortoSnowflakes",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "tasks.max": "8",
    "topics": "",
    "buffer.count.records": "10000",
    "buffer.flush.time": "60",
    "buffer.size.bytes": "5000000",
    "snowflake.url.name": "",
    "snowflake.user.name": "",
    "snowflake.private.key": "",
    "snowflake.private.key.passphrase": "",
    "snowflake.database.name": "",
    "snowflake.schema.name": "PUBLIC",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
  }
}
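A config like this is JSON for the Connect REST API, and a stray trailing comma before a closing brace will make the POST fail with a parse error, so it can help to validate the payload before submitting it. A sketch, using a trimmed copy of the config above (the file path and endpoint are assumptions):

```shell
#!/usr/bin/env bash
# Sketch: validate the connector config JSON, then submit it to the
# Connect REST API. File path and host are assumptions; the config is
# a trimmed copy of the one in the post.

CONFIG=/tmp/snowflake-sink.json

cat > "$CONFIG" <<'EOF'
{
  "name": "KafkaSinkConnectortoSnowflakes",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "tasks.max": "8",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
  }
}
EOF

# Fail fast if the payload is not valid JSON (e.g. a trailing comma).
python3 -m json.tool "$CONFIG" >/dev/null && echo "payload is valid JSON"

# Then post it (commented out here; adjust the host to your deployment):
# curl -s -X POST -H "Content-Type: application/json" \
#      --data @"$CONFIG" http://localhost:8083/connectors
```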

I am getting an App Crash error on Heroku:

at=error code=H10 desc="App crashed" method=POST path="/connectors" host=xxxx-test.herokuapp.com request_id=609d1ee1-fc0a-f2e0-e4d7-f5fdd0ed369d fwd="" dyno= connect=0ms service=0ms status=503 bytes=561 protocol=http tls_version=tls1.2

I've not tried running Kafka Connect on Heroku, so I don't have a proper answer here, but one observation is that Kafka Connect will return a 503 error if you call the /connectors endpoint before it has finished initialising.

Thanks Robin. What can we usually do to wait for Kafka Connect to initialise before /connectors is called?

This is a bash script that I use for that purpose. It waits until a 200 HTTP code is returned.

echo -e "\n\n=============\nWaiting for Kafka Connect to start listening on localhost ⏳\n=============\n"
while [ $(curl -s -o /dev/null -w %{http_code} http://localhost:8083/connectors) -ne 200 ] ; do
  echo -e "\t" $(date) " Kafka Connect listener HTTP state: " $(curl -s -o /dev/null -w %{http_code} http://localhost:8083/connectors) " (waiting for 200)"
  sleep 5
done
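One possible refinement of the loop above is to cap the number of retries, so a worker that never comes up fails the script instead of looping forever. A sketch of that idea (the function name, defaults, and URL are my own, not part of Robin's script):

```shell
#!/usr/bin/env bash
# Sketch: same wait-for-200 idea as above, but with a retry cap.
# Function name, defaults, and URL are assumptions.

wait_for_connect() {
  url="${1:-http://localhost:8083/connectors}"
  max_tries="${2:-60}"
  delay="${3:-5}"
  i=0
  while [ "$i" -lt "$max_tries" ]; do
    status=$(curl -s -o /dev/null -w '%{http_code}' "$url")
    if [ "$status" = "200" ]; then
      return 0
    fi
    echo "$(date) Kafka Connect listener HTTP state: $status (waiting for 200)"
    sleep "$delay"
    i=$((i + 1))
  done
  return 1   # gave up: Connect never returned 200
}
```

Usage would look like `wait_for_connect && curl -s -X POST ... http://localhost:8083/connectors`, so the connector is only submitted once the worker is actually listening.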