Hi,
the next step in my cluster setup is SSL.
I modified our old cluster's files to work with the new one and everything seemed to work just fine, except that the client application could not access the new Kafka cluster, with only the less-than-helpful message:
[AdminClient clientId=adminclient-1] Connection to node -3 (host/ip:19093) terminated during authentication. This may happen due to any of the following reasons: (1) Authentication failed due to invalid credentials with brokers older than 1.0.0, (2) Firewall blocking Kafka TLS traffic (eg it may only allow HTTPS traffic), (3) Transient network issue.
Of course the network is fine…
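(A check along these lines, with host and port taken from my broker 1 config below, should at least show whether the port answers with a TLS handshake or with plain text:)
openssl s_client -connect host1:29094 </dev/null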
Anyhow, I played around a bit trying to connect to the broker directly to create a topic, when I realised that this also didn't work, even locally inside the Kafka broker container (same message).
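(By "connect to the broker directly" I mean something along these lines, run inside the container; the client properties file and topic name are placeholders, not my exact values:)
podman exec -it kafka-1 kafka-topics --create --topic ssl-test \
  --bootstrap-server localhost:29094 \
  --command-config /tmp/client-ssl.properties
where /tmp/client-ssl.properties just contains security.protocol=SSL plus the truststore location, type and password.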
I then found that apparently I had not activated SSL properly, so all SSL-enabled connection attempts fail.
I then started trying to find out why enabling SSL didn't work, and I found that evidently the listener names are not just names but actually have a meaning, so
this enables SSL:
-e KAFKA_LISTENERS='SSL://localhost:29094,EXTERNAL://localhost:19091' \
-e KAFKA_LISTENER_SECURITY_PROTOCOL_MAP='CONTROLLER:SSL,SSL:SSL,EXTERNAL:SSL' \
but
-e KAFKA_LISTENERS='SSL_LISTENER://localhost:29094,EXTERNAL://localhost:19091' \
-e KAFKA_LISTENER_SECURITY_PROTOCOL_MAP='CONTROLLER:SSL,SSL_LISTENER:SSL,EXTERNAL:SSL' \
does not.
Weird, but ok.
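(For context, my understanding is that the image translates the KAFKA_* environment variables into broker properties, so the working variant above should end up as something like this in the rendered broker config:)
listeners=SSL://localhost:29094,EXTERNAL://localhost:19091
listener.security.protocol.map=CONTROLLER:SSL,SSL:SSL,EXTERNAL:SSL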
Now I see that SSL is enabled, but I cannot get it to work: instead of using the key at the keystore location I provided, it keeps asking for KAFKA_SSL_KEYSTORE_FILENAME:
podman logs kafka-1
===> User
uid=1000(appuser) gid=1000(appuser) groups=1000(appuser)
===> Configuring ...
Running in KRaft mode...
SSL is enabled.
KAFKA_SSL_KEYSTORE_FILENAME is required.
Command [/usr/local/bin/dub ensure KAFKA_SSL_KEYSTORE_FILENAME] FAILED !
If I provide that, it complains that it cannot find the file, since it is searching in the wrong path (not the KAFKA_SSL_KEYSTORE_LOCATION I provided):
podman logs kafka-1
===> User
uid=1000(appuser) gid=1000(appuser) groups=1000(appuser)
===> Configuring ...
Running in KRaft mode...
SSL is enabled.
Command [/usr/local/bin/dub path /etc/kafka/secrets/host1.pfx exists] FAILED !
I can trick it into finding the file by pointing to the correct path inside the container, but then it just asks for KAFKA_SSL_KEY_CREDENTIALS next.
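(With "trick" I mean roughly the following, based purely on the path in the log above, i.e. mounting the keystore directory where the script is looking and giving it just the file name:)
-v $humio_working_dir/keystore:/etc/kafka/secrets:z \
-e KAFKA_SSL_KEYSTORE_FILENAME='host1.pfx' \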
It's supposed to use the key in the keystore, and I think I provided everything it needs, but it does not seem to pick any of it up:
podman run -d \
--name kafka-1 \
-h=host1 \
-p 19091:19091 \
-p 29094:29094 \
-v $humio_working_dir/data/cpkafka-broker:/data/cpkafka-data:Z \
-v $humio_working_dir/keystore:/keystore:z \
--secret=KAFKA_ssl_keystore_password,type=env,target=KAFKA_ssl_keystore_password \
--secret=KAFKA_ssl_truststore_password,type=env,target=KAFKA_ssl_truststore_password \
-e KAFKA_LISTENERS='SSL://localhost:29094,EXTERNAL://localhost:19091' \
-e KAFKA_LISTENER_SECURITY_PROTOCOL_MAP='CONTROLLER:SSL,SSL:SSL,EXTERNAL:SSL' \
-e KAFKA_ADVERTISED_LISTENERS='SSL://host1:29094,EXTERNAL://host1:19091' \
-e KAFKA_INTER_BROKER_LISTENER_NAME='SSL' \
-e KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS=0 \
-e KAFKA_BROKER_RACK='rack-0' \
-e KAFKA_LOG_DIRS='/data/cpkafka-data' \
-e KAFKA_MIN_INSYNC_REPLICAS=2 \
-e KAFKA_CONFLUENT_LICENSE_TOPIC_REPLICATION_FACTOR=2 \
-e KAFKA_CONFLUENT_CLUSTER_LINK_ENABLE='true' \
-e KAFKA_CONFLUENT_REPORTERS_TELEMETRY_AUTO_ENABLE='false' \
-e KAFKA_NODE_ID=4 \
-e CLUSTER_ID='<clusterid>' \
-e KAFKA_CONTROLLER_QUORUM_VOTERS='1@host1:29091,2@host2:29092,3@host3:29093' \
-e KAFKA_PROCESS_ROLES='broker' \
-e KAFKA_CONTROLLER_LISTENER_NAMES='CONTROLLER' \
-e KAFKA_SSL_KEYSTORE_LOCATION='/keystore/host1.pfx' \
-e KAFKA_ssl_keyStore_type='PKCS12' \
-e KAFKA_ssl_truststore_location='/keystore/truststore.pfx' \
-e KAFKA_ssl_trustStore_type='PKCS12' \
-e KAFKA_ssl_client_auth='requested' \
-e KAFKA_LOG4J_ROOT_LOGLEVEL="DEBUG" \
-e KAFKA_LOG4J_TOOLS_LOGLEVEL=ERROR \
-e KAFKA_LOG4J_LOGGERS="kafka=DEBUG,kafka.controller=WARN,kafka.log.LogCleaner=WARN,state.change.logger=WARN,kafka.producer.async.DefaultEventHandler=WARN" \
confluentinc/cp-kafka:latest
(Keystore/Truststore passwords are passed in via secrets)
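(In case the keystore file itself is the suspect: this is the kind of sanity check I mean when I say the PKCS12 store should be fine; the store password is prompted interactively.)
keytool -list -keystore $humio_working_dir/keystore/host1.pfx -storetype PKCS12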
So, a couple of questions:
- What are the required parameters to turn on SSL? Am I missing any?
- If not, why does it not take them?
- I tried getting more info via debug logging, but it's not helping at all (no change in the output). What is the proper way to get debug messages for this? The default output is rather useless.
Thanks