Problem activating SSL security

Hi,

I have a broker with Confluent Platform 7.0 installed on a remote server based on Ubuntu.
I'm trying to activate SSL security according to this tutorial (related to the Confluent Platform):

https://docs.confluent.io/platform/current/security/security_tutorial.html#security-tutorial

and this one, more generally related only to Kafka:

Apache Kafka.

When I try to check whether everything is OK with this command:

 openssl s_client -debug -connect localhost:9093 -tls1

I receive this error:

140274783987520:error:0200206F:system library:connect:Connection refused:../crypto/bio/b_sock2.c:110:
140274783987520:error:2008A067:BIO routines:BIO_connect:connect error:../crypto/bio/b_sock2.c:111:
connect:errno=111

What does it mean? Can someone help me?
I need to activate SSL to secure the producer-broker and broker-consumer communication, and I need producers and consumers to authenticate in order to write to or read from a topic.
Any help is appreciated.
Thanks.

Hi @GiuseppeR

Looks like something is blocking the connection.

Do you run the command from the host where the Kafka broker is running?

Best,
Michael

Hi @mmuehlbeyer,

yes, I'm trying this command on the same host where the Kafka broker is running.
Thanks…

Hmm, strange.

what does netstat -anp | grep 9093 say?
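For reference, errno=111 is ECONNREFUSED: the TCP connection was actively refused because no process is listening on that port. A quick sketch to probe the port from the broker host (assuming bash; ss is the modern replacement for netstat):

```shell
# Is anything listening on TCP 9093? (-l listening, -t tcp, -n numeric)
ss -ltn 2>/dev/null | grep 9093 || echo "nothing listening on 9093"

# bash can also probe a TCP port via its /dev/tcp pseudo-device:
(echo > /dev/tcp/localhost/9093) 2>/dev/null \
  && echo "9093 open" || echo "9093 refused"
```

If nothing is listening, the broker either failed to start or is not binding 9093, and the broker log is the next place to look.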

Hi,

the suggested command doesn't show anything.
Today the Confluent Platform doesn't start… Kafka doesn't start. In the log I have this error:

[2021-11-11 08:30:38,862] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)
java.lang.ClassNotFoundException: kafka.security.auth.AclAuthorizer
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
        at java.base/java.lang.Class.forName0(Native Method)
        at java.base/java.lang.Class.forName(Class.java:398)
        at org.apache.kafka.common.utils.Utils.loadClass(Utils.java:418)
        at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:407)
        at kafka.security.authorizer.AuthorizerUtils$.createAuthorizer(AuthorizerUtils.scala:30)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:2389)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:2187)
        at kafka.Kafka$.buildServer(Kafka.scala:66)
        at kafka.Kafka$.main(Kafka.scala:86)
        at kafka.Kafka.main(Kafka.scala)

Yesterday Confluent started… today it doesn't. Could it be that SSL makes the platform unstable?
I'm thinking of using Apache Kafka without the Confluent Platform.
THX.
g

I guess it's related to a config issue.

Is ZooKeeper running without any errors/warnings?

Maybe it's worth taking a step back and starting without security to make sure everything is up and running, and then starting over with adding security.

Thank you, Michael, for your support.
Yesterday a fresh Confluent Platform worked correctly.
Maybe it is a config problem in the properties files.
I'll try a new installation of Confluent and follow the tutorial a second time:

https://docs.confluent.io/platform/current/security/security_tutorial.html#security-tutorial

Hi,

I made a fresh Confluent Platform installation… after solving some problems getting all components to start, I have a Confluent Platform running.
I followed the security tutorial:

https://docs.confluent.io/platform/current/security/security_tutorial.html#security-tutorial

when I try to verify whether SSL is active using this command:

openssl s_client -debug -connect localhost:9093 -tls1

I receive these errors:

140526638126912:error:0200206F:system library:connect:Connection refused:../crypto/bio/b_sock2.c:110:
140526638126912:error:2008A067:BIO routines:BIO_connect:connect error:../crypto/bio/b_sock2.c:111:
connect:errno=111

If I try to stop and restart the Confluent platform with the command:

confluent local services stop

I receive this error for Control Center:

Stopping Control Center
Error: Control Center failed to stop

So, I have a few questions.

  • Why doesn't the Confluent Platform stop?
  • When the Confluent Platform with SSL is up, Control Center does not load in the browser: unable to connect.
  • Why are authentication and security not active? Is there some wrong configuration?

These are my config files:

server.properties

############## Server Basics

######## The id of the broker. This must be set to a unique integer for each broker.
#broker.id=0
broker.id.generation.enable=true

############## Socket Server Settings

####### The address the socket server listens on. It will get the value returned from
####### java.net.InetAddress.getCanonicalHostName() if not configured.

listeners=SSL://:9093,SASL_SSL://:9094
security.inter.broker.protocol=SSL
ssl.client.auth=required

ssl.truststore.location=/var/ssl/private/kafka.server.truststore.jks
ssl.truststore.password=mypassword
ssl.keystore.location=/var/ssl/private/kafka.server.keystore.jks
ssl.keystore.password=mypassword

sasl.enabled.mechanisms=PLAIN

metric.reporters=io.confluent.metrics.reporter.ConfluentMetricsReporter
confluent.metrics.reporter.security.protocol=SASL_SSL
confluent.metrics.reporter.ssl.truststore.location=/var/ssl/private/kafka.server.truststore.jks
confluent.metrics.reporter.ssl.truststore.password=mypassword
confluent.metrics.reporter.sasl.mechanism=PLAIN
confluent.metrics.reporter.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="kafka-broker-metric-reporter" \
  password="mypassword";

authorizer.class.name=kafka.security.auth.AclAuthorizer
super.users=User:kafka

listener.name.sasl_ssl.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="admin" \
  password="mypassword" \
  user_admin="admin-giuseppe" \
  user_kafkabroker1="kafkabroker";

listeners=SASL_SSL://localhost:9092
advertised.listeners=SASL_SSL://localhost:9092

sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
security.inter.broker.protocol=SASL_PLAINTEXT
ssl.endpoint.identification.algorithm=HTTPS

authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer

allow.everyone.if.no.acl.found=true

######## The number of threads that the server uses for receiving requests from the network and sending responses to the network
num.network.threads=3

######## The number of threads that the server uses for processing requests, which may include disk I/O
num.io.threads=8

######## The send buffer (SO_SNDBUF) used by the socket server
socket.send.buffer.bytes=102400

######## The receive buffer (SO_RCVBUF) used by the socket server
socket.receive.buffer.bytes=102400

######## The maximum size of a request that the socket server will accept (protection against OOM)
socket.request.max.bytes=104857600

#################### Log Basics

log.dirs=/tmp/kafka-logs

num.partitions=1

num.recovery.threads.per.data.dir=1

################## Internal Topic Settings

offsets.topic.replication.factor=1
transaction.state.log.replication.factor=1
transaction.state.log.min.isr=1

log.retention.hours=168

log.segment.bytes=1073741824

log.retention.check.interval.ms=300000

zookeeper.connect=localhost:2181

zookeeper.connection.timeout.ms=18000

group.initial.rebalance.delay.ms=0
confluent.license.topic.replication.factor=1
confluent.metadata.topic.replication.factor=1

confluent.security.event.logger.exporter.kafka.topic.replicas=1
confluent.balancer.enable=true
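A note on the file above: several keys are defined twice (listeners, security.inter.broker.protocol, sasl.enabled.mechanisms, authorizer.class.name), and with Java properties the last occurrence silently wins, so the effective inter-broker protocol here is SASL_PLAINTEXT, not SSL. A small sketch to spot duplicate keys (sample lines are inlined for illustration; point awk at the real file instead):

```shell
# Print each key that appears more than once; the last value wins at runtime.
printf '%s\n' \
  'listeners=SSL://:9093,SASL_SSL://:9094' \
  'security.inter.broker.protocol=SSL' \
  'listeners=SASL_SSL://localhost:9092' \
  'security.inter.broker.protocol=SASL_PLAINTEXT' |
awk -F= '!/^[[:space:]]*(#|$)/ { if (seen[$1]++ == 1) print "duplicate key:", $1 }'
# prints:
# duplicate key: listeners
# duplicate key: security.inter.broker.protocol
```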


connect-distributed.properties

status.storage.topic=connect-status
status.storage.replication.factor=1

plugin.path=/usr/share/java

security.protocol=SASL_SSL
ssl.truststore.location=/var/ssl/private/kafka.client.truststore.jks
ssl.truststore.password=mypassword
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="station" \
  password="mypassword";


zookeeper.properties

dataDir=/tmp/zookeeper
clientPort=2181

maxClientCnxns=0

admin.enableServer=false

tickTime=2000
dataDir=/var/lib/zookeeper/
clientPort=2181
initLimit=5
syncLimit=2
server.1=localhost:2888:3888

####### # Authentication for SSL
authProvider.sasl=org.apache.zookeeper.server.auth.SASLAuthenticationProvider

ssl.truststore.location=/var/ssl/private/kafka.server.truststore.jks
ssl.truststore.password=mypassword
ssl.keystore.location=/var/ssl/private/kafka.server.keystore.jks
ssl.keystore.password=mypassword
ssl.key.password=mypassword

authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider
requireClientAuthScheme=sasl
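A side note on zookeeper.properties above: dataDir and clientPort are each set twice, and ZooKeeper's own TLS settings use camelCase key names plus a dedicated secure port and the Netty connection factory, so the lowercase Kafka-style ssl.truststore.* keys shown are most likely ignored by ZooKeeper. A hedged sketch of what ZooKeeper-side TLS usually looks like (verify key names against the ZooKeeper admin guide for your version):

```properties
secureClientPort=2182
serverCnxnFactory=org.apache.zookeeper.server.NettyServerCnxnFactory
ssl.keyStore.location=/var/ssl/private/kafka.server.keystore.jks
ssl.keyStore.password=mypassword
ssl.trustStore.location=/var/ssl/private/kafka.server.truststore.jks
ssl.trustStore.password=mypassword
```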


kafka_server_jaas.conf

kafkaKafkaServer {
org.apache.kafka.common.security.plain.PlainLoginModule required
username="kafkabroker"
password="mypassword"
user_kafkabroker="kafkabroker-giuseppe"
user_kafka-broker-metric-reporter="kafkabroker-metric-reporter-giuseppe"
user_client="stations";
};

Client {
org.apache.zookeeper.server.auth.DigestLoginModule required
username="kafka-giuseppe"
password="mypassword";
};
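One detail that stands out in kafka_server_jaas.conf: when KAFKA_OPTS points at a JAAS file, the broker looks up a login context named exactly KafkaServer, but the file above declares kafkaKafkaServer, which the broker would not find. A sketch of the expected shape, reusing the entries from the file above:

```conf
KafkaServer {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="kafkabroker"
  password="mypassword"
  user_kafkabroker="kafkabroker-giuseppe"
  user_kafka-broker-metric-reporter="kafkabroker-metric-reporter-giuseppe"
  user_client="stations";
};
```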


zookeeper_jaas.conf

Server {
org.apache.zookeeper.server.auth.DigestLoginModule required
user_super="admin"
user_kafka="kafka-giuseppe";
};


control-center.properties

confluent.controlcenter.data.dir=/tmp/confluent/control-center

ssl.truststore.location=/var/private/ssl/server.truststore.jks
ssl.truststore.password=mypassword
ssl.keystore.location=/var/private/ssl/server.keystore.jks
ssl.keystore.password=mypassword
ssl.key.password=mypassword

security.protocol=SASL_SSL
ssl.truststore.location=/var/ssl/private/kafka.client.truststore.jks
ssl.truststore.password=mypassword
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="station" \
  password="mypassword";

confluent.controlcenter.ui.autoupdate.enable=true

confluent.controlcenter.usage.data.collection.enable=true


ksql-server.properties

########## HTTP ###
listeners=http://0.0.0.0:8088
listeners=http://localhost:8088

advertised.listener=localhost

ssl.truststore.location=/var/private/ssl/server.truststore.jks
ssl.truststore.password=mypassword
ssl.keystore.location=/var/private/ssl/server.keystore.jks
ssl.keystore.password=mypassword
ssl.key.password=mypassword

ksql.logging.processing.topic.auto.create=true

ksql.logging.processing.stream.auto.create=true

bootstrap.servers=localhost:9092

compression.type=snappy

ksql.schema.registry.url=http://localhost:8081


schema-registry.properties

listeners=http://0.0.0.0:8081

kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092

kafkastore.topic=_schemas

debug=false

security.protocol=SASL_SSL
ssl.truststore.location=/var/ssl/private/kafka.client.truststore.jks
ssl.truststore.password=mypassword
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="station" \
  password="mypassword";

ssl.truststore.location=/var/private/ssl/server.truststore.jks
ssl.truststore.password=mypassword
ssl.keystore.location=/var/private/ssl/server.keystore.jks
ssl.keystore.password=mypassword
ssl.key.password=mypassword

schema.registry.ssl.truststore.location=/var/private/ssl/server.truststore.jks
schema.registry.ssl.truststore.password=mypassword
schema.registry.ssl.keystore.location=/var/private/ssl/server.keystore.jks
schema.registry.ssl.keystore.password=mypassword
schema.registry.ssl.key.password=mypassword

Some hints may come from this command:

sudo /home/kafka/Documents/confluent/bin/kafka-server-start /home/kafka/Documents/confluent/etc/kafka/server.properties
[2021-11-11 17:36:20,236] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2021-11-11 17:36:20,574] INFO Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation (org.apache.zookeeper.common.X509Util)
[2021-11-11 17:36:20,602] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)
java.lang.IllegalArgumentException: requirement failed: inter.broker.listener.name must be a listener name defined in advertised.listeners. The valid options based on currently configured listeners are SASL_SSL
        at kafka.server.KafkaConfig.validateValues(KafkaConfig.scala:2495)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:2470)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:1902)
        at kafka.server.KafkaServerStartable$.fromProps(KafkaServerStartable.scala:34)
        at kafka.Kafka$.main(Kafka.scala:68)
        at kafka.Kafka.main(Kafka.scala)

Where some configuration is not present, it is because I left the standard value.
Hoping this can help to activate SSL on my Confluent Platform.
Thanks.
g

Hi,

thanks for the config files and your description, I will check this in the next few days.

did you check the log files?
any errors there?

how did you start the whole stack?
according to your notes you've started the broker with kafka-server-start but also used confluent local services start?

Is my understanding correct?
If yes, I think this won't work properly, as confluent local services start normally
creates a temp file location where it starts an "isolated" environment.

Hi @mmuehlbeyer

thank you for your valuable help.
I start/stop the stack with the command:

confluent local services start/stop

I use these commands:

export KAFKA_OPTS="-Djava.security.auth.login.config=etc/kafka/zookeeper_jaas.conf"
bin/zookeeper-server-start etc/kafka/zookeeper.properties

according to the tutorial… though I may have done something wrong or skipped a step.
Today nothing works: the Kafka component doesn't start. In the /tmp/confluent/kafka/logs/server.log file I have:

[2021-11-12 08:31:36,006] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2021-11-12 08:31:36,523] INFO Setting -D jdk.tls.rejectClientInitiatedRenegotiation=true to disable client-initiated TLS renegotiation (org.apache.zookeeper.common.X509Util)
[2021-11-12 08:31:36,582] ERROR Exiting Kafka due to fatal exception (kafka.Kafka$)
java.lang.IllegalArgumentException: requirement failed: inter.broker.listener.name must be a listener name defined in advertised.listeners. The valid options based on currently configured listeners are SASL_SSL
        at kafka.server.KafkaConfig.validateValues(KafkaConfig.scala:2495)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:2470)
        at kafka.server.KafkaConfig.<init>(KafkaConfig.scala:1902)
        at kafka.server.KafkaServerStartable$.fromProps(KafkaServerStartable.scala:34)
        at kafka.Kafka$.main(Kafka.scala:68)
        at kafka.Kafka.main(Kafka.scala)

and it seems to be the same error I posted yesterday.
Hoping this can help to solve my problems.
Thanks.
g
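The exception is literal: the effective security.inter.broker.protocol (SASL_PLAINTEXT, because the second of the two definitions in server.properties wins) must correspond to a listener present in advertised.listeners, and the only advertised listener is SASL_SSL. A minimal consistent sketch, assuming a single SASL_SSL listener on 9092 as in the second half of that file:

```properties
listeners=SASL_SSL://localhost:9092
advertised.listeners=SASL_SSL://localhost:9092
security.inter.broker.protocol=SASL_SSL
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
```

With those aligned, the broker should get past this validation; the SSL-only 9093 listener from the first half of the file can be re-added afterwards.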

Hi @GiuseppeR

seems you're mixing two ways of starting the stack.
My recommendation is not to use confluent local services start but the Kafka CLI commands as described here:

https://docs.confluent.io/platform/current/installation/installing_cp/zip-tar.html#start-cp

I would start with ZooKeeper and the broker only, to first see if everything is working as expected.
I did a short description some time ago (without security enabled); hope it's helpful anyway.

best,
michael