Hi everyone,
Has anyone got experience of running Kafka Backup with a cluster secured using SSL and Kerberos? Is it possible to do that?
I’m struggling a bit to get the security settings working for Kafka Backup, even though I have no problem connecting a producer or consumer to the cluster.
If anyone could point me in the right direction, I’d be really grateful.
Hey Mark,
did you check the GitHub issues?
There are some examples there; I think they don’t fit your env 100%, but they may be a good starting point (see also the note after the quoted issues below):
(Linked GitHub issue, opened 17 Jan 2020, labels: documentation, help wanted, with a checklist covering Plaintext and TLS setups.)

(Linked GitHub issue, opened and closed 17 Jan 2020:)
It's not clear to me how to connect it to a Kafka cluster which requires SSL.
For now… I'm playing with a simple case: I'm starting the kafka-backup sink as a standalone connector right on a Kafka cluster node. Kafka Connect starts fine and the AdminClient connects well, but the backup sink cannot connect. The consumer config still shows `security.protocol = PLAINTEXT` during the initialisation stage.
So far I have tried this in my connect-backup-sink.properties:
```
cluster.bootstrap.servers=kafka5.tld:9093
cluster.security.protocol=SSL
cluster.ssl.truststore.type=PKCS12
cluster.ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
cluster.ssl.truststore.password=[CENSORED]
# the same settings without any prefix:
bootstrap.servers=kafka5.tld:9093
security.protocol=SSL
ssl.truststore.type=PKCS12
ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
ssl.truststore.password=[CENSORED]
# with the "consumer." prefix:
consumer.bootstrap.servers=kafka5.tld:9093
consumer.security.protocol=SSL
consumer.ssl.truststore.type=PKCS12
consumer.ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
consumer.ssl.truststore.password=[CENSORED]
# with the "cluster.consumer." prefix:
cluster.consumer.bootstrap.servers=kafka5.tld:9093
cluster.consumer.security.protocol=SSL
cluster.consumer.ssl.truststore.type=PKCS12
cluster.consumer.ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
cluster.consumer.ssl.truststore.password=[CENSORED]
```
Am I missing something?
I still see this in the logs:
```
[2020-01-17 09:36:38,028] INFO ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafka5.tld:9093]
check.crcs = true
client.dns.lookup = default
client.id = connector-consumer-chrono-qa-backup-sink-0
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = connect-chrono-qa-backup-sink
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:347)
```
The AdminClient is fine, though:
```
[2020-01-17 09:36:38,114] INFO AdminClientConfig values:
bootstrap.servers = [kafka5.tld:9093]
client.dns.lookup = default
client.id =
connections.max.idle.ms = 300000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 120000
retries = 5
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = SSL
security.providers = null
send.buffer.bytes = 131072
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = /opt/kafka/config/ssl/truststore.pkcs12
ssl.truststore.password = [hidden]
ssl.truststore.type = PKCS12
(org.apache.kafka.clients.admin.AdminClientConfig:347)
```
(Linked GitHub issue, opened 5 Jun 2020 and closed the next day, label: question:)
Hi,
Do you have an example of a docker-compose file to run Kafka Backup in SSL mode?
…
Thanks
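
A note on the `security.protocol = PLAINTEXT` symptom in the issue quoted above: in Kafka Connect, the consumer a sink task uses is built from the worker configuration, not from the connector properties file, and it only picks up security settings that carry a `consumer.` prefix there (on Kafka 2.3+ you can alternatively use `consumer.override.`-prefixed settings in the connector config, if the worker's override policy allows it). That would explain why none of the prefix variants tried in connect-backup-sink.properties changed the consumer's protocol. A minimal worker-properties sketch for the plain SSL case, reusing the host and truststore path from the quoted config (adapt to your environment):
```
# connect-standalone worker properties (sketch, SSL only)
bootstrap.servers=kafka5.tld:9093
security.protocol=SSL
ssl.truststore.type=PKCS12
ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
ssl.truststore.password=[CENSORED]

# The sink task's consumer only sees worker settings carrying this prefix:
consumer.security.protocol=SSL
consumer.ssl.truststore.type=PKCS12
consumer.ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
consumer.ssl.truststore.password=[CENSORED]

# (Converters and offset.storage.file.filename for standalone mode omitted.)
```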
Hey,
Thank you. That first one does look very similar to what I have in my configuration, but it gives me something to compare against. I’ll go through it and see what I may have missed.
Hey,
Great, please keep me posted!
Hey,
Thanks for your help… it did help me resolve the issue. In the end, it was down to the properties files that the scripts created: they were missing some of the SASL properties. So I ended up creating them manually and then running backup and restore just using connect-standalone. It looks like it’s working fine now.
Once again, thank you for pointing me in the right direction. I had seen those pages before, but hadn’t really grasped how useful they actually were.
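
For anyone landing here with the same setup: for a cluster secured with SSL and Kerberos, the relevant client settings are SASL_SSL with the GSSAPI mechanism, and they have to appear both unprefixed (for the worker) and with the `consumer.` prefix (for the sink task's consumer). The sketch below is illustrative only; the service name, principal, keytab and truststore paths are placeholders, not the exact values used here:
```
# Worker properties sketch for a SASL_SSL + Kerberos (GSSAPI) cluster.
# Principal, keytab and truststore paths are placeholders.
bootstrap.servers=kafka5.tld:9093
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true storeKey=true \
  keyTab="/etc/security/keytabs/kafka-backup.keytab" \
  principal="kafka-backup@EXAMPLE.COM";
ssl.truststore.type=PKCS12
ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
ssl.truststore.password=[CENSORED]

# Repeated with the "consumer." prefix so the sink task's consumer gets them:
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=GSSAPI
consumer.sasl.kerberos.service.name=kafka
consumer.sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true storeKey=true \
  keyTab="/etc/security/keytabs/kafka-backup.keytab" \
  principal="kafka-backup@EXAMPLE.COM";
consumer.ssl.truststore.type=PKCS12
consumer.ssl.truststore.location=/opt/kafka/config/ssl/truststore.pkcs12
consumer.ssl.truststore.password=[CENSORED]
```
Backup (and, analogously, restore) can then be run with the stock standalone runner, passing the worker properties first and the connector properties second:
```
bin/connect-standalone.sh connect-standalone.properties connect-backup-sink.properties
```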