Hi people!
I need to write Kafka Connect's log4j logs to a Kafka topic, to be able to access them more easily.
There is a good blog post on log4j configuration for writing Kafka Connect logs to both stdout and a file: https://forum.confluent.io/t/kafka-connect-change-log-level-and-write-log-to-file/
There is also a dedicated appender type for this: Log4j 2 has a KafkaAppender (see the Log4j 2 Appenders manual: https://logging.apache.org/log4j/2.x/manual/appenders.html), and Kafka itself ships a Log4j 1.x equivalent, KafkaLog4jAppender.
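For comparison, the Log4j 2 XML form of that appender looks roughly like this (the topic name and broker address are just the values from my setup):

<Kafka name="connectKafkaAppender" topic="_connect_log">
  <PatternLayout pattern="[%d] %p %m (%c:%L)%n"/>
  <Property name="bootstrap.servers">localhost:9092</Property>
</Kafka>

Kafka Connect's connect-log4j.properties is in Log4j 1.x format, though, so I used KafkaLog4jAppender.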
But after I added its configuration to my connect-log4j.properties:
log4j.rootLogger=INFO, stdout, connectAppender, connectKafkaAppender
#log4j.rootLogger=INFO, stdout, connectAppender
log4j.logger.org.apache.zookeeper=ERROR
log4j.logger.org.reflections=ERROR
# The `%X{connector.context}` parameter in the layout includes connector-specific and task-specific information
# in the log message, where appropriate. This makes it easier to identify those log messages that apply to a
# specific connector. Simply add this parameter to the log layout configuration below to include the contextual information.
#
#connect.log.pattern=[%d] %p %m (%c:%L)%n
connect.log.pattern=[%d] %p %X{connector.context}%m (%c:%L)%n
# Send the logs to the console.
#
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=${connect.log.pattern}
# Send the logs to a file, rolling the file on the schedule given by the `DatePattern` option (here: hourly).
# The `File` option specifies the location of the log file (e.g. ${kafka.logs.dir}/connect.log); when the file
# rolls, it is closed and renamed in the same directory with a suffix derived from `DatePattern`.
#
#
log4j.appender.connectAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.connectAppender.DatePattern='.'yyyy-MM-dd-HH
log4j.appender.connectAppender.File=${kafka.logs.dir}/connect.log
log4j.appender.connectAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.connectAppender.layout.ConversionPattern=${connect.log.pattern}
# Send the logs to a Kafka topic.
# The Kafka records will have no key by default.
log4j.appender.connectKafkaAppender=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.connectKafkaAppender.brokerList=localhost:9092
log4j.appender.connectKafkaAppender.topic=_connect_log
log4j.appender.connectKafkaAppender.compressionType=none
log4j.appender.connectKafkaAppender.ignoreExceptions=true
log4j.appender.connectKafkaAppender.syncSend=false
log4j.appender.connectKafkaAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.connectKafkaAppender.layout.ConversionPattern=${connect.log.pattern}
My Kafka Connect Worker won’t start anymore. The topic _connect_log
exists in my Kafka Cluster, and it is empty.
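For reference, this is how the topic can be checked with the standard CLI tools (script names vary by distribution, e.g. kafka-topics.sh in the Apache tarball):

kafka-topics --bootstrap-server localhost:9092 --describe --topic _connect_log
kafka-console-consumer --bootstrap-server localhost:9092 --topic _connect_log --from-beginning --timeout-ms 10000

The describe call shows the topic exists, and the consumer times out without printing any records.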
The only Worker output I see in my console is the following:
[2021-05-04 10:25:00,641] INFO ProducerConfig values:
acks = 1
batch.size = 16384
bootstrap.servers = [localhost:9092]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-1
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = false
interceptor.classes = []
internal.auto.downgrade.txn.commit = false
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 127000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig:0)
Usually there is much more output on Worker startup. And if I remove the connectKafkaAppender-related configuration from my connect-log4j.properties, the Worker starts normally and serves requests.
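In other words, reverting the root logger to the stock setting (the line I commented out in the config above) gives me a working Worker again:

log4j.rootLogger=INFO, stdout, connectAppender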
Is there anything I am missing or doing wrong?
May the 4th be with you!