Where are Kafka Log Files and Kafka home?

Hello all,

sorry for the noob question, please bear with me, but I could not find answers in the documentation or existing topics.

I am currently setting up monitoring with the Elastic ELK stack, with the goal of building a dashboard for the Kafka logs. JMX metrics with Prometheus and Grafana have worked before, but that's only metrics.

I am using the following Docker images:

confluentinc/cp-server:7.6.0 (broker and controller)
confluentinc/cp-schema-registry:7.6.0 (schema-registry)
confluentinc/cp-server-connect:7.6.0
confluentinc/cp-enterprise-control-center:7.6.0
confluentinc/cp-ksqldb-server:7.6.0
confluentinc/cp-ksqldb-cli:7.6.0
confluentinc/ksqldb-examples:7.6.0
confluentinc/cp-kafka-rest:7.6.0

However, I am having trouble finding where the log files are located in the containers. Searching through the containers' directories, I was unable to find any .log files, nor the Kafka home directory. I need those paths to set up the Elastic Agent to collect the logs.
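For reference, this is roughly how I searched (the container name is from my setup):

# Look for any .log files and for the Kafka scripts inside the broker container:
docker exec broker find / -name "*.log" -not -path "/proc/*" 2>/dev/null
docker exec broker find / -name "kafka-server-start*" 2>/dev/null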

After some digging I found that the connect-log4j.properties might be relevant:


log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %X{connector.context}%m %X{ohja.location}%n


# loggers from CONNECT_LOG4J_LOGGERS env variable
log4j.logger.io.confluent.<connector-name>=DEBUG
log4j.logger.org.I0Itec.zkclient=ERROR
log4j.logger.org.reflections=ERROR
log4j.logger.com.sap.conn.jco=DEBUG
log4j.logger.org.init.ohja.kafka.connect=DEBUG

From my understanding this means that I can at least see the logs on stdout, and indeed when I run “docker logs broker” I see some Kafka-related logs. However, that does not help much with the Elasticsearch integration.
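As far as I understand, with Docker’s default json-file log driver those stdout logs also land on the host as a file; I could locate it with:

# Host-side path of the container’s stdout/stderr log (default json-file log driver):
docker inspect --format='{{.LogPath}}' broker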

Any advice would be greatly appreciated :slight_smile:

The confluentinc/cp-server image is based on Red Hat’s UBI image and Confluent binaries get installed via yum install, which will put scripts in /usr/bin, JARs under /usr/share/java/, etc. So, there isn’t really a single home directory where everything goes. This is the spot in the Dockerfile where the package install happens.

GC logging is written to the file system under /var/log/kafka/, but Log4j logging only goes to stdout, so you'd have to extend the image in order to write to a file. The documentation here and here covers how to do that with an example. Instead of only writing authorizer logs to the FS as in that example, you'd want to add a file-writing appender to the root logger.
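You can confirm this layout from a running broker container, for example:

# CLI scripts and JARs end up on the usual RPM paths:
docker exec broker ls /usr/bin | grep -i kafka
docker exec broker ls /usr/share/java
# GC logs are the only log files written to disk by default:
docker exec broker ls /var/log/kafka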


When running Kafka with ZooKeeper standalone on my PC, I can find the files shown in the attached image in the /log/ folder. However, I cannot find them anywhere when looking through the Confluent containers' files and directories.

For Connect I have already extended the connect-log4j.properties file according to this, so that it also logs to the file /var/log/kafka-connect/connect.log, enabled via the environment variable KAFKA_LOG4J_OPTS (see the sketch below). This works: I can see the connect.log file in the container and on the host machine.
For the broker and controller I intend to do the same.
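Roughly what I set on the Connect container (the config path is an assumption from my setup; adjust it to wherever your extended file lives):

# Point the JVM at the extended Log4j config via KAFKA_LOG4J_OPTS:
KAFKA_LOG4J_OPTS="-Dlog4j.configuration=file:/etc/kafka/connect-log4j.properties"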

However, I feel these don't fully match the files the Elastic Agent expects, as server.log, state-change.log, and kafka-*.log are still a mystery to me.

Right, it only goes to stdout
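For context, those file names come from the stock Apache Kafka config/log4j.properties, which defines per-component file appenders; the cp images don’t use that config, so those files never get created. On a standalone installation you’d see something like:

# Log files produced by Apache Kafka’s stock config/log4j.properties
# (assuming KAFKA_HOME points at the standalone installation):
ls "$KAFKA_HOME/logs"
# server.log  controller.log  state-change.log  kafka-request.log
# kafka-authorizer.log  log-cleaner.log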

What does your Dockerfile look like? Note that you have to overwrite /etc/confluent/docker/log4j.properties.template because that is used as a template here during container configuration.
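You can copy the stock template out of the image as a starting point:

# Dump the built-in template into the current directory for editing:
docker run --rm --entrypoint cat confluentinc/cp-server:7.6.0 \
  /etc/confluent/docker/log4j.properties.template > log4j.properties.template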

For example, start with the /etc/confluent/docker/log4j.properties.template in the cp-server container and edit it to write to the FS. Save it as log4j.properties.template in the same directory as your Dockerfile:

log4j.rootLogger={{ env["KAFKA_LOG4J_ROOT_LOGLEVEL"] | default('INFO') }}, stdout, file

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

log4j.appender.file=org.apache.log4j.DailyRollingFileAppender
log4j.appender.file.File=/var/log/confluent/kafka/kafka.log
log4j.appender.file.DatePattern='.'yyyy-MM-dd
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %c{1} [%p] %m%n

{% set loggers = {
  'kafka': 'INFO',
  'kafka.network.RequestChannel$': 'WARN',
  'kafka.producer.async.DefaultEventHandler': 'DEBUG',
  'kafka.request.logger': 'WARN',
  'kafka.controller': 'TRACE',
  'kafka.log.LogCleaner': 'INFO',
  'state.change.logger': 'TRACE',
  'kafka.authorizer.logger': 'WARN'
  } -%}


{% if env['KAFKA_LOG4J_LOGGERS'] %}
{% set loggers = parse_log4j_loggers(env['KAFKA_LOG4J_LOGGERS'], loggers) %}
{% endif %}

{% for logger,loglevel in loggers.items() %}
log4j.logger.{{logger}}={{loglevel}}
{% endfor %}

Then in your Dockerfile make sure that the directory where it’ll write to exists and is writeable, and copy your template into the image:

FROM confluentinc/cp-server:7.6.0

RUN echo "===> Creating log dir ..." \
     && mkdir -p /var/log/confluent/kafka \
     && chmod -R ag+w /var/log/confluent/kafka

COPY log4j.properties.template /etc/confluent/docker/log4j.properties.template

Use that image instead of confluentinc/cp-server:7.6.0 and you’ll see log4j logging show up under /var/log/confluent/kafka.
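For example (the tag name is arbitrary):

# Build the extended image and point your compose file / run command at it:
docker build -t cp-server-filelog:7.6.0 .

# Once the broker is up, check that log4j is writing to the file system:
docker exec broker ls /var/log/confluent/kafka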