How to save Connect logs to a file

Hey all,
How do I send Connect logs to files? I want to be able to ship them to SumoLogic (a log management service).

I’m running Connect on AWS ECS (Docker) and tried adding the following environment variables:

 - CONNECT_LOG4J_ROOT_LOGLEVEL="INFO, FILE"
 - CONNECT_LOG4J_APPENDER_FILE="org.apache.log4j.RollingFileAppender"
 - CONNECT_LOG4J_APPENDER_FILE_FILE="/var/log/kafka/connect_log"
 - CONNECT_LOG4J_APPENDER_FILE_LAYOUT=org.apache.log4j.PatternLayout
 - CONNECT_LOG4J_APPENDER_FILE_LAYOUT_CONVERSIONPATTERN=[%d] %p %m (%c:%L)%n
 - CONNECT_LOG4J_APPENDER_FILE_MAXFILESIZE=10MB
 - CONNECT_LOG4J_APPENDER_FILE_MAXBACKUPINDEX=5
 - CONNECT_LOG4J_APPENDER_FILE_APPEND=true

but I’m getting the following error:

log4j:ERROR Could not find value for key log4j.appender.file"
log4j:ERROR Could not instantiate appender named "file"".
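
For reference, this is roughly the connect-log4j.properties I was hoping those variables would render to (my assumption about the env-var-to-property mapping, not something I’ve confirmed):

log4j.rootLogger=INFO, FILE

log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/var/log/kafka/connect_log
log4j.appender.FILE.MaxFileSize=10MB
log4j.appender.FILE.MaxBackupIndex=5
log4j.appender.FILE.Append=true
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=[%d] %p %m (%c:%L)%n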

Any ideas on how to fix it?
Thanks

hey @shlomi

did you also adapt your
connect-log4j.properties file?

which image are you using?

Best,
Michael

I’m using confluentinc/cp-kafka-connect-base. If possible, I’d prefer not to change the image and to control this with environment variables only.
Is that possible?

need to check that myself, will keep you posted

but I think one possible solution might be to mount the /etc/kafka directory contents into your container
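
untested, but with plain docker run I mean something like this (using the log4j file from your list above; on ECS the equivalent would be a volume/bind mount in the task definition):

docker run -d \
  -v $(pwd)/connect-log4j.properties:/etc/kafka/connect-log4j.properties \
  confluentinc/cp-kafka-connect-base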

short update:
after a quick scan of the source code of the Docker image you’re using, I think those appender env vars are never evaluated during container startup
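
as far as I can tell, the log4j template in the image only interpolates the root log level and CONNECT_LOG4J_LOGGERS; roughly paraphrased from the confluentinc/kafka-images repo (details may vary by version):

log4j.rootLogger={{ env["CONNECT_LOG4J_ROOT_LOGLEVEL"] | default('INFO') }}, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %X{connector.context}%m (%c:%L)%n

# ...plus log4j.logger.* entries rendered from CONNECT_LOG4J_LOGGERS...

so there is no hook for defining extra appenders: your CONNECT_LOG4J_APPENDER_* vars are dropped, while the "FILE" you put into the root log level still gets rendered, which would explain the "Could not instantiate appender" error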

but let me check something else

Thanks for the info. I tried to add

COPY connect-log4j.properties /etc/kafka/connect-log4j.properties

to my Dockerfile, but I’m getting this error:

[Errno 13] Permission denied: '/etc/kafka/connect-log4j.properties'
Command [/usr/local/bin/dub template /etc/confluent/docker/log4j.properties.template /etc/kafka/connect-log4j.properties] FAILED !

Any advice?

what do your Dockerfile and docker run command look like?

could you share them with us?

Dockerfile:

FROM confluentinc/cp-kafka-connect-base

# Connectors
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-s3:10.0.0
RUN confluent-hub install --no-prompt debezium/debezium-connector-postgresql:1.4.0

COPY set_env.sh /etc/set_env.sh
COPY connect-log4j.properties /etc/kafka/connect-log4j.properties

CMD ["/etc/set_env.sh"]

set_env.sh:

#!/bin/bash

LOCAL_IP=$(hostname -i)

# Create secrets file for connectors
cat > /tmp/postgres.properties <<EOL
db_name=${DB_NAME}
host_name=${HOST_NAME}
db_user=${DB_USER}
db_password=${DB_PASSWORD}
sslmode=${SSLMODE}
EOL

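# Advertise this container's IP to the Connect REST API, then hand off to the stock Confluent entrypoint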
CONNECT_REST_ADVERTISED_HOST_NAME=$LOCAL_IP /etc/confluent/docker/run

hey,

I’ve tested different settings in my local env; no working solution yet, but probably a good starting point.

I’ve extended the Dockerfile with the following. Note that this overrides a template under /etc/confluent/docker instead of the rendered file under /etc/kafka; the startup script re-renders /etc/kafka/connect-log4j.properties as a non-root user, which is most likely why your COPY ran into the permission error.

COPY common-logging-6.2.1.jar /usr/share/java/kafka/common-logging-6.2.1.jar
COPY confluent-log4j-extensions-6.2.1.jar /usr/share/java/kafka/confluent-log4j-extensions-6.2.1.jar
COPY kafka-connect.properties /etc/confluent/docker/kafka-connect.properties.template

get the jars from
https://mvnrepository.com/artifact/io.confluent/common-logging
https://mvnrepository.com/artifact/io.confluent/confluent-log4j-extensions

and then start the container with
-e CONNECT_LOG4J_ROOT_LOGLEVEL="DEBUG, connectAppender, stdout" \
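
I haven’t verified the template contents yet; my working assumption is a log4j config along these lines, modeled on the connect-log4j.properties that ships with the Confluent Platform packages (paths, sizes, and appender details are placeholders):

log4j.rootLogger={{ env["CONNECT_LOG4J_ROOT_LOGLEVEL"] | default('INFO, connectAppender, stdout') }}

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c:%L)%n

log4j.appender.connectAppender=org.apache.log4j.RollingFileAppender
log4j.appender.connectAppender.File=/var/log/kafka/connect.log
log4j.appender.connectAppender.MaxFileSize=10MB
log4j.appender.connectAppender.MaxBackupIndex=5
log4j.appender.connectAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.connectAppender.layout.ConversionPattern=[%d] %p %m (%c:%L)%n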

for reference, there is a similar topic on this, though not about Docker.

Did you get a chance to test it? And did you confirm what goes in the kafka-connect.properties file?

not yet
need to check tomorrow

keep you posted

no working solution so far

but one thing I’d like to drop here:
a while ago we collected Docker logs with Fluentd and pushed them to S3 via Fluent Bit
it might be worth a look:
https://www.fluentd.org/guides/recipes/docker-logging
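
with plain Docker that’s just the fluentd log driver; a minimal sketch (the address is wherever your Fluentd / Fluent Bit forward input listens, and the image name is a placeholder):

docker run -d \
  --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  --log-opt tag=kafka-connect \
  your-connect-image

on ECS the rough equivalent would be the task definition’s logConfiguration (e.g. with FireLens).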
