How to save Connect logs to file

Hey all,
How do I send Connect logs to files? I want to be able to ship them to SumoLogic (a log management service).

I’m running Connect on AWS ECS (Docker) and tried adding the following:

 - CONNECT_LOG4J_APPENDER_FILE="org.apache.log4j.RollingFileAppender"
 - CONNECT_LOG4J_APPENDER_FILE_FILE="/var/log/kafka/connect_log"
 - CONNECT_LOG4J_APPENDER_FILE_LAYOUT=org.apache.log4j.PatternLayout
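
As I understand it, those variables should render a standard log4j 1.x RollingFileAppender config along these lines (a sketch only; the MaxFileSize/MaxBackupIndex values and the conversion pattern are illustrative guesses on my part):

```properties
# Hypothetical log4j.properties fragment the env vars above are meant to produce
log4j.rootLogger=INFO, stdout, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/kafka/connect_log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=[%d] %p %m (%c)%n
```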

but getting the following error:

log4j:ERROR Could not find value for key log4j.appender.file
log4j:ERROR Could not instantiate appender named "file".

Any ideas on how to fix it?

hey @shlomi

did you also adapt your file?

which image are you using?


I’m using confluentinc/cp-kafka-connect-base. If possible, I’d like to avoid changing the image and control everything through environment variables alone.
Is that possible?

need to check myself, will keep you posted

but I think one possible solution might be to map the /etc/kafka directory contents to your container
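
for example, something like this compose-style sketch — you’d first copy the stock /etc/kafka contents out of the container and edit the log4j config locally (the local directory name is illustrative):

```yaml
# compose sketch: mount an edited copy of the config directory over /etc/kafka
services:
  connect:
    image: confluentinc/cp-kafka-connect-base
    volumes:
      - ./kafka-config:/etc/kafka   # "./kafka-config" is an illustrative path
```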

short update:
after a quick scan of the source code of the Docker image you’re using, I don’t think those env vars are evaluated during container startup

but let me check something else

Thanks for the info. I tried to add

COPY /etc/kafka/

to my Dockerfile, but I’m getting the error:

[Errno 13] Permission denied: '/etc/kafka/'
Command [/usr/local/bin/dub template /etc/confluent/docker/ /etc/kafka/] FAILED !

Any advice?
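
One common cause of that dub template failure: the cp images run as a non-root user (appuser), so files copied in as root can leave /etc/kafka unwritable at startup. Note also that COPY needs both a source and a destination. A sketch of what that could look like (the properties filename is an assumption on my part):

```dockerfile
# Sketch: copy a custom log4j config in with the right ownership;
# "connect-log4j.properties" is an assumed, illustrative filename
COPY --chown=appuser:appuser connect-log4j.properties /etc/kafka/
```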

what do your Dockerfile and docker run command look like?

could you share it with us?


FROM confluentinc/cp-kafka-connect-base

# Connectors
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-s3:10.0.0
RUN confluent-hub install --no-prompt debezium/debezium-connector-postgresql:1.4.0

COPY /etc/
COPY /etc/kafka/

CMD ["/etc/"]


LOCAL_IP=$(hostname -i)

# Create secrets file for connectors
cat > /tmp/ <<EOL



I’ve tested different settings in my local env; no working solution yet, but probably a good starting point.

I’ve extended the Dockerfile with the following

COPY common-logging-6.2.1.jar /usr/share/java/kafka/common-logging-6.2.1.jar
COPY confluent-log4j-extensions-6.2.1.jar /usr/share/java/kafka/confluent-log4j-extensions-6.2.1.jar
COPY /etc/confluent/docker/

get the jars from

and then start the container with
-e CONNECT_LOG4J_ROOT_LOGLEVEL="DEBUG, connectAppender, stdout" \
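
put together as a compose-style sketch (only the CONNECT_LOG4J_ROOT_LOGLEVEL value comes from this thread; the service name and build context are illustrative):

```yaml
services:
  connect:
    build: .   # the Dockerfile with the extra jars from above
    environment:
      CONNECT_LOG4J_ROOT_LOGLEVEL: "DEBUG, connectAppender, stdout"
```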

for reference, similar topic but not on docker:

Did you get a chance to test it? What is in the file?

not yet
need to check tomorrow

keep you posted

no working solution so far

but one thing I’d like to mention:
some time ago we collected Docker logs with Fluentd and pushed them to S3 with Fluent Bit
it might be worth having a look
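
roughly along these lines (a Fluent Bit sketch, not a tested config; the tail path, bucket, and region are all illustrative):

```ini
# Fluent Bit: tail the Connect log file and ship it to S3
[INPUT]
    Name  tail
    Path  /var/log/kafka/connect_log*

[OUTPUT]
    Name    s3
    Match   *
    bucket  my-connect-logs
    region  us-east-1
```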

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.