Kafka Connect - Install additional log4j appenders

Hi all, and welcome!

Given
Kafka Connect in Docker (confluentinc/cp-kafka-connect) with multiple connectors installed: mongodb/kafka-connect-mongodb and debezium/debezium-connector-mongodb.

Task
I want to install an additional log4j appender that writes logs to Elasticsearch.

I’ve never worked with the Java stack, so this is a bit tough for me.

What I’ve done

  1. Found the appender on Maven: https://mvnrepository.com/artifact/org.appenders.log4j/log4j2-elasticsearch-hc/1.5.5
  2. Downloaded the jar file and built it into the Docker image at /usr/share/java/ (CONNECT_PLUGIN_PATH).
  3. Modified /etc/confluent/docker/log4j.properties.template.
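
Roughly, the Docker image change looked like this (a sketch of my Dockerfile; the jar URL is derived from the Maven coordinates in step 1, and the paths are the ones from my setup):

```dockerfile
# Base image from the "Given" section above
FROM confluentinc/cp-kafka-connect

# Put the appender jar on the plugin path (CONNECT_PLUGIN_PATH points at /usr/share/java)
ADD https://repo1.maven.org/maven2/org/appenders/log4j/log4j2-elasticsearch-hc/1.5.5/log4j2-elasticsearch-hc-1.5.5.jar /usr/share/java/

# Replace the log4j template with the modified one from step 3
COPY log4j.properties.template /etc/confluent/docker/log4j.properties.template
```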

The resulting log4j.properties:


log4j.rootLogger=INFO, stdout, elasticSearchAppender

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

# Sends the logs to ElasticSearch.
#
log4j.appender.elasticSearchAppender=org.appenders.log4j2.elasticsearch.hc

# with index 'log4j2-elasticsearch-hc' rolling hourly
log4j.appender.elasticSearchAppender.indexNameFormatter.type = RollingIndexName
log4j.appender.elasticSearchAppender.indexNameFormatter.indexName = kafka-connect
log4j.appender.elasticSearchAppender.indexNameFormatter.pattern = yyyy-MM-dd-HH

# with HC HTTP client
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.type = HCHttp
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.serverUris = http://local_elasticsearch:9200
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.connTimeout = 500
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.readTimeout = 10000
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.maxTotalConnections = 8
log4j.appender.elasticSearchAppender.batchDelivery.objectFactory.mappingType = _doc


log4j.appender.elasticSearchAppender.layout.type = JacksonJsonLayout

# default log levels
log4j.logger.org.reflections=ERROR
log4j.logger.org.apache.zookeeper=ERROR
log4j.logger.org.I0Itec.zkclient=ERROR

# Configure the Debezium logger.
log4j.logger.io.debezium.connector=DEBUG,elasticSearchAppender

Result:

  1. The connector still works.
  2. During startup, there are log messages showing that the Elasticsearch jar file is loaded.
  3. With log4j.logger.io.debezium.connector=DEBUG,elasticSearchAppender, the Debezium log level changes to DEBUG, but the elasticSearchAppender doesn’t seem to be applied: nothing shows up in Elasticsearch.

Could you please suggest anything else I should do?
As I said, I’ve never worked with Java, so I might have missed some hidden traps.

Thanks in advance.

Hi @YurkoUA

Is the Elasticsearch host reachable from within the Docker container?

Did you try to curl and ping Elasticsearch from inside the Docker container itself?

best,
michael

Hi @mmuehlbeyer

Yeah, it’s available.

Both containers are on the same network, and curl works.
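
For completeness, this is roughly how I checked it (the container name kafka-connect is from my setup; the Elasticsearch host is the one from my config):

```shell
# From the host: open a shell in the Connect container
docker exec -it kafka-connect bash

# Inside the container: Elasticsearch answers on the shared Docker network
curl -s http://local_elasticsearch:9200
```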

Just a small update.

Here are the runtime logs showing the appender being loaded.
Looks optimistic.

And here’s the appender repo on GitHub: rfoltyns/log4j2-elasticsearch (module log4j2-elasticsearch-hc).
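
One thing I noticed while reading it: the README configures this appender through Log4j2 (log4j2.xml), not through the log4j 1.x properties format I used above. A sketch adapted from the project’s docs (element names may not be exact; the index name and server URI are mine):

```xml
<Appenders>
  <Elasticsearch name="elasticSearchAppender">
    <RollingIndexName indexName="kafka-connect" pattern="yyyy-MM-dd-HH"/>
    <AsyncBatchDelivery>
      <HCHttp serverUris="http://local_elasticsearch:9200"/>
    </AsyncBatchDelivery>
    <JacksonJsonLayout/>
  </Elasticsearch>
</Appenders>
```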

[2022-04-20 18:24:50,141] INFO Loading plugin from: /usr/share/java/log4j2-elasticsearch-hc-1.5.5.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader)
[2022-04-20 18:24:50,156] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/share/java/log4j2-elasticsearch-hc-1.5.5.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader)

ok thanks for the information

Did you try switching log4j.rootLogger to DEBUG, just to get more insight?
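
e.g. something like this in the properties file:

```properties
# bump everything to DEBUG to see whether the appender is even invoked
log4j.rootLogger=DEBUG, stdout, elasticSearchAppender
```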

Shameless self-plug :wink:

There was a similar challenge sending logs to a Kafka topic, which may help.
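
For reference, the log4j 1.x appender that ships with Kafka can be wired up like this (broker address and topic name are just examples):

```properties
# KafkaLog4jAppender from the kafka-log4j-appender artifact
log4j.appender.kafkaAppender=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.kafkaAppender.brokerList=localhost:9092
log4j.appender.kafkaAppender.topic=connect-logs
log4j.appender.kafkaAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.kafkaAppender.layout.ConversionPattern=[%d] %p %m (%c)%n

log4j.rootLogger=INFO, stdout, kafkaAppender
```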

best,
michael

Yesterday I got much closer to the issue.
It started showing additional symptoms.

Once I fix it, I’ll write up the instructions here for other devs like me :wink:


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.