Kafka logs with Filebeat

Hey Guys,

I am trying to figure out the best way to ship logs from Confluent components into Elastic.
Filebeat is the shipper of choice.
My question then is, would all the components, such as ksqlDB, REST Proxy, Connect, Control Center and so forth, be best read into Elastic using Filebeat's Kafka module, or is there a better approach?
Basically, I am trying to avoid flooding Elastic with garbage logs, i.e. meaningless messages.

Best regards
Christian

@Oelsner

“One man’s trash is another man’s treasure” :wink:. I think that determining what information to sink into Elastic will be a subjective decision depending on your use case. Each component can filter its logging based on classes, text filters, and so on (see the sketch below). If you’re using Confluent components inside Docker, here is some information on logging configuration: Configure Docker Logging | Confluent Documentation
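
For example, the components in Confluent Platform ship with log4j 1.x properties files, so you can turn down noisy classes per component before the logs ever reach Filebeat. A minimal sketch (the logger names below are illustrative, borrowed from Kafka's default log4j.properties; adjust to whichever classes are noisy in your environment):

```properties
# log4j.properties (log4j 1.x syntax, as bundled with Confluent Platform)
log4j.rootLogger=INFO, stdout

# Quiet down chatty internals: only WARN and above get logged
log4j.logger.org.apache.kafka=WARN
log4j.logger.kafka.network.RequestChannel$=WARN
log4j.logger.org.apache.zookeeper=WARN
```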

Generally speaking, it is very common to implement a centralized logging system as you are describing, and Filebeat → Elastic is a common solution that may work for you.
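
If you go that route, a minimal filebeat.yml sketch might look like the following. A few assumptions to flag: this is Filebeat 7.x syntax, the paths under /var/log/... are placeholders for wherever your components actually write, and the drop_event regexp is just one illustration of filtering out low-value events:

```yaml
# Kafka broker logs via Filebeat's built-in Kafka module
filebeat.modules:
  - module: kafka
    log:
      enabled: true
      var.paths:
        - /var/log/kafka/server.log*

# Other Confluent components (ksqlDB, Connect, ...) as plain log inputs,
# with multiline so Java stack traces stay in one event
filebeat.inputs:
  - type: log
    paths:
      - /var/log/confluent/ksql/ksql.log        # placeholder path
      - /var/log/confluent/connect/connect.log  # placeholder path
    multiline:
      pattern: '^\[\d{4}-\d{2}-\d{2}'  # new event starts with a [timestamp]
      negate: true
      match: after

# Drop obvious noise before it reaches Elastic
processors:
  - drop_event:
      when:
        regexp:
          message: 'DEBUG|TRACE'

output.elasticsearch:
  hosts: ["localhost:9200"]
```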

I hope this helps

Hey @Oelsner,

There are two nice blog posts about monitoring Kafka with the ELK stack.

hth,
michael


Hey guys,
Thanks a lot for your input. I will start reading up on it right away.
And sorry about the delay in my reply.

Best regards
Oelsner