I am trying to figure out the best way to ship logs from Confluent components into Elastic.
Filebeat is the shipper of choice.
My question then is, would all the components such as ksqlDB, REST Proxy, Connect, Control Center, and so forth be best read into Elastic using Filebeat's Kafka module, or is there a better approach?
Basically, I am trying to avoid flooding Elastic with garbage logs, i.e. meaningless messages.
“One man’s trash is another man’s treasure.” I think that determining what information to sink into Elastic will be a subjective decision depending on your use case. Each component can filter logging based on classes, text filters, etc. If you’re running Confluent components inside of Docker, here is some information on logging configuration: Configure Docker Logging | Confluent Documentation
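For example, the per-class filtering happens in each component's log4j configuration, before anything reaches Filebeat. A minimal sketch of what that tuning looks like (the logger names here are illustrative assumptions, not a complete list; check the log4j.properties that ships with each component for the loggers relevant to you):

```properties
# Keep the root logger at INFO so normal operational messages still flow.
log4j.rootLogger=INFO, stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n

# Quiet down known-chatty loggers at the source so the noise never
# reaches Filebeat or Elastic in the first place.
log4j.logger.org.reflections=ERROR
log4j.logger.org.apache.zookeeper=WARN
```

Each component (Connect, ksqlDB, REST Proxy, etc.) has its own log4j file, so this kind of tuning is applied per component.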
Generally speaking, it is very common to implement a centralized logging system as you are describing, and Filebeat → Elastic is a common solution that may work for you.
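If you'd rather filter at the shipper instead of (or in addition to) the source, Filebeat processors can drop events before they are sent to Elasticsearch. Here is a minimal filebeat.yml sketch; the paths, the regexp pattern, and the hosts value are all assumptions you would adapt to your own deployment:

```yaml
filebeat.inputs:
  # Tail the Confluent component log files; adjust the paths to wherever
  # your deployment actually writes logs (assumed location, not a default).
  - type: filestream
    id: confluent-logs
    paths:
      - /var/log/confluent/*/*.log

processors:
  # Drop DEBUG/TRACE lines at the shipper so they never reach Elastic.
  # The pattern is an illustrative assumption; tune it to your log format.
  - drop_event:
      when:
        regexp:
          message: '\b(DEBUG|TRACE)\b'

output.elasticsearch:
  hosts: ["localhost:9200"]   # assumed endpoint
```

The trade-off between the two approaches: filtering at the source saves disk and shipper work, while filtering in Filebeat leaves the component configs untouched and centralizes the drop rules in one place.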