Confluent Kafka as a log-caching server


I want to use Confluent Kafka for part of my log-handling process. I have around 500 GB of log data to cache in Confluent Kafka, and I am planning to use the Community version. May I check with you: are there any limitations on caching data? And what are the differences between the Community version and the commercial version?


Hi Luke,

Regarding the differences, have a look here:



Hi Michael,

Thanks for the info. Really helps to understand the differences.

If I run Confluent Kafka with the Community version, are there any limitations on storing data as a cache? Say, for example, I have 50 GB of data to cache every day. Is that achievable?


Hi @luke_devon

Basically, you're limited by the capacity of your underlying hardware.
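To illustrate: how long and how much data a topic keeps is an ordinary topic-level configuration (`retention.ms`, `retention.bytes`), not something gated by the license, so the Community version handles this the same way. A minimal sketch, where the topic name, partition count, and retention values are illustrative assumptions, not recommendations:

```shell
# Create a topic sized for caching logs. Retention is a topic config,
# so the only real limit is disk capacity on the brokers.
# "app-logs", 12 partitions, and 7-day retention are example values.
kafka-topics --bootstrap-server localhost:9092 \
  --create --topic app-logs \
  --partitions 12 \
  --replication-factor 3 \
  --config retention.ms=604800000 \
  --config retention.bytes=-1
```

Here `retention.ms=604800000` keeps data for 7 days and `retention.bytes=-1` places no per-partition size cap, so old segments are deleted purely by age.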

To get a rough estimate for your cluster, you could also try the sizing calculator provided by Confluent.
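As a quick back-of-envelope check for the 50 GB/day scenario above (the retention period, replication factor, and overhead factor here are illustrative assumptions; plug in your own values or use the calculator):

```python
# Rough disk sizing for a Kafka cluster caching 50 GB of logs per day.
daily_ingest_gb = 50
retention_days = 7          # assumed: how long the "cache" keeps data
replication_factor = 3      # assumed: typical production replication
overhead = 1.1              # assumed: ~10% headroom for indexes/segments

required_gb = daily_ingest_gb * retention_days * replication_factor * overhead
print(f"Cluster-wide disk needed: {required_gb:.0f} GB")
```

So 50 GB/day is very much achievable; with these example numbers you would want roughly 1.2 TB of total broker disk, spread across the cluster.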

Some further reading on the number of partitions per cluster:

Just to be clear on the naming of Kafka vs. Confluent Platform :wink:
have a look at