Confluent Kafka as a logs caching server

Hi,

I want to use Confluent Kafka for part of my log-handling process. I have around 500 GB of log data to cache in Confluent Kafka, and I am planning to use the Community version. May I check with you: is there any limitation on caching data? And can I know what the differences are between the Community version and the commercial version?

Thanks
Luke

Hi Luke,

Regarding the differences, have a look here:
https://docs.confluent.io/platform/current/installation/license.html#

hth,
michael


Hi Michael,

Thanks for the info. It really helps me understand the differences.

If I run Confluent Kafka with the Community version, is there any limitation on storing data as a cache? Say, for example, I have 50 GB of data to cache every day. Is that achievable?

Thanks
Luke.

Hi @luke_devon

Basically, you're limited by the capacity of your underlying hardware.

To get a rough estimate for your cluster, you could also try the sizing calculator made by Confluent: https://eventsizer.io/
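Before reaching for the calculator, a back-of-envelope estimate is easy to do by hand. This is a minimal sketch assuming values not stated in the thread: 7 days of retention, a replication factor of 3, and roughly 25% overhead for indexes and headroom; adjust them to your actual setup.

```python
# Rough disk sizing for retaining log data in Kafka.
# Assumed values (not from the thread): 7-day retention,
# replication factor 3, ~25% overhead for indexes/headroom.
daily_gb = 50            # incoming log data per day
retention_days = 7       # how long each record is kept
replication_factor = 3   # copies of each partition across brokers
overhead = 1.25          # index files, segment padding, headroom

raw_gb = daily_gb * retention_days                     # data live at any time
total_gb = raw_gb * replication_factor * overhead      # total cluster disk

print(f"Raw retained data: {raw_gb} GB")
print(f"Approx. cluster disk needed: {total_gb} GB")
```

With these assumptions, 50 GB/day is well within what a modestly sized cluster can handle; the limit really is disk, network, and broker count rather than any Community-license cap.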

Some further reading about the number of partitions per cluster:

https://www.confluent.io/blog/apache-kafka-supports-200k-partitions-per-cluster/
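Retention is controlled per topic, so there is no special "cache mode"; you just cap how long (or how much) Kafka keeps. A sketch of creating such a topic with the stock Kafka CLI, assuming a hypothetical topic name `app-logs`, a broker at `localhost:9092`, and illustrative partition/retention values:

```shell
# Create a topic that retains data for 7 days, capped at ~400 GB per partition.
# Topic name, partition count, and limits are illustrative assumptions.
kafka-topics --create \
  --bootstrap-server localhost:9092 \
  --topic app-logs \
  --partitions 12 \
  --replication-factor 3 \
  --config retention.ms=604800000 \
  --config retention.bytes=400000000000
```

`retention.ms` and `retention.bytes` are standard Kafka topic configs; whichever limit is hit first triggers segment deletion. They work identically in the Community and commercial editions.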

Just to be clear on the naming of Kafka versus Confluent Platform :wink:
have a look at

hth,
michael