Kafka consumers reaching max memory when publishing messages with a key

I am producing Kafka messages using an id as the key. This id can be similar across millions of records, but it is not the same for all of them.
I see my consumers reaching their maximum memory and restarting. This causes the Kafka cluster to rebalance and the consumer lag to increase a lot.
Any help on how to solve this would be much appreciated.
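One knob worth checking, assuming this is a plain Kafka Consumer: the consumer buffers fetched data in memory, and its footprint is roughly bounded by `fetch.max.bytes` and `max.partition.fetch.bytes` times the number of assigned partitions. The sketch below (class name and values are illustrative, not a definitive fix) shows how these configs could be tightened; whether this helps depends on whether the memory actually comes from fetch buffers or from your own processing of the records.

```java
import java.util.Properties;

public class ConsumerMemoryConfig {
    // Illustrative helper: consumer properties that cap per-poll memory use.
    static Properties boundedConsumerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder address
        props.setProperty("group.id", "my-consumer-group");       // placeholder group

        // Cap bytes fetched per partition per request (broker default is 1 MiB).
        props.setProperty("max.partition.fetch.bytes", String.valueOf(512 * 1024));
        // Cap total bytes returned by a single fetch request (default is 50 MiB).
        props.setProperty("fetch.max.bytes", String.valueOf(16 * 1024 * 1024));
        // Cap records handed to the application per poll() call (default is 500).
        props.setProperty("max.poll.records", "100");
        return props;
    }

    public static void main(String[] args) {
        Properties props = boundedConsumerProps();
        // Rough upper bound on buffered data for one in-flight fetch, with a
        // consumer assigned N partitions:
        //   min(fetch.max.bytes, N * max.partition.fetch.bytes)
        int partitions = 24; // example assignment size
        long perPartition = Long.parseLong(props.getProperty("max.partition.fetch.bytes"));
        long fetchMax = Long.parseLong(props.getProperty("fetch.max.bytes"));
        long bound = Math.min(fetchMax, partitions * perPartition);
        System.out.println("approx fetch buffer bound: " + bound + " bytes");
    }
}
```

Note also that keying by an id means all records with the same key land on the same partition, so a skewed id distribution can overload the consumers assigned to the hot partitions.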

Is this a question about Kafka Streams, the stream processing library, or about a plain Kafka Consumer application? For the latter, maybe the consumer configs in the clients section of the docs apply?