Continue processing on uncaught exceptions in Kafka streams

I have a Quarkus-based application that uses Kafka Streams for consuming and publishing messages. We are thinking of implementing a global error handler that would deal with the different types of exceptions the application can encounter (processing logic failures, dependent API or system failures, other business exceptions). These exceptions seem to fall under "streams uncaught exceptions", and the uncaught exception handler only provides a way to:

  1. replace the current thread (essentially retrying), or
  2. kill the streams client/application

on encountering such exceptions.
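For context, those two options correspond to the `StreamThreadExceptionResponse` values a handler can return (from KIP-671). A minimal sketch of the wiring, assuming an existing topology and properties (the `topology`, `props`, and `SomeRetryableException` names here are placeholders, not from the original post):

```java
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.errors.StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse;

// Sketch: attaching an uncaught exception handler to a KafkaStreams instance.
// Note: the handler only decides the fate of the failed stream thread --
// it cannot resume processing of the record that caused the failure.
KafkaStreams streams = new KafkaStreams(topology, props); // topology/props assumed to exist

streams.setUncaughtExceptionHandler(exception -> {
    if (exception instanceof SomeRetryableException) {        // hypothetical exception type
        return StreamThreadExceptionResponse.REPLACE_THREAD;  // option 1: retry on a new thread
    }
    return StreamThreadExceptionResponse.SHUTDOWN_CLIENT;     // option 2: kill this streams client
});
```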

But we want a mechanism to continue processing even in an exceptional scenario. The flow would have to be:

A message is consumed
→ Does some application logic
→ Gets some exception
→ Global handler handles it
→ Commit the offset
→ Read the next message.
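The desired per-record behaviour can be simulated in plain Java, without any Kafka types (here `applyLogic` and `handleError` are made-up stand-ins for the application logic and the hypothetical global handler):

```java
import java.util.ArrayList;
import java.util.List;

public class ContinueOnError {

    // Stand-in for the application's processing logic: fails on "BOOM" messages.
    static String applyLogic(String message) {
        if (message.contains("BOOM")) {
            throw new IllegalStateException("processing failed for: " + message);
        }
        return message.toUpperCase();
    }

    // Stand-in for the hypothetical global error handler: log and move on.
    static void handleError(String message, Exception e) {
        System.err.println("handled error for '" + message + "': " + e.getMessage());
    }

    // Consume -> apply logic -> (handle error) -> "commit" -> next message.
    public static List<String> processAll(List<String> messages) {
        List<String> results = new ArrayList<>();
        for (String message : messages) {
            try {
                results.add(applyLogic(message));
            } catch (Exception e) {
                handleError(message, e); // swallow and continue; the loop still advances
            }
        }
        return results;
    }

    public static void main(String[] args) {
        System.out.println(processAll(List.of("a", "BOOM", "b")));
    }
}
```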

We would not want the app to crash or retry indefinitely, as we are fine proceeding in this state. How can we achieve this behaviour or effectively tackle this scenario? There are no resources suggesting this can be done with the current Streams setup. (Having a try/catch at each potential exception point is discouraged, as it makes the code a mess and costs readability.)

The StreamsUncaughtExceptionHandler cannot be used for this. It's too late at that stage: the stream thread has already failed, and it can only be replaced or the application shut down.

There is no single exception handler that can do what you want. At the moment, your best option is to define a DeserializationExceptionHandler and a ProductionExceptionHandler, and to wrap your processing logic in try/catch blocks to detect and handle any error during processing. If you are using Scala, you may be able to use the Try class to solve this in a relatively clean way; I agree that all those try/catch blocks will not be nice in Java.
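In practice, the try/catch wrapping tends to end up inside the topology's processing steps, for example by turning a failed record into a marker value that gets filtered out so the stream keeps going. A sketch against the Kafka Streams DSL (the topic names, `transform` method, and logger are illustrative assumptions, not from the original post):

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;

StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> input = builder.stream("input-topic"); // topic name assumed

input.mapValues(value -> {
        try {
            return transform(value);              // hypothetical business logic
        } catch (Exception e) {
            log.warn("processing failed, skipping record", e); // logger assumed
            return null;                          // marker for "this record failed"
        }
    })
    .filter((key, value) -> value != null)        // drop failed records; stream continues
    .to("output-topic");                          // topic name assumed
```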

A more convenient way would be a global processing exception handler, and there is a Kafka Improvement Proposal for exactly that which is close to being accepted and implemented, so it's likely that Kafka Streams will get this in 3.8 or 4.0. Unfortunately, you'd have to wait until then.

See KIP-1033: Add Kafka Streams exception handler for exceptions occurring during processing - Apache Kafka - Apache Software Foundation
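As the KIP describes it, the new handler would be registered via a `processing.exception.handler` config and could return CONTINUE to skip the failed record rather than fail the stream thread. Roughly, per the KIP draft (interface and enum names are taken from the proposal and could still change before release):

```java
import java.util.Map;
import org.apache.kafka.streams.errors.ErrorHandlerContext;
import org.apache.kafka.streams.errors.ProcessingExceptionHandler;
import org.apache.kafka.streams.processor.api.Record;

// Sketch of a "log and continue" handler as proposed in KIP-1033.
public class ContinueProcessingExceptionHandler implements ProcessingExceptionHandler {

    @Override
    public ProcessingHandlerResponse handle(ErrorHandlerContext context,
                                            Record<?, ?> record,
                                            Exception exception) {
        System.err.println("skipping failed record: " + exception.getMessage());
        return ProcessingHandlerResponse.CONTINUE; // skip the record, keep the thread alive
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // no configuration needed for this sketch
    }
}

// Registered via (per the KIP):
// props.put("processing.exception.handler", ContinueProcessingExceptionHandler.class);
```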

Another option to look into would be Michelin’s kstreamplify library, which I believe has some facilities to make handling processing errors easier - although I don’t know the details.


Thanks @lbrutschy for your detailed response; this helps. It is in line with my understanding of the current Kafka Streams setup.

The proposal - KIP-1033: Add Kafka Streams exception handler for exceptions occurring during processing - Apache Kafka - Apache Software Foundation - to add an exception handler for tackling processing-time exceptions seems interesting and fits my requirement.

