I’m new here, so if this is in the wrong place please let me know.
I’m working on a data mesh strategy - nothing too complicated. Very briefly: my team will put together a Java library defining all our events, update the schema registry via the Maven plugin, and help domain owners produce to our cluster in Confluent Cloud.
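For context, here’s roughly what I mean by the shared event library (everything below is made up for illustration; in reality the events are Avro-backed classes and the Maven plugin registers/compatibility-checks their schemas at build time, so domain owners just depend on the library and produce):

```java
// Purely illustrative: a hypothetical event from the kind of shared library I mean.
// The name and fields are placeholders, not our real model.
public record OrderPlaced(
        String orderId,          // the aggregate/entity the event belongs to
        String customerId,
        long occurredAtEpochMs   // event time, set by the producing service
) {}
```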
The hardest question I’ve run into is how we ‘guarantee’ that all our read models are consistent.
The concerns are two-fold:
1. Are asynchronous events ‘good enough’ compared to a database, where we can guarantee consistency? Especially since we probably can’t enforce the single-writer principle, reasoning about the order of events could get very tricky (see the sketch after this list for what I mean).
2. In the real world, how do we manage multiple consumers, all of which drive decision-making, but only some of which are real-time? In other words, different read models end up in the hands of business colleagues, and we hear things like: “at least in a data warehouse, the data is out of date but 100% consistent.”
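To make the ordering worry in point 1 concrete, here is a minimal sketch (broker address, topic name, payloads and the JSON-as-String shortcut are placeholders rather than our real setup, and Confluent Cloud auth config is omitted):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventsProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "pkc-xxxxx.region.provider.confluent.cloud:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // SASL/SSL credentials for Confluent Cloud omitted for brevity.

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying every event by the entity id gives per-key ordering,
            // because all events for "order-123" land on the same partition.
            String orderId = "order-123";
            producer.send(new ProducerRecord<>("orders.events", orderId, "{\"type\":\"OrderPlaced\"}"));
            producer.send(new ProducerRecord<>("orders.events", orderId, "{\"type\":\"OrderShipped\"}"));
            // But if a *different* service also writes events for order-123
            // (i.e. no single writer), Kafka only preserves the order each
            // producer sends; there is no global order across writers.
        }
    }
}
```

So per-key ordering within one writer is easy enough; it’s the cross-writer ordering and the “when is every read model caught up?” question that worry me.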
For those of you who are further down the road than I am: did you come up against questions/problems like these?
I appreciate your time.