There’s a new Streaming Audio episode - check it out!
The data mesh architectural paradigm shift is all about moving analytical data away from a monolithic data warehouse or data lake into a distributed architecture—allowing data to be shared for analytical purposes in real time, right at the point of origin. The idea of data mesh was introduced by Zhamak Dehghani (Director of Emerging Technologies, Thoughtworks) in 2019. Here, she provides an introduction to data mesh and the fundamental problems that it’s trying to solve.
Zhamak explains how both the complexity of data and the ambition to use it have grown in today's industry. But what is data mesh? For over half a century, we've been trying to democratize data to deliver value and better analytical insights. With an ever-growing number of distributed domain data sets, diverse information arrives in increasing volumes and at high velocity. To remove friction and let data serve a wide range of operational and analytical use cases, the best approach is to mesh the data: connecting data sets in a peer-to-peer fashion and liberating them for analytics, machine learning, data-intensive applications, and more across the organization. Data mesh tackles the deficiencies of the traditional, centralized data lake and data warehouse architectures.
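The peer-to-peer sharing described above can be pictured as domains publishing events at the point of origin while other domains subscribe directly, with no central warehouse in between. Below is a minimal, in-memory sketch of that idea; the event bus stands in for a streaming platform such as Kafka, and all names (`orders.created`, the order payload) are hypothetical illustrations, not anything from the episode.

```python
from collections import defaultdict

class EventBus:
    """A toy in-memory event bus standing in for a streaming platform."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # A consuming domain registers interest in another domain's stream.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The owning domain publishes its data at the point of origin;
        # every subscribed peer receives it directly.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
analytics_view = []

# An analytics domain subscribes peer-to-peer to the orders domain's stream.
bus.subscribe("orders.created", analytics_view.append)

# The orders domain shares its data in real time, right where it originates.
bus.publish("orders.created", {"order_id": 1, "amount": 42.0})
```

The point of the sketch is the topology: each domain talks to the stream of the domain that owns the data, rather than to a monolithic, centrally loaded store.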
The data mesh paradigm is founded on four principles:
- Domain-oriented ownership
- Data as a product
- Self-serve data infrastructure as a platform
- Federated computational governance
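The first two principles can be made concrete by attaching product metadata to a data set: an accountable owning domain and a published schema that acts as its contract. The sketch below assumes hypothetical names (`DataProduct`, the `orders` product); it is an illustration of the idea, not an API from the course or the episode.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataProduct:
    """A sketch of 'data as a product' with domain-oriented ownership."""

    name: str           # discoverable product name
    owner_domain: str   # domain team accountable for the data's quality
    schema: dict        # field name -> type: the product's published contract
    description: str = ""

# Hypothetical example: the order-management domain owns and publishes
# its orders data as a product for the rest of the organization.
orders = DataProduct(
    name="orders",
    owner_domain="order-management",
    schema={"order_id": "int", "amount": "float"},
    description="All orders placed, published at the point of origin.",
)
```

Treating the schema and ownership as first-class, versioned metadata is what lets consumers across the organization discover, trust, and combine such products.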
A decentralized, technology-agnostic data architecture enables you to synthesize data and innovate. The starting point is embracing the idea that data can live anywhere. Source-aligned data should be served as a product that people across the organization can combine, explore, and turn into actionable insights. Zhamak and Tim also discuss the next steps needed to bring data mesh to life at the industry level.
To learn more about the topic, you can visit the all-new Confluent Developer course: Data Mesh 101. Confluent Developer is a single destination with resources to begin your Kafka journey.
- Zhamak Dehghani: How to Build the Data Mesh Foundation
- Data Mesh 101 Course
- Saxo Bank’s Best Practices for a Distributed Domain-Driven Architecture Founded on the Data Mesh
- Placing Apache Kafka at the Heart of a Data Revolution at Saxo Bank
- What the Heck is a Data Mesh?! By Chris Riccomini
- Watch the video version of this podcast
- Join the Confluent Community
- Learn Kafka on Confluent Developer
- Live demo: Event-Driven Microservices with Confluent
- Use PODCAST100 to get $100 of Confluent Cloud usage (details)