Confluent Kafka Architecture

Hi,

I have a project that requires Zookeeper, Kafka, and Connect. I will source from Oracle, store the data in Kafka, and then stream it to Elasticsearch. Will I need Schema Registry? I’m quite confused about the use of Schema Registry.

Hi @hasmine.roldan , what connectors will you use? Connectors usually leverage Schema Registry, so most likely the answer is yes. Actually, yours is a very good example of why Schema Registry is so useful: source and sink connectors can be seen as two different "applications". Schema Registry allows you to ingest data and then sink it to another system without having to manually code the structure of the data. It is auto-magically handled by the connectors and Schema Registry :slight_smile: More details here: Using Kafka Connect with Schema Registry | Confluent Documentation
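For illustration only, here is a minimal sketch (Python, standard library) of what registering a source connector through the Connect REST API could look like with the Avro converter pointed at Schema Registry. The JDBC source connector is assumed, and the connector name, hostnames, ports, credentials, and table/column names are all placeholders:

```python
# Hypothetical sketch: register a JDBC source connector that writes Avro
# records to Kafka, with their schemas stored in Schema Registry.
# Every host, port, credential, and name below is a placeholder.
import json
import urllib.request

source_config = {
    "name": "oracle-source",  # placeholder connector name
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1",
        "connection.user": "user",
        "connection.password": "password",
        "mode": "incrementing",
        "incrementing.column.name": "ID",
        "topic.prefix": "oracle-",
        # The Avro converter serializes records and registers their schemas
        # in Schema Registry automatically.
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081",
    },
}

# POST the connector definition to the Connect worker's REST endpoint.
req = urllib.request.Request(
    "http://connect:8083/connectors",
    data=json.dumps(source_config).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode())
```

A sink connector (e.g. Elasticsearch) configured with the same converter and Schema Registry URL can then read those topics without you ever writing the record structure by hand.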

Let us know how it goes!

Schema Registry is not required. You can use Apache Kafka’s (not “Confluent Kafka”) included JsonConverter class, for example, with schemas enabled (or disabled), and the Elasticsearch sink and Oracle source connectors should operate just fine.
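As a rough sketch of that alternative, a sink connector configuration using the plain JsonConverter might look like the following. The connector name, topic, and Elasticsearch URL are made up for illustration, and the exact properties depend on your connector version:

```python
# Minimal sketch of an Elasticsearch sink configured without Schema Registry.
# All names, topics, and URLs below are illustrative placeholders.
import json

sink_config = {
    "name": "elasticsearch-sink",
    "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "topics": "oracle-CUSTOMERS",
        "connection.url": "http://elasticsearch:9200",
        # JsonConverter ships with Apache Kafka, so no Schema Registry is needed.
        # With schemas.enable=true each message embeds its own schema;
        # with false the payload is plain JSON.
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter.schemas.enable": "false",
        # With no schema attached, tell the sink not to derive an index
        # mapping from the record schema.
        "schema.ignore": "true",
        "key.ignore": "true",
    },
}

# Submit to the Connect REST API the same way as any other connector,
# e.g. POST http://connect:8083/connectors with this JSON body.
print(json.dumps(sink_config, indent=2))
```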

However, if you want guaranteed knowledge of and type definitions for what exists in those Kafka topics, as well as less storage space occupied in those topics by using non-plaintext formats such as Avro, then you may want to consider using Schema Registry.
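To make the storage point concrete, here is a small, purely illustrative comparison of the same record encoded as JSON versus Avro binary (the record, schema, and use of the third-party fastavro package are my assumptions, not part of the thread). With Schema Registry, the Avro payload in Kafka carries only a small header (magic byte plus schema ID) while the schema itself lives in the registry:

```python
# Illustration only: compare the size of one record as JSON vs. Avro binary.
# Requires `pip install fastavro`; the record and schema are made up.
import io
import json

import fastavro

schema = fastavro.parse_schema({
    "type": "record",
    "name": "Customer",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "name", "type": "string"},
    ],
})
record = {"id": 42, "name": "Alice"}

json_bytes = json.dumps(record).encode("utf-8")

buf = io.BytesIO()
fastavro.schemaless_writer(buf, schema, record)  # field names are not stored
avro_bytes = buf.getvalue()

# Avro omits field names and punctuation, so the binary form is much smaller;
# with Schema Registry only a small header (magic byte + schema ID) is added.
print(f"JSON: {len(json_bytes)} bytes, Avro: {len(avro_bytes)} bytes")
```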

Also, Confluent Platform includes Schema Registry, so it is already available to you if you decide to use it.