Hi, I have a project at hand and I'm learning Kafka for it. An expert suggested I "use MongoDB as the primary database and sync via Kafka to Elasticsearch with bulk upsert". Before I could ask more questions he disappeared from the chat. It's been over a month now and I need to know exactly how I would do this.
It's a dating app/website that's bound to receive tons of users. He mentioned I shouldn't use Elasticsearch as the primary database because its data gets corrupted or lost fairly often, so MongoDB should be the primary store, and whenever the Elasticsearch data gets corrupted or something goes wrong I should somehow create a fresh index (I'm not clear on this part) and reload all the data from MongoDB through Kafka. One question I have right now: how can I set everything up so all of this happens on autopilot? And how would the system detect that Elasticsearch got corrupted in the first place?
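For context, here is my rough understanding of the "bulk upsert" part. This is just a sketch in Python assuming the official elasticsearch-py client's `helpers.bulk()` action format and a Kafka topic that carries user documents as JSON; the index name, field names, and helper function are my own guesses, not something the expert specified:

```python
import json

def to_bulk_action(message_value, index_name="users"):
    """Turn one Kafka message (a JSON user document from MongoDB)
    into an Elasticsearch bulk action for helpers.bulk().

    Using the "update" op with doc_as_upsert=True means the document
    is inserted if the _id doesn't exist yet and patched if it does,
    so replaying the whole topic after rebuilding an index should be
    safe (idempotent)."""
    doc = json.loads(message_value)
    return {
        "_op_type": "update",      # "update" (not "index") enables upsert semantics
        "_index": index_name,
        "_id": str(doc["_id"]),    # reuse the MongoDB _id so replays overwrite, not duplicate
        "doc": {k: v for k, v in doc.items() if k != "_id"},
        "doc_as_upsert": True,
    }

# Example message as it might arrive on the topic (fields are made up):
msg = '{"_id": "u123", "age_min": 25, "age_max": 35, "city": "Berlin"}'
action = to_bulk_action(msg)
```

In a real consumer loop I imagine you'd batch these actions and hand them to `elasticsearch.helpers.bulk()`, but I'm not sure if that's what he meant.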
Elasticsearch would hold each user's preferences (what a user is looking for in a potential partner), etc. Since users will be doing a lot of searching for other users, we are using Elasticsearch for that part.
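To make the search side concrete, here is the kind of query I'm picturing. It's a sketch of an Elasticsearch bool/filter query built as a plain dict; the field names (`city`, `age`, `age_min`, `age_max`) are placeholders I invented for illustration, not an actual mapping:

```python
def preference_query(seeker):
    """Build an Elasticsearch query body that finds users matching
    the seeker's stated preferences. Uses non-scoring "filter"
    clauses since these are hard constraints, not relevance hints."""
    return {
        "query": {
            "bool": {
                "filter": [
                    # candidate must live in the seeker's city
                    {"term": {"city": seeker["city"]}},
                    # candidate's age must fall inside the seeker's preferred range
                    {"range": {"age": {
                        "gte": seeker["age_min"],
                        "lte": seeker["age_max"],
                    }}},
                ]
            }
        }
    }

q = preference_query({"city": "Berlin", "age_min": 25, "age_max": 35})
```

I'd expect this body to be passed to the client's search call against the users index, but again I'm guessing at the shape of the data.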