I've moved to a project that actively uses CQRS + event sourcing. At first glance it's implemented in accordance with all those books and blogs, but eventually I realized what exactly bothers me about the implementation.
Here is the CQRS architecture:
I originally took this picture from here.
As we can see in the picture, the read side receives events from the queue and passes them one by one into different sets of projections (denormalizers); the resulting ViewModels are then saved via an AddOrUpdate method into, say, a DB. So, as I understand the picture, a denormalizer can rely only on the event itself plus data from the read-side DB. For instance (see the sketch after this list):
- An Account view is already stored in the DB.
- An EmailChanged event arrives.
- We read the Account view from the DB.
- We apply the email change to it.
- We save the Account view back into the DB.
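A minimal sketch of that kind of denormalizer in TypeScript; the types, the ViewStore interface and the onEmailChanged handler are hypothetical names of mine, not part of any concrete framework:

```typescript
// Hypothetical event, view and store shapes, just to illustrate the idea.
interface EmailChanged {
  type: "EmailChanged";
  accountId: string;
  newEmail: string;
}

interface AccountView {
  accountId: string;
  email: string;
  name: string;
}

interface ViewStore<T> {
  get(id: string): Promise<T | undefined>;
  addOrUpdate(id: string, view: T): Promise<void>;
}

// The denormalizer relies only on the event payload plus the view already in the read DB.
async function onEmailChanged(
  event: EmailChanged,
  accounts: ViewStore<AccountView>
): Promise<void> {
  const view = await accounts.get(event.accountId);  // read the current Account view
  if (!view) return;                                 // nothing to update yet
  view.email = event.newEmail;                       // apply the change carried by the event
  await accounts.addOrUpdate(event.accountId, view); // save it back
}
```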
Another case (counting the number of some items, say orders; sketched after the list):
- An OrderCreated event arrives.
- We read the ViewModel that represents the number of previously arrived orders.
- We increment it and save it back.
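The counter case looks similar; again the shapes below are my assumptions, and the view is updated purely from the read DB plus the fact that the event arrived:

```typescript
// Hypothetical counter view; only the read DB and the event itself are used.
interface OrderCreated {
  type: "OrderCreated";
  orderId: string;
}

interface OrderCountView {
  numberOfOrders: number;
}

interface CounterStore {
  get(id: string): Promise<OrderCountView | undefined>;
  addOrUpdate(id: string, view: OrderCountView): Promise<void>;
}

async function onOrderCreated(
  _event: OrderCreated,
  counters: CounterStore
): Promise<void> {
  const view = (await counters.get("orders")) ?? { numberOfOrders: 0 }; // previous count or zero
  view.numberOfOrders += 1;                                             // increment
  await counters.addOrUpdate("orders", view);                           // save back
}
```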
What we have in our project: we use all those events only as a notification that something has changed in the domain model. Hence, what we do (see the sketch below this list):
- We take the domain repository and read all the necessary aggregates, which gives us their most recent state.
- We build the ViewModel object from scratch.
- We save the newly created object into the DB.
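In code it looks roughly like this; a sketch under my own assumptions about the repository and store interfaces, where the event payload is effectively ignored and only used as a trigger:

```typescript
// Hypothetical write-side repository and types; none of these names come from our actual code.
interface EmailChanged {
  type: "EmailChanged";
  accountId: string;
}

interface AccountAggregate {
  id: string;
  email: string;
  name: string;
}

interface DomainRepository {
  getAccount(id: string): Promise<AccountAggregate>; // returns the latest aggregate state
}

interface AccountView {
  accountId: string;
  email: string;
  name: string;
}

interface ViewStore<T> {
  addOrUpdate(id: string, view: T): Promise<void>;
}

// The denormalizer ignores the event payload and rebuilds the whole view
// from the current state of the write side.
async function onAccountChanged(
  event: EmailChanged,
  domain: DomainRepository,
  accounts: ViewStore<AccountView>
): Promise<void> {
  const aggregate = await domain.getAccount(event.accountId); // most recent state
  const view: AccountView = {                                 // built from scratch
    accountId: aggregate.id,
    email: aggregate.email,
    name: aggregate.name,
  };
  await accounts.addOrUpdate(view.accountId, view);           // overwrite whatever was there
}
```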
The approach we use in our project looks a bit strange to me, though I can't see all of its drawbacks. If we need to rebuild our read side, we add an "active" denormalizer, and the next time it receives a particular event it recreates the ViewModel.
If we use the approach from the books, I will have to keep separate utility logic somewhere outside my system for rebuilding. What we need for this (sketched after the list):
- Drop the read side
- Read all the events from the event store from the beginning
- Pass them through the projections
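A rough sketch of such a rebuild utility, assuming hypothetical EventStore, Projection and ReadDb interfaces:

```typescript
// Hypothetical interfaces for the replay utility; they stand in for whatever
// event store, projection and read-DB APIs the real system exposes.
interface DomainEvent {
  type: string;
}

interface EventStore {
  readAllFromBeginning(): AsyncIterable<DomainEvent>;
}

interface Projection {
  handle(event: DomainEvent): Promise<void>;
}

interface ReadDb {
  drop(): Promise<void>;
}

async function rebuildReadSide(
  readDb: ReadDb,
  store: EventStore,
  projections: Projection[]
): Promise<void> {
  await readDb.drop();                                       // 1. drop the read side
  for await (const event of store.readAllFromBeginning()) {  // 2. read every event from the start
    for (const projection of projections) {
      await projection.handle(event);                        // 3. pass it through each projection
    }
  }
}
```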
So my question is:
What is the right approach here?