CQRS and Event Sourcing are the buzzwords of the moment, along with the question of how they fit into real-time analytics. Microservices architecture segregates applications (sometimes called the mushrooming of the application landscape), which poses a challenge for data engineering teams. The architecture proposed here (in the GitHub blog) demystifies CQRS + Event Sourcing for real-time analytics and thereby bridges the gap between full-stack developers and data engineers.
NOTE: CQRS + Event Sourcing = Elegant DDD
Event Sourcing - “All changes to an application state are stored as a sequence of events.” Martin Fowler
Changes made to application state are tracked as events
Events are stored in an event store (any database)
Replaying (summing) all stored events always reproduces the current state
NOTE Event Sourcing is not part of CQRS
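The points above can be sketched in a few lines of plain Java. This is a minimal illustration (the record and method names are assumptions, not taken from the project): state is never stored directly; the current balance is derived by folding over the event log.

```java
import java.util.List;

public class EventSourcingSketch {
    // An event records a change that happened, never the resulting state.
    record WithdrawalEvent(String cardId, long amountCents) {}

    // Replaying (summing) the stored events always arrives at the current state.
    static long currentBalance(long openingCents, List<WithdrawalEvent> events) {
        long balance = openingCents;
        for (WithdrawalEvent e : events) {
            balance -= e.amountCents();
        }
        return balance;
    }

    public static void main(String[] args) {
        List<WithdrawalEvent> log = List.of(
                new WithdrawalEvent("card-1", 2_000),
                new WithdrawalEvent("card-1", 1_500));
        System.out.println(currentBalance(10_000, log)); // prints 6500
    }
}
```

Because the log is append-only, the same fold can be re-run at any time to rebuild state, which is what makes the pattern a natural fit for CDC and stream processing.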
Change Data Capture (CDC) provides an easy mechanism to implement this. Moreover, in the world of microservices, data services, and real-time analytics, CDC is a central part of a modern architecture. Check out my GitHub project, which demonstrates:
CQRS
Event Sourcing
CDC using Debezium (goodbye to expensive CDC products and complicated integrations)
Kafka Connect
Kafka
Real time streaming
Spring Cloud Stream
CQRS Bank Application - CQRS + Event Sourcing with events relayed via CDC
A bank application which demonstrates the CQRS design pattern. This application performs the following operations:
Money withdrawal using debit card
List all money withdrawals (mini bank statement)
This application uses the following two tables for the above operations:
debit_card
money_withdrawal
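Illustrative DDL for the two tables might look as follows. The column names and types here are assumptions for the sketch, not the project's actual schema:

```sql
-- debit_card: the command side, where withdrawal operations are recorded
CREATE TABLE debit_card (
  id         BIGINT AUTO_INCREMENT PRIMARY KEY,
  card_no    VARCHAR(19)   NOT NULL,
  amount     DECIMAL(10,2) NOT NULL,
  created_at TIMESTAMP     DEFAULT CURRENT_TIMESTAMP
);

-- money_withdrawal: the query side, populated from CDC events,
-- used to build the mini statement
CREATE TABLE money_withdrawal (
  id         BIGINT AUTO_INCREMENT PRIMARY KEY,
  card_no    VARCHAR(19)   NOT NULL,
  amount     DECIMAL(10,2) NOT NULL,
  created_at TIMESTAMP     DEFAULT CURRENT_TIMESTAMP
);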
A debit card withdrawal operation is stored in the debit_card table. Once the transaction is successfully committed to the debit_card table, the change is captured by Debezium and moved to Kafka via Kafka Connect (CDC). Once the message arrives on the Kafka topic, a Spring Cloud Stream StreamListener makes an entry in the money_withdrawal table. This table is used to create the mini statement (the query side).
NOTE: For the sake of simplicity the same DB is used, but as can be seen, the command to perform the debit operation is separated from the mini statement query.
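The CDC message that the listener consumes follows Debezium's change-event envelope (`before`, `after`, `source`, `op`, `ts_ms`). A simplified example for an insert into debit_card might look like this; the row fields are illustrative assumptions:

```json
{
  "payload": {
    "before": null,
    "after": { "id": 1, "card_no": "4242424242424242", "amount": 20.00 },
    "source": { "db": "bank", "table": "debit_card" },
    "op": "c",
    "ts_ms": 1589355606100
  }
}
```

The listener only needs the `after` image of the row to insert the corresponding record into money_withdrawal.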
The following picture shows the architecture of this application:
To demonstrate OLAP capabilities, this application writes the mini statement to the Kafka topic "ministatement". Druid ingests this topic and provides fast querying over it.
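Once Druid has ingested the topic into a datasource, the mini statement can be queried with Druid SQL. A sketch, assuming the datasource is named after the topic and the column names match the table above (`__time` is Druid's built-in timestamp column):

```sql
SELECT __time, card_no, amount
FROM "ministatement"
ORDER BY __time DESC
LIMIT 10
```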
Pre-requisites
MySQL
Apache Kafka
Kafka Connect
Debezium
Spring Cloud Stream
Zookeeper
Druid
Docker
NOTE: This application is completely dockerized.
Run Application
The complete reference architecture implementation is dockerized. Hence it takes just a few minutes to run this application locally or on any cloud provider of your choice.
Execute the following steps to run the application:
Build the bank app:
mvn clean install -DskipTests
Run the bank application's complete infrastructure:
docker-compose up
Instruct Kafka Connect to tail the transaction log of the MySQL DB and start sending CDC messages to Kafka:
curl -i -X POST -H"Accept:application/json" -H"Content-Type:application/json" http://localhost:8083/connectors/ -d @mysqlsource.json --verbose
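The mysqlsource.json file referenced above registers the Debezium MySQL connector. An illustrative configuration is shown below; the hostnames, credentials, and server name are assumptions and must match your docker-compose environment:

```json
{
  "name": "mysql-bank-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "mysql",
    "database.port": "3306",
    "database.user": "debezium",
    "database.password": "dbz",
    "database.server.id": "184054",
    "database.server.name": "bank",
    "table.include.list": "bank.debit_card",
    "database.history.kafka.bootstrap.servers": "kafka:9092",
    "database.history.kafka.topic": "schema-changes.bank"
  }
}
```

After registration, Kafka Connect begins streaming each committed change to debit_card as a CDC message on the corresponding Kafka topic.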