In this tutorial, we’ll look at the Event Carried State Transfer pattern — a microservice design pattern for achieving data consistency among microservices.
In a traditional monolithic architecture, which contains all the modules of an application, a single database holds the tables for every module. When we move from the monolith to a microservice architecture, we also split our big ball of a database into multiple data sources: each service manages its own data.
Having separate databases and data models brings advantages to a distributed architecture. However, with multiple data sources, the obvious challenge is how to keep the data consistent across all the microservices when one of them modifies it. The idea behind the Event Carried State Transfer pattern is: when a microservice inserts, modifies, or deletes data, it raises an event that carries the data along with it. Interested microservices consume the event and update their own copy of the data accordingly.
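The core of the pattern can be sketched in a few lines of plain Java. This is an in-memory stand-in (all class and method names here are illustrative, not from the repository): the event carries the full new state, so the consumer never has to call back to the producer for details.

```java
import java.util.*;
import java.util.function.Consumer;

// Minimal in-memory sketch of Event Carried State Transfer:
// the event carries the FULL new state, so consumers never have
// to call back to user-service for the details.
class EventCarriedStateTransfer {
    record UserUpdated(long id, String firstname, String lastname, String email) {}

    // Stand-in for a Kafka topic: every subscriber receives every event.
    static class Topic {
        private final List<Consumer<UserUpdated>> subscribers = new ArrayList<>();
        void subscribe(Consumer<UserUpdated> c) { subscribers.add(c); }
        void publish(UserUpdated e) { subscribers.forEach(s -> s.accept(e)); }
    }

    // Returns order-service's local copy of user data after the event flows.
    static Map<Long, UserUpdated> run() {
        Topic userEvents = new Topic();

        // order-service keeps its OWN copy of user data, fed by events.
        Map<Long, UserUpdated> orderServiceCopy = new HashMap<>();
        userEvents.subscribe(e -> orderServiceCopy.put(e.id(), e));

        // user-service updates its own database, then publishes the new state.
        userEvents.publish(new UserUpdated(1, "Neha", "Parate", "neha.parate@hotmail.com"));
        return orderServiceCopy;
    }

    public static void main(String[] args) {
        System.out.println(run().get(1L).email());
    }
}
```

In the real system the `Topic` is a Kafka topic and the subscriber is a Kafka consumer in order-service, but the data flow is the same.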
We can maintain data consistency across all the microservices using Apache/Confluent Kafka. This approach avoids many unnecessary network calls among microservices, improves their performance, and keeps them loosely coupled. For example, order-service does not have to be up and running when user details are updated via user-service. User-service raises an event, and order-service can consume it whenever it comes back up, so the information is never lost. In the synchronous request/response approach, by contrast, the microservices are tightly coupled: all the dependent microservices have to be up and running together, otherwise the system becomes unavailable.
CREATE TABLE `test`.`users` (
`id` INT NOT NULL,
`firstname` VARCHAR(45) NULL,
`lastname` VARCHAR(45) NULL,
`email` VARCHAR(45) NULL,
PRIMARY KEY (`id`));
INSERT INTO `test`.`users` (`id`, `firstname`, `lastname`, `email`) VALUES ('1', 'Neha', 'Parate', 'neha.parate@gmail.com');
INSERT INTO `test`.`users` (`id`, `firstname`, `lastname`, `email`) VALUES ('2', 'Aravind', 'Dekate', 'aravind.dekate@gmail.com');
INSERT INTO `test`.`users` (`id`, `firstname`, `lastname`, `email`) VALUES ('3', 'Mayur', 'Devghare', 'mayur.devghare@gmail.com');
INSERT INTO `test`.`users` (`id`, `firstname`, `lastname`, `email`) VALUES ('4', 'Suchita', 'Vinchurkar', 'suchita.vinchurkar@gmail.com');
CREATE TABLE `test`.`product` (
`id` INT NOT NULL,
`description` VARCHAR(500) NULL,
`price` INT NULL,
`qty_available` INT NULL,
PRIMARY KEY (`id`));
INSERT INTO `test`.`product` (`id`, `description`, `price`, `qty_available`) VALUES ('1', 'IPad', '300', '10');
INSERT INTO `test`.`product` (`id`, `description`, `price`, `qty_available`) VALUES ('2', 'IPhone', '650', '50');
INSERT INTO `test`.`product` (`id`, `description`, `price`, `qty_available`) VALUES ('3', 'Sony TV', '320', '100');
Record from the database:
db.getCollection('purchase_order').find({})
{
"_id" : ObjectId("60a536412306ab5bdd7a6b06"),
"user" : {
"id" : NumberLong(1),
"firstname" : "Parate",
"lastname" : "Parate",
"email" : "neha.parate@gmail.com"
},
"product" : {
"id" : NumberLong(1),
"description" : "ipad"
},
"price" : 300.0,
"_class" : "com.example.demo.model.PurchaseOrder"
}
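The `"_class" : "com.example.demo.model.PurchaseOrder"` field hints at the shape of the order-service model. A plausible sketch of it, inferred from the document above (the actual classes in the repository may differ), shows how the user and product data are denormalized into the order document rather than fetched over the network:

```java
// Plausible shape of the order-service document model, inferred from the
// MongoDB document above. The user and product details are an embedded,
// denormalized COPY, kept fresh by events rather than by remote calls.
class PurchaseOrderModel {
    record User(long id, String firstname, String lastname, String email) {}
    record Product(long id, String description) {}
    record PurchaseOrder(String id, User user, Product product, double price) {}

    static PurchaseOrder sample() {
        return new PurchaseOrder(
            "60a536412306ab5bdd7a6b06",
            new User(1, "Neha", "Parate", "neha.parate@gmail.com"),
            new Product(1, "ipad"),
            300.0);
    }

    public static void main(String[] args) {
        System.out.println(sample().user().email());
    }
}
```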
PUT Request
curl --location --request PUT 'http://localhost:8080/user-service/update' \
--header 'Content-Type: application/json' \
--data-raw '{
"id" : 1,
"firstname" : "Neha",
"lastname" : "Parate",
"email" : "neha.parate@hotmail.com"
}'
Now the MySQL DB has been updated, and a Kafka event has been raised and consumed by order-service to update the purchase_order collection. The purchase_order collection should now have the updated details.
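Conceptually, the consumer side of this flow is a listener that overwrites the embedded user copy in every matching purchase_order document. A sketch of that step (a `Map` stands in for the MongoDB collection; the method and class names are illustrative, not the repository's actual code, which would use a `@KafkaListener` method and a Mongo repository):

```java
import java.util.*;

// Sketch of order-service's reaction to a UserUpdated event:
// replace the embedded user copy in every matching order document.
class OrderServiceListenerSketch {
    record User(long id, String firstname, String lastname, String email) {}
    record PurchaseOrder(String id, User user, double price) {}

    // Stand-in for the MongoDB purchase_order collection.
    static final Map<String, PurchaseOrder> collection = new HashMap<>();

    // In the real service this would be a @KafkaListener method.
    static void onUserUpdated(User event) {
        collection.replaceAll((id, po) ->
            po.user().id() == event.id()
                ? new PurchaseOrder(po.id(), event, po.price())
                : po);
    }

    public static void main(String[] args) {
        collection.put("60a536412306ab5bdd7a6b06",
            new PurchaseOrder("60a536412306ab5bdd7a6b06",
                new User(1, "Neha", "Parate", "neha.parate@gmail.com"), 300.0));

        // The event carries the full new state, including the new email.
        onUserUpdated(new User(1, "Neha", "Parate", "neha.parate@hotmail.com"));

        System.out.println(collection.get("60a536412306ab5bdd7a6b06").user().email());
    }
}
```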
MySQL
mysql> select * from users;
+----+-----------+------------+------------------------------+
| id | firstname | lastname | email |
+----+-----------+------------+------------------------------+
| 1 | Parate | Parate | neha.parate@hotmail.com |
| 2 | Aravind | Dekate | aravind.dekate@gmail.com |
| 3 | Mayur | Devghare | mayur.devghare@gmail.com |
| 4 | Suchita | Vinchurkar | suchita.vinchurkar@gmail.com |
+----+-----------+------------+------------------------------+
4 rows in set (0.00 sec)
The event can be observed in the Confluent Kafka Control Center. Now fetch all orders from order-service:
http://localhost:8081/order-service/all
[
{
"id": "60a536412306ab5bdd7a6b06",
"user": {
"id": 1,
"firstname": "Parate",
"lastname": "Parate",
"email": "neha.parate@hotmail.com"
},
"product": {
"productId": 1,
"description": "ipad"
},
"price": 300.0
}
]
Source Code link — https://github.com/javaHelper/saga/tree/master/kafka-event-driven