I’m not against monolithic applications.
I strongly believe that 90% of applications could do well as a monolith. Maybe the percentage is even higher.
But sometimes you must migrate some functionality from a monolith to an independent service.
It might be due to regulatory reasons. Or the functionality has a different release and deployment cadence than the rest of the monolith, and it no longer makes sense to keep it there.
Whatever the reason, the biggest hurdle in extracting such a service is the risk of unintentionally breaking the rest of the system.
How does one gain the necessary courage to make the change?
There are two approaches that I’ve found to work quite well.
1 - Compare and Evaluate
Let’s say you want to extract critical functionality from your monolith into a new service.
This functionality involves both reading and writing to the database.
Here’s how compare and evaluate can help.
Dealing with Reads
It’s easier to handle reads because they are side-effect free: serving the same request through two paths doesn’t change any state.
Once the service is ready, you have two read paths:
Path A goes through the monolithic app
Path B goes through the new service
So how do you compare?
You go for dual reads. See the diagram below:
For every read request, you do the following:
Capture response from path A and path B
Emit the responses as standard events
Send the events to a comparison framework
Ramp up the traffic, compare responses, and make adjustments
Once the comparison looks clean, move all reads through the new service (path B)
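To make the flow concrete, here’s a minimal sketch of the dual-read fan-out. The `monolith_client`, `service_client`, and `event_producer` objects are hypothetical stand-ins for your own HTTP clients and message producer, and the `read-comparisons` topic name is made up for illustration:

```python
import json
import random
import uuid

SAMPLE_RATE = 0.05  # start small, then ramp up as confidence grows

def dual_read(request, monolith_client, service_client, event_producer):
    # Path A: the monolith remains the source of truth for the caller.
    response_a = monolith_client.get(request)

    # Path B: mirror a sample of reads to the new service, best effort only.
    if random.random() < SAMPLE_RATE:
        try:
            response_b = service_client.get(request)
        except Exception as exc:  # a failing path B must never break the caller
            response_b = {"error": str(exc)}

        # Emit both responses as one standard event for the comparison framework.
        event = {
            "comparison_id": str(uuid.uuid4()),
            "request": request,
            "path_a": response_a,
            "path_b": response_b,
        }
        event_producer.send("read-comparisons", json.dumps(event).encode())

    # Callers only ever see path A's answer until the diffs come back clean.
    return response_a
```

Dialing up SAMPLE_RATE is the ramp-up step; once the comparison framework reports clean diffs at full traffic, you flip the return value to path B.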
Dealing with Writes
With writes, things are different. You cannot send the same write down two paths into the same database: the writes would collide, and you could no longer tell which path produced the resulting state.
Instead, you use a shadow database.
For every write request, you do the following:
Write to the production database via the monolith (Path A)
Also, write the same thing via the new service to a shadow database (Path B)
Read the resulting data from both the production and the shadow database
Send the data to a comparison framework
Ramp up the traffic, compare the states, and make adjustments
Once things are clean, move all writes through the new service to the production database
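Here’s a sketch of the shadow-write flow under the same assumptions; `read_state` and the `entity_id` field are hypothetical stand-ins for however you read back and identify the affected record:

```python
import json
import uuid

def shadow_write(request, monolith_client, service_client, event_producer):
    # Path A: the production write goes through the monolith, as before.
    result = monolith_client.write(request)

    # Path B: replay the same write through the new service into the shadow DB.
    try:
        service_client.write(request)  # the service targets the shadow database only
    except Exception:
        pass  # shadow failures are diagnosed offline, never surfaced to callers

    # Read back both resulting states and hand them to the comparison framework.
    event = {
        "comparison_id": str(uuid.uuid4()),
        "entity_id": request["entity_id"],
        "production_state": monolith_client.read_state(request["entity_id"]),
        "shadow_state": service_client.read_state(request["entity_id"]),
    }
    event_producer.send("write-comparisons", json.dumps(event).encode())

    # The caller's result still comes from the production path.
    return result
```

In practice you would likely fire the shadow write and read-backs asynchronously, so that path B latency never leaks into the caller’s request.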
2 - CDC-based Strangler Fig
For one of our projects, we also used the Strangler Fig approach with Change Data Capture (CDC) and Kafka to make the migration smooth.
See the diagram below that shows the migration process for a specific functionality from the monolith.
Here’s how the whole thing worked for us:
Our monolithic application supported various features and stored all data in a MySQL database.
We extracted Feature A into a separate service (Service A) with a new MongoDB database.
Next, we built a CDC workflow with Debezium to move data from MySQL to MongoDB via Kafka. MongoDB was a specific requirement for the new service, but it could have been any other database. (A connector configuration sketch follows after these steps.)
We placed a proxy (Nginx) in front to route read requests for Feature A to Service A. All other requests, including writes to Feature A, went to the monolith.
Once satisfied with the read results from Service A, we migrated the writes as well.
In the end, Feature A was no longer served by the monolith, and all of its requests went to Service A.
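As a rough sketch, this is what registering such a connector against the Kafka Connect REST API can look like. Every hostname, credential, and table name below is an illustrative placeholder, and the property names follow recent Debezium releases (older 1.x versions use `database.server.name` and `database.history.*` instead of `topic.prefix` and `schema.history.internal.*`):

```python
import json
import requests

# Debezium MySQL source connector that streams Feature A's tables into Kafka.
connector = {
    "name": "monolith-feature-a-cdc",  # hypothetical connector name
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql.internal",
        "database.port": "3306",
        "database.user": "debezium",
        "database.password": "change-me",
        "database.server.id": "184054",           # unique ID among replication clients
        "topic.prefix": "monolith",               # topics become monolith.<db>.<table>
        "database.include.list": "appdb",
        "table.include.list": "appdb.feature_a",  # capture only Feature A's tables
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-changes.appdb",
    },
}

# Kafka Connect exposes connector management over plain HTTP.
resp = requests.post(
    "http://kafka-connect:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()
```

Debezium first snapshots the existing rows and then streams every subsequent change, which is what lets the new database catch up and stay in sync while the monolith keeps serving traffic.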
A couple of points to note about the process:
👉 Why Kafka?
Kafka provided some nice benefits such as:
Keeping the monolith and new service decoupled
Ordering guarantees for the messages
Great support with Debezium
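Ordering in Kafka holds per partition, and Debezium keys each change event by the row’s primary key, so all changes to one row arrive in order. Here is a sketch of what the sink side can look like, assuming the `confluent-kafka` and `pymongo` client libraries and Debezium’s default JSON envelope (with schemas enabled); the topic and collection names are the placeholders from the connector sketch above:

```python
import json
from confluent_kafka import Consumer
from pymongo import MongoClient

consumer = Consumer({
    "bootstrap.servers": "kafka:9092",
    "group.id": "feature-a-mongo-sink",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["monolith.appdb.feature_a"])

collection = MongoClient("mongodb://mongo:27017")["service_a"]["feature_a"]

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    if msg.value() is None:
        continue  # tombstone record that follows a delete; nothing to apply

    key = json.loads(msg.key())["payload"]       # primary key of the changed row
    change = json.loads(msg.value())["payload"]  # Debezium change envelope

    if change["op"] in ("c", "u", "r"):          # create, update, snapshot read
        # Upserting keeps the sink idempotent if events are ever replayed.
        collection.replace_one(key, change["after"], upsert=True)
    elif change["op"] == "d":
        collection.delete_one(key)
```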
👉 Why start with reads?
This was a critical system and we didn’t want to mess up the writes.
Moving the reads gave the team experience with the new architecture. It also allowed us to run comparisons between the monolith’s responses and the new service’s, similar to the compare-and-evaluate approach above.
This is part of the safety-first approach of the Strangler Fig pattern.
So - what approach have you used for extracting critical functionality from an existing application?
Eraser Professional Plan Free Trial (Affiliate)
As you all know, I use Eraser for drawing all the diagrams in this newsletter.
Eraser is a fantastic tool that you can use as an all-in-one markdown editor, collaborative canvas, and diagram-as-code builder.
And now you can get one month free on their Professional Plan, or a $12 discount if you go for the annual plan. The Professional Plan includes some amazing features like unlimited AI diagrams, unlimited files, PDF exports, and more.
Head over to Eraser and use the promo code “CODEX” at checkout to claim this offer.
Shoutout
Here are a few interesting articles I read this week:
The Upgrade Gamble: When to Fold
5 reasons to have someone leading a project
15 React Component Principles & Best Practices
Become a Great Engineering Leader in 12 Months
That’s it for today! ☀️
Enjoyed this issue of the newsletter?
Share with your friends and colleagues.
See you later with another value-packed edition — Saurabh