A user can create a new feed through the Feed Service; the feed contains a location and a caption. The Hashtag Service should handle the hashtags from the feed’s caption. Each service may be written in a different language. Because I want to keep it simple, I decided to build the Feed Service using Lumen and the Hashtag Service using Golang. So the system will look like this:
First Design

Up to this step, everything looks okay. When a user creates a new feed, the Feed Service will handle it. Here is how the JSON data looks when a user creates a new feed.
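Something along these lines (the field names and values here are illustrative; the text only says the feed carries a location and a caption):

```json
{
  "location": "Yogyakarta, Indonesia",
  "caption": "Enjoying the weekend with friends #holiday #weekend"
}
```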
After the new feed is stored, the feed_id and feed_caption will be sent to the Hashtag Service so it can get all the hashtags from the feed’s caption and store them in the Hashtag DB. The Feed Service will send the feed data as JSON to the Hashtag Service through a REST API call. The payload will be like this.
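A sketch of that payload, assuming it carries just the two fields mentioned above (the values are illustrative):

```json
{
  "feed_id": 1,
  "feed_caption": "Enjoying the weekend with friends #holiday #weekend"
}
```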
After the Hashtag Service receives the request, it will filter all the hashtags from the caption. The extract-hashtag function is simple: just iterate over each character in the caption and collect the hashtags into an array. Each hashtag from the array will then be paired with the feed_id and stored in the Hashtag DB. So, here are the extracted hashtags.
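Since the Hashtag Service is written in Go, a minimal sketch of such an extraction function might look like this (the function name and the exact tokenizing rules are my assumptions, not the actual service code); running it on the example caption prints the extracted hashtags:

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// extractHashtags walks through the caption character by character,
// collecting every token that starts with '#' into a slice.
func extractHashtags(caption string) []string {
	var hashtags []string
	var current strings.Builder
	collecting := false

	for _, r := range caption {
		switch {
		case r == '#':
			// A new hashtag starts; flush the previous one if any.
			if collecting && current.Len() > 0 {
				hashtags = append(hashtags, current.String())
			}
			current.Reset()
			collecting = true
		case collecting && (unicode.IsLetter(r) || unicode.IsDigit(r) || r == '_'):
			current.WriteRune(r)
		default:
			// Any other character ends the current hashtag.
			if collecting && current.Len() > 0 {
				hashtags = append(hashtags, current.String())
			}
			current.Reset()
			collecting = false
		}
	}
	if collecting && current.Len() > 0 {
		hashtags = append(hashtags, current.String())
	}
	return hashtags
}

func main() {
	caption := "Enjoying the weekend with friends #holiday #weekend"
	fmt.Println(extractHashtags(caption)) // prints: [holiday weekend]
}
```

Each of these hashtags would then be stored in the Hashtag DB together with the feed_id.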
One advantage of using a microservices architecture is service reusability: one service can be used by multiple services. As we scale our system, our users can now create comments and events in our application, so we need to create a Comment Service and an Event Service. We can reuse the Hashtag Service to handle the hashtags from the Comment Service and the Event Service. From the problem statement above, we can redesign our system. Here is how it will look.
Second Design
Technically it still works. But what if there are a lot of incoming requests from the Feed Service, the Comment Service, and the Event Service to the Hashtag Service? How can a single Hashtag Service handle all of those requests? Remember, as much as possible we don’t want to miss even a single hashtag, so we need to make sure every hashtag gets processed.
The naive solution is to pair a Hashtag Service with each of the other services. If there are three different services that use the Hashtag Service, we need to create three instances of the Hashtag Service.

Naive Solution

Technically it will work. But this design breaks the reusability concept. From the design, we can see that we are not reusing the Hashtag Service but creating new ones. So what other solution can help us?
This is where a message broker like RabbitMQ comes in.

RabbitMQ Applied

So how will the communication work? Let me explain. After a feed, a comment, or an event is stored in the database, the service that handles it will send a payload containing its ID and the field that contains the hashtags. Here is the payload from the Feed Service.
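It looks much like the REST payload earlier, with the feed ID and the caption field (illustrative values):

```json
{
  "feed_id": 1,
  "feed_caption": "Enjoying the weekend with friends #holiday #weekend"
}
```

The Feed Service in this article is built with Lumen, but to keep the code examples in one language, here is a sketch in Go of how that publish step could look. The queue name hashtag_queue, the connection URL, and the github.com/streadway/amqp client are my assumptions, not details from the article:

```go
package main

import (
	"log"

	"github.com/streadway/amqp"
)

func main() {
	// Connect to a local RabbitMQ server (the URL is an assumption).
	conn, err := amqp.Dial("amqp://guest:guest@localhost:5672/")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ch, err := conn.Channel()
	if err != nil {
		log.Fatal(err)
	}
	defer ch.Close()

	// Declare a durable queue so the queue itself survives a broker restart.
	q, err := ch.QueueDeclare("hashtag_queue", true, false, false, false, nil)
	if err != nil {
		log.Fatal(err)
	}

	payload := `{"feed_id": 1, "feed_caption": "Enjoying the weekend with friends #holiday #weekend"}`

	// Publish the payload as a persistent message so RabbitMQ writes it to disk.
	err = ch.Publish("", q.Name, false, false, amqp.Publishing{
		ContentType:  "application/json",
		DeliveryMode: amqp.Persistent,
		Body:         []byte(payload),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```

The durable queue and the Persistent delivery mode matter for the durability discussion below.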
Messages published to the RabbitMQ server will stay there until a subscriber takes them. Even if a message is not fully processed, it will not be deleted from the queue right away; the RabbitMQ server will delete it only after receiving an acknowledgment, and we need to configure this beforehand. Once a subscriber is available, the RabbitMQ server will send the messages from its queue to that subscriber.
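On the Hashtag Service side, a minimal Go subscriber with manual acknowledgments might look like this (again, the queue name and the github.com/streadway/amqp client are assumptions; the point is that autoAck is off and the message is only acknowledged after the hashtags are handled):

```go
package main

import (
	"encoding/json"
	"log"
	"strings"

	"github.com/streadway/amqp"
)

type feedMessage struct {
	FeedID      int    `json:"feed_id"`
	FeedCaption string `json:"feed_caption"`
}

func main() {
	conn, err := amqp.Dial("amqp://guest:guest@localhost:5672/")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ch, err := conn.Channel()
	if err != nil {
		log.Fatal(err)
	}
	defer ch.Close()

	// Must match the publisher's declaration (durable = true).
	q, err := ch.QueueDeclare("hashtag_queue", true, false, false, false, nil)
	if err != nil {
		log.Fatal(err)
	}

	// autoAck is false, so RabbitMQ keeps each message until we ack it explicitly.
	msgs, err := ch.Consume(q.Name, "", false, false, false, false, nil)
	if err != nil {
		log.Fatal(err)
	}

	for d := range msgs {
		var msg feedMessage
		if err := json.Unmarshal(d.Body, &msg); err != nil {
			log.Println("invalid payload:", err)
			d.Ack(false) // acknowledge malformed messages so they don't pile up
			continue
		}

		// Extract the hashtags and pair them with the feed_id
		// (the actual database write is omitted in this sketch).
		for _, word := range strings.Fields(msg.FeedCaption) {
			if strings.HasPrefix(word, "#") {
				log.Printf("store hashtag %s for feed %d", word, msg.FeedID)
			}
		}

		// Acknowledge only after the work is done, so an unprocessed
		// message is never removed from the queue.
		d.Ack(false)
	}
}
```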
But what if the RabbitMQ server itself goes down? We would lose the data too. To avoid that, we declare the queue as durable and publish the messages as persistent, so RabbitMQ writes them to disk. Read the RabbitMQ documentation for more detailed information:

Marking messages as persistent doesn't fully guarantee that a message won't be lost. Although it tells RabbitMQ to save the message to disk, there is still a short time window when RabbitMQ has accepted a message and hasn't saved it yet. Also, RabbitMQ doesn't do fsync(2) for every message -- it may be just saved to cache and not really written to the disk. The persistence guarantees aren't strong, but it's more than enough for our simple task queue. If you need a stronger guarantee then you can use publisher confirms.

I think that was all I can share with you. I myself still need to explore more about messaging systems and microservices. If there is any critique or feedback, feel free to respond to this article or reach me at [email protected].
Oh ya, you can also find the full code of the Feed Service and the Hashtag Service.