How we manage concurrency issues in Graffi-Tee

Avinash Jaiswal
Jul 5, 2020

From the night the idea struck me to build an application where our COVID-batch graduates could celebrate Graffiti Day virtually, I was keen to develop it. Writing out your feelings for a friend on a plain white T-shirt has always been a great way to express love and bonding, and recreating that feeling on a digital platform was a huge challenge in itself. You can go to the website linked below or watch this video, which explains how it works.

I immediately asked my small group of buddies to come and work with me on this very rudimentary idea of mine, and to my luck, they agreed in the nick of time. I even presented them with a prototype of what I was thinking and how the architecture was going to look. Everything at this stage was very much "in the mind", and making it real was the task we set out on.

So this was going to be our stack for the application: a Node.js backend, with Express for route management and JsonWebToken for authentication and authorization, and a MongoDB database for all the data operations. This was paired with an Angular frontend and an open-source image editor that we found on GitHub called Toast UI Image Editor.
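For context, here is a minimal sketch of what an Express route protected with JsonWebToken can look like; the route names, secret handling and payload shape are assumptions for illustration, not our actual code.

```javascript
// Minimal Express + JsonWebToken sketch (illustrative, not Graffi-Tee's real routes).
const express = require('express');
const jwt = require('jsonwebtoken');

const app = express();
app.use(express.json());
const SECRET = process.env.JWT_SECRET || 'dev-secret'; // assumption: secret comes from the environment

// Issue a token at login.
app.post('/api/login', (req, res) => {
  const token = jwt.sign({ userId: req.body.userId }, SECRET, { expiresIn: '1d' });
  res.json({ token });
});

// Middleware that guards the protected routes.
function auth(req, res, next) {
  try {
    const token = (req.headers.authorization || '').replace('Bearer ', '');
    req.user = jwt.verify(token, SECRET);
    next();
  } catch {
    res.status(401).json({ error: 'Unauthorized' });
  }
}

// Hypothetical protected route returning a placeholder response.
app.get('/api/tshirts/:id', auth, (req, res) => {
  res.json({ id: req.params.id, owner: req.user.userId });
});

app.listen(3000);
```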

Application Architecture

While working through the various scenarios, we came to realize that managing concurrent data updates in our database was the major problem our application would face over time. Consequently, I decided to implement a mutual exclusion construct we had studied in the Operating Systems course in college: the semaphore, or mutex.

A semaphore is a programming construct that guards your critical section. The critical section is the part of your code that must not run for more than one request at a time; in our case, that is the code that updates the database.

So for the current request, the semaphore locks the database update, and once the update is done, it releases the lock for the next request. To implement this, I found a very nice promise-based semaphore library on npm called await-semaphore and straight away dived into applying it at all the required places.
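Here is a minimal sketch of that pattern using await-semaphore's promise-based API; the writeDesignToDb helper stands in for the real MongoDB update and is hypothetical.

```javascript
const { Mutex } = require('await-semaphore');

// Hypothetical stand-in for the real MongoDB write (e.g. a Mongoose update).
async function writeDesignToDb(shirtId, design) {
  console.log(`writing design for ${shirtId}`, design);
}

const tshirtMutex = new Mutex();

// Only one request at a time may run the critical section (the database update).
async function updateTshirt(shirtId, design) {
  const release = await tshirtMutex.acquire(); // wait for the lock
  try {
    await writeDesignToDb(shirtId, design);    // critical section
  } finally {
    release();                                 // let the next request proceed
  }
}
```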

So far so good. We thought we had handled the concurrency problem our application would face and that it was ready for the real world. So we hosted it on an EC2 instance and, the next day, asked our immediate group of friends, around 20 of them, to test it out and tell us about their experience.

After an hour of usage, it came to our attention that some of their edits were getting lost, and that our application was still not handling concurrency correctly. I quickly jumped to my backend, tried all the possible debugging steps to check when the requests were being executed, and found that they were getting processed as expected. The semaphore was working just fine, and there was no concurrency issue for which our backend could be held responsible.

I immediately asked for a meet with my friends and explained the situation. They too were not able to figure out the real cause of the problem. I ended the meet and went for a short walk, and it was not long before it hit me what the real cause could be. I rushed back to my phone and called my friend Rogin to discuss it. He agreed that this could indeed be the reason, and I knew what followed would be an all-nighter of coding and the birth of an application that is highly performant and manages concurrency on the frontend as well as the backend.

State Diagram for Requests to Backend

As you can see in the diagram above, the problem was this: suppose two users, User 1 and User 2, both open the T-shirt of a User 3 for editing, starting from the same state of the T-shirt, say State 1. After User 1's edits the T-shirt moves to, say, State 2, and after User 2's edits it moves to, say, State 3. Since both users started from the same state, neither update accounts for the changes the other has already made. So, in effect, the user who saves first loses their update when the next user saves, because the second save was built on the same stale starting state and simply overwrites the first.
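The sketch below illustrates this lost-update race with a hypothetical in-memory "database"; the shirt object and saveEdit function are purely for illustration.

```javascript
// Both users read State 1, edit their local copy, then write the whole object back:
// the second write silently discards the first user's edit.
let shirt = { id: 'user3-shirt', edits: ['original'] }; // State 1 in the "database"

async function saveEdit(userName, message) {
  const copy = { ...shirt, edits: [...shirt.edits] };   // read the current state
  copy.edits.push(`${userName}: ${message}`);           // apply this user's edit locally
  await new Promise((r) => setTimeout(r, 10));          // simulate editing / network delay
  shirt = copy;                                         // write the whole document back
}

(async () => {
  await Promise.all([
    saveEdit('User1', 'Congrats!'),  // starts from State 1
    saveEdit('User2', 'Miss you!'),  // also starts from State 1
  ]);
  console.log(shirt.edits);          // only one of the two edits survives
})();
```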

The only solution to this problem was to stop users from opening the edit page of a T-shirt while someone else is already on that page editing it. To implement this mutual exclusion on the frontend, I decided to use sockets. To follow the next paragraph, you need a little understanding of how sockets and the concept of rooms work; I used Socket.IO for this.

So when a user enters the edit page for a T-shirt, they join a room whose name is a universally unique identifier string for that T-shirt. On the backend, we track the number of users in the room and maintain a data structure that records the count and the user IDs of the users currently in a T-shirt's room. Only the first user to enter the room is allowed to edit; the others are asked to wait. We also enforce a 180-second time limit per edit so that the other users are not left waiting indefinitely. When a user leaves the edit page, their socket disconnects, they are removed from the list, and the next user in the list gets the chance to make edits. The same list-management code runs whenever a socket disconnects for any other reason; a sketch of this flow follows below.
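Here is a minimal sketch of that room logic with Socket.IO, assuming a standalone Socket.IO server; the event names (join-shirt, your-turn, please-wait) and the in-memory queue are illustrative, not our exact production code.

```javascript
const { Server } = require('socket.io');

const io = new Server(3001, { cors: { origin: '*' } });

const queues = new Map();           // shirtId -> array of socket ids (first entry is the editor)
const EDIT_LIMIT_MS = 180 * 1000;   // 180-second editing window

function grantTurn(shirtId) {
  const queue = queues.get(shirtId);
  if (!queue || queue.length === 0) return;
  const editorId = queue[0];
  io.to(editorId).emit('your-turn', { limitMs: EDIT_LIMIT_MS });
  // Kick the editor out after 180 s so others are not left waiting forever.
  setTimeout(() => {
    if (queues.get(shirtId)?.[0] === editorId) {
      io.sockets.sockets.get(editorId)?.disconnect(true);
    }
  }, EDIT_LIMIT_MS);
}

io.on('connection', (socket) => {
  socket.on('join-shirt', (shirtId) => {
    socket.join(shirtId);                       // the room name is the shirt's UUID
    socket.data.shirtId = shirtId;
    const queue = queues.get(shirtId) || [];
    queue.push(socket.id);
    queues.set(shirtId, queue);
    if (queue.length === 1) grantTurn(shirtId); // first in the room may edit
    else socket.emit('please-wait', { position: queue.length - 1 });
  });

  socket.on('disconnect', () => {
    const queue = queues.get(socket.data.shirtId);
    if (!queue) return;
    const wasEditing = queue[0] === socket.id;
    queues.set(socket.data.shirtId, queue.filter((id) => id !== socket.id));
    if (wasEditing) grantTurn(socket.data.shirtId); // hand over to the next in line
  });
});
```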

When we finally published this update, the application faced a few server-related issues, and I needed to write a new config file for the Nginx server so that it could handle the socket requests as well. Once the entire system was integrated, we had a wonderful application that helps you send your love to your loved ones while keeping every edit persisted and guaranteed.
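For reference, here is a minimal sketch of the kind of Nginx server block that forwards both normal HTTP traffic and the Socket.IO WebSocket upgrade to the backend; the ports, paths and server name are assumptions, not our exact config.

```nginx
# Illustrative server block: proxy HTTP and WebSocket (Socket.IO) traffic to Node.
server {
    listen 80;
    server_name example.com;                       # assumption: replace with the real domain

    location / {
        proxy_pass http://localhost:3000;          # Node/Express backend (assumed port)
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location /socket.io/ {
        proxy_pass http://localhost:3001;          # Socket.IO server (assumed port)
        proxy_http_version 1.1;                    # required for the WebSocket upgrade
        proxy_set_header Upgrade $http_upgrade;    # pass the upgrade headers through
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```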

Though it is a side project, and though we built the core in under a week, Graffi-Tee is very close to our hearts. We have also published it on Product Hunt and created a YouTube video explaining how it works. Do tell us in the comments how you like the app and what features you would want us to implement, so that we can grow and extend it to your needs. Happy to help!!
