We have been working on a B2B eCommerce solution for a North American enterprise, designed to bring convenience to wholesalers and retailers in the mobile repair parts industry. The platform is built on the Magento 1.9 framework and backed by a MySQL database.
Through the years, our hard work has paid off as the web application has grown significantly, constantly evolving with new and improved features to enhance the user experience while managing complex B2B eCommerce operations efficiently. And today, we are proud to see the website process over 5000 orders daily.
However, this growth wasn’t achieved without challenges. The journey started with low order numbers, but as we continued to optimize and innovate, the platform attracted more and more traffic, leading to higher conversion rates. While this was an exciting development, it also brought new technical challenges that we have overcome with determination and commitment.
The technical challenge
Our system is built on the Magento framework. We developed a customized solution, per customer requirements, to manage incoming orders; it involves several third-party apps and various database calls. The solution worked fine apart from a somewhat long order processing time: an order should ideally take 2-3 seconds to process, but in our system it took 7-8 seconds, and when several orders were being executed simultaneously, the wait time rose to 10-12 seconds. Although this was slower than ideal, it wasn't a problem, since the system's core functionalities weren't affected in any way. But the system started encountering problems when the number of orders increased dramatically.
As the business continued to grow and the volume of daily orders increased, the demand on our system became too high, and we started to experience missed orders. This meant that while customers could place orders, add items to their carts, and successfully complete payments, the orders were not being accurately recorded in the backend of our system.
This was a major concern, as the ability to manage orders effectively is at the core of any eCommerce platform. A malfunction in this critical component can lead to difficulties in managing inventory and fulfilling customer orders accurately.
The root cause of the issue was:
- When several order requests were placed simultaneously, the concurrent transactions on frequently accessed database tables ended up in a deadlock
- Because of the deadlock, transactions were aborted midway and rolled back, so no order was actually recorded (illustrated in the sketch below)
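To make the failure mode concrete, here is a minimal sketch of how such a deadlock surfaces to application code. The table and column names are illustrative, not our actual schema; the point is that InnoDB picks one of the conflicting transactions as the victim and rolls it back, which is why the order never reaches the database.

```php
<?php
// Minimal illustration of a checkout transaction losing a deadlock.
// Table/column names and credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=store', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();

    // Checkout A locks the order row first, then asks for the stock row,
    // while a concurrent checkout B holds the stock row and waits for the
    // order row. InnoDB detects the cycle and aborts one of the two.
    $pdo->exec("UPDATE sales_order SET status = 'processing' WHERE entity_id = 1001");
    $pdo->exec("UPDATE stock_item SET qty = qty - 1 WHERE product_id = 42");

    $pdo->commit();
} catch (PDOException $e) {
    // MySQL reports error 1213: "Deadlock found when trying to get lock;
    // try restarting transaction". The victim transaction is rolled back,
    // so nothing from this checkout is recorded.
    $pdo->rollBack();
    error_log('Checkout transaction rolled back: ' . $e->getMessage());
}
```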
The increased load on the database came from six data-intensive operations executed at checkout:
1. Pre-order handling, which creates multiple orders at the same time in addition to the regular order the customer places
2. Generating shipment labels for FedEx, UPS, USPS, etc.
3. Running a fraud check through a third-party API
4. Updating the order in Elasticsearch
5. Syncing stock from one warehouse to another, since we manage multiple warehouses at different locations
6. Updating the order in Firebase
We implemented a few mitigations, such as database optimization (reducing the number of calls to the database) and moving some parts of the code from the checkout page to the cart page. These measures performed well to some extent, but the issue resurfaced as order volumes kept growing. Moving code to the shopping cart page also added unwanted load there; if the cart page is slow or not user-friendly, it directly impacts shopping cart conversion rates and leads to higher cart abandonment.
The proposed solution
To overcome the issues, the Tecstub engineering team came up with the following solution:
1.) Reduce the total number of database calls
2.) Reduce the number of operations performed at checkout and move them elsewhere
3.) Offload time-intensive processes to API calls
How was the solution implemented?
1.) Reducing the total number of database calls
We started by identifying the features where the number of database calls could be reduced. For every flow that updated the same record two or three times, we consolidated the work into a single database call by setting the extra column values in the first update that is executed.
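A hedged sketch of this consolidation, using Magento 1's write adapter, might look like the following. The column names (tracking_number, fraud_score, warehouse_code) and values are illustrative placeholders rather than our actual schema.

```php
<?php
// Sketch: collapse several UPDATEs on the same row into one call.
require_once 'app/Mage.php';
Mage::app();

$orderId        = 1001;        // placeholder order id
$trackingNumber = 'TRACK123';  // placeholder values
$fraudScore     = 12;
$warehouseCode  = 'WH-EAST';

$resource = Mage::getSingleton('core/resource');
$write    = $resource->getConnection('core_write');
$table    = $resource->getTableName('sales/order');

// Before: three separate round trips to the same row.
// $write->update($table, array('tracking_number' => $trackingNumber), array('entity_id = ?' => $orderId));
// $write->update($table, array('fraud_score'     => $fraudScore),     array('entity_id = ?' => $orderId));
// $write->update($table, array('warehouse_code'  => $warehouseCode),  array('entity_id = ?' => $orderId));

// After: one UPDATE that sets all the columns in the first call executed.
$write->update(
    $table,
    array(
        'tracking_number' => $trackingNumber,
        'fraud_score'     => $fraudScore,
        'warehouse_code'  => $warehouseCode,
    ),
    array('entity_id = ?' => $orderId)
);
```

Fewer statements per order also means fewer lock acquisitions on the hot tables, which shrinks the window in which deadlocks can occur.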
2.) Decreasing the total number of operations at the checkout
We moved all operations that weren't strictly needed during the checkout process into a cron job that runs every 4 minutes. On each run, the job scans newly placed orders and executes the deferred functionalities in the background, which significantly decreased the load during checkout.
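As an illustration only, a Magento 1 cron job for this kind of deferred work could be wired up roughly as follows; the module, class, flag column, and event names are hypothetical.

```php
<?php
// Hypothetical cron model; scheduled in the module's etc/config.xml:
// <crontab><jobs><deferred_order_tasks>
//   <schedule><cron_expr>*/4 * * * *</cron_expr></schedule>
//   <run><model>mymodule/cron::processDeferredOrders</model></run>
// </deferred_order_tasks></jobs></crontab>
class MyCompany_MyModule_Model_Cron
{
    public function processDeferredOrders()
    {
        // Pick up recent orders whose deferred tasks are still pending
        // (deferred_tasks_done is an illustrative flag column).
        $orders = Mage::getModel('sales/order')->getCollection()
            ->addFieldToFilter('status', 'processing')
            ->addFieldToFilter('deferred_tasks_done', 0)
            ->setPageSize(200);

        foreach ($orders as $order) {
            try {
                // The work that used to run inline at checkout (Elasticsearch
                // update, cross-warehouse stock sync, Firebase update) is
                // triggered here via observers instead.
                Mage::dispatchEvent('deferred_order_tasks', array('order' => $order));

                $order->setData('deferred_tasks_done', 1)->save();
            } catch (Exception $e) {
                // Log and continue; the next run retries this order.
                Mage::log($e->getMessage(), null, 'deferred_order_tasks.log');
            }
        }
    }
}
```

Batching the deferred work this way keeps checkout lean, and the cron run, rather than the customer, absorbs the latency of the heavier operations.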
3.) Calling APIs to execute time-intensive processes and reduce the load
Previously, the shipment labels were created during the checkout process itself, which took 3-4 seconds to execute, and only once this step completed did the next stage of order processing begin. To remove this dependency and save time, we created an API for generating the shipping labels and call it with a timeout set to 1 second, so we no longer wait for the response before continuing to process the order. The API generates the label and stores the tracking number in the database. If an error occurs during the API execution, it is written to the logs for later review, but it doesn't stop the order processing script, and the order is placed successfully in less time.
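A minimal sketch of such a call, assuming a hypothetical label endpoint and payload, could look like this; the short cURL timeout is what keeps checkout from blocking on label generation.

```php
<?php
// Sketch: fire the label request and move on after at most 1 second.
function requestShipmentLabel($orderId, $incrementId)
{
    $ch = curl_init('https://labels.example.com/api/generate-label'); // hypothetical endpoint
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode(array(
            'order_id'     => $orderId,
            'increment_id' => $incrementId,
        )),
        CURLOPT_HTTPHEADER     => array('Content-Type: application/json'),
        CURLOPT_RETURNTRANSFER => true,
        // Stop waiting after 1 second; the label service keeps working and
        // writes the tracking number to the database when it finishes.
        CURLOPT_TIMEOUT        => 1,
    ));

    $response = curl_exec($ch);
    if ($response === false) {
        // A timeout or transport error is logged for review, but it does not
        // stop the order processing script.
        error_log(sprintf('Label request for order %s failed: %s', $incrementId, curl_error($ch)));
    }
    curl_close($ch);
    // Checkout continues regardless of the outcome.
}

requestShipmentLabel(1001, '100000123'); // placeholder identifiers
```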
Impact
Our continuous efforts have improved the average order processing time: the system now takes about 2-3 seconds per order, compared to 7-8 seconds before the changes. Previously, the system struggled to handle just 800-1000 daily orders; now it seamlessly processes 5000 orders daily, roughly a fivefold increase.