Rue du Commerce - Marketplace

Merchants catalog platform overhaul


Created in 1999 by Patrick Jacquemin, Rue du Commerce has become one of France's most important e-commerce websites. Since 2007, retailers have been able to benefit from the website's audience and success through a common marketplace: they can put their catalog online with a platform developed by Rue du Commerce's teams. In 2012, to consolidate its leadership, Rue du Commerce wanted to redesign its platform to meet the expected performance and quality levels: holding 20 million offers, uploading all the catalogs within 2 hours and modifying an offer in a few seconds.


Rue du Commerce's system was based on a complex architecture that guaranteed data integrity but also had drawbacks that made it obsolete: full-catalog uploads and synchronous processing resulted in a total daily processing time of 13 hours…

af83 committed to developing a new solution that would upload millions of offers and thousands of catalogs within two hours, remove the synchronization constraints and, finally, enable better monitoring and near-real-time offer updates.

All of this without any performance degradation and with guaranteed data integrity, two points crucial to the company and its retailers.


Given the size and complexity of the existing platform, af83 took the following approach: splitting the mission into three independent sub-projects and integrating new components into the existing core.

Decreasing the global processing time

af83's main task was to cut the processing time to two hours. To do so, the platform had to process only the products that actually changed each day. af83 therefore generalized the differential processing method, first developed by Rue du Commerce for its key accounts, to all retailers. We also made the whole process easier for retailers by handling the versioning ourselves, so they did not have to change their catalog format. To achieve this, we implemented a cache system that detects the additions, modifications and deletions between two catalog versions.
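The differential step described above can be sketched as follows: the previous version of each catalog is kept in a cache and compared with the new upload, so only changed products reach the processing pipeline. This is a minimal illustration; the `Offer` shape and the `diff_catalog` name are assumptions, not RDC's actual code.

```python
# Hypothetical sketch of differential catalog processing: compare the
# cached previous version of a catalog with the new upload and emit
# only the additions, modifications and deletions.

from dataclasses import dataclass


@dataclass(frozen=True)
class Offer:
    sku: str
    price: float
    stock: int


def diff_catalog(cached: dict[str, Offer], new: dict[str, Offer]):
    """Split a new catalog upload into additions, modifications, deletions."""
    additions = [o for sku, o in new.items() if sku not in cached]
    modifications = [o for sku, o in new.items()
                     if sku in cached and cached[sku] != o]
    deletions = [sku for sku in cached if sku not in new]
    return additions, modifications, deletions
```

With this approach a merchant keeps uploading a full catalog in its usual format, while the platform only processes the delta.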

This new system was developed alongside the first version of a new unit processing service that lets retailers update their catalog product by product through a dedicated API. This operation was both secure and efficient, bringing the global processing time down to three hours.

Removing the synchronization constraints and managing retailer priority

For this sub-project, we had to rework the architecture core itself: initially, data was first written to a copy of the production database. To remove the synchronization constraint, this data had to be written directly to the production database. Yet an important part of the existing application had been built around this switch.

We reworked the components in charge of writing data so that they could write directly to the new database, process smaller tasks and handle several data sets in parallel. The processing sequencing was also completely reworked.

We also chose a queue system to manage catalog priority for each worker. Finally, we updated the API to assign priority ranks to merchants and improved it: it now launches a new process for each incoming request.
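The priority mechanism can be illustrated with a small sketch: jobs carry a merchant rank, workers always pop the highest-priority job first, and ties are served in arrival order. The rank values and the `CatalogQueue` name are illustrative assumptions, not the actual queue system chosen.

```python
# Minimal sketch of a priority queue dispatching catalog jobs to
# workers: a lower rank means higher priority, and jobs with the same
# rank are processed first-in, first-out.

import heapq
import itertools


class CatalogQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-break within a rank

    def push(self, merchant: str, rank: int):
        heapq.heappush(self._heap, (rank, next(self._counter), merchant))

    def pop(self) -> str:
        rank, _, merchant = heapq.heappop(self._heap)
        return merchant
```

In production such a queue would typically be backed by a message broker rather than an in-process heap, but the ordering logic is the same.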

Monitoring data processing and handling errors

We settled on a MongoDB database, which offers the strengths of NoSQL solutions (light footprint, easy data modeling, fast reads and writes) while still allowing easy data indexing and complex queries.

We also added scripts to every component that record its status and priority in the queue. Finally, we developed a web application that displays all the processing data in an administration dashboard through a RESTful API.
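The status records each component writes could look like the following sketch. Field names are assumptions for illustration, and a plain Python list stands in here for the MongoDB collection the dashboard would actually query.

```python
# Sketch of the kind of status document each component could record
# for the monitoring dashboard. A list stands in for the MongoDB
# collection; in production this would be an insert into MongoDB.

from datetime import datetime, timezone

processing_status = []  # stand-in for a MongoDB collection


def record_status(catalog_id: str, step: str, state: str, priority: int) -> dict:
    """Record one processing event for a catalog."""
    doc = {
        "catalog_id": catalog_id,
        "step": step,          # e.g. "validation", "diff", "write"
        "state": state,        # e.g. "queued", "running", "done", "failed"
        "priority": priority,
        "updated_at": datetime.now(timezone.utc).isoformat(),
    }
    processing_status.append(doc)
    return doc


def dashboard_view(catalog_id: str) -> list[dict]:
    """The kind of data a RESTful endpoint could return for one catalog."""
    return [d for d in processing_status if d["catalog_id"] == catalog_id]
```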

The second goal was to improve error handling, particularly around the format of merchant catalogs. Originally, catalog and data format validation happened in the first step of processing, and an error on a single product caused the whole catalog to fail. To solve this, we rewrote this phase and added brand-new features. Errors are now part of the reporting data available through the API, and the error messages are more relevant to merchants and the RDC functional team.
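The change in validation strategy can be sketched as follows: instead of aborting on the first bad product, each product is validated individually, errors are collected per line, and the valid products continue through the pipeline. The specific validation rules shown are illustrative assumptions.

```python
# Sketch of per-product catalog validation: a faulty product no longer
# fails the whole catalog; its error is collected for the reporting
# data while valid products keep flowing through the pipeline.

def validate_catalog(products: list[dict]):
    """Return (valid_products, error_reports) for one catalog upload."""
    valid, errors = [], []
    for i, p in enumerate(products):
        problems = []
        if not p.get("sku"):
            problems.append("missing sku")
        price = p.get("price")
        if not isinstance(price, (int, float)) or price < 0:
            problems.append("invalid price")
        if problems:
            # One report per faulty product, suitable for the merchant-
            # facing API rather than a single opaque "catalog rejected".
            errors.append({"line": i + 1, "sku": p.get("sku"), "errors": problems})
        else:
            valid.append(p)
    return valid, errors
```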


To achieve this success, around ten of our developers worked closely with the RDC Development, Support and System Administration teams. We proved that it is possible to integrate new components based on modern, powerful technologies into a complex existing system while adding new features: real-time processing at any time, reporting data available in real time and on demand through an API… all of this with no service interruption.

Our team built an efficient platform:

  • Two seconds to modify an offer, 1h30 to upload 700 catalogs
  • A scalable system capable of taking in millions of offers
  • Open APIs that merchants can link to their information system (for example, by connecting them to their ERP)
  • An increase of 800,000 offers on the marketplace in just six months