How We Created a Multi-site Setup for a Vehicle Marketplace: Challenges, Solutions and Insights

Codica Team
Published in Codica Journal
14 min read · Feb 8, 2019

The article was originally published on Codica Blog.

For the past several years, demand for web development has been growing, and this trend is not going to stop. It includes building complex applications, such as marketplaces, that should cover all users’ needs and stand out from other web solutions.

Today we want to tell you how we created the first multi-vendor vehicle marketplace in Africa, which enables its users to buy and sell new and second-hand cars, motorbikes, and trucks. The customer’s idea was to create a global marketplace solution that would cover numerous regions of Africa and South America and connect buyers with both car dealers and private sellers.

When we started working on this project, a number of websites already existed, and our task was to scale the platform. After weighing all the pros and cons, our team decided to create a multi-site setup. At the moment, the platform includes 89 marketplace websites functioning separately, and the number is still growing. Let’s look at the main reasons for choosing such a setup.

What is a multi-site setup?

Often confused with a multi-domain setup, a multi-site setup is a collection of independent sites.

Below is a comparison picture of multi-domain and multi-site solutions.

Let’s take a look at different aspects of both setups:

Despite the main advantage of a multi-domain setup (easy content management), our team chose a multi-site solution, as it provides indisputable advantages in terms of stability and maintenance.

Challenges

Building such a complex solution is not easy, and of course we experienced challenges. We will show you what they were and how our seasoned developers solved them.

Tech stack

Configuration

Challenge: Proper organization and maintenance of the setup

We have implemented a complex configuration system that allows us to easily edit the logic and settings of each website. The configuration is stored in a single Git repository, which contains all the websites’ files along with the templates and the CSS files of the colour scheme.

To begin with, each website has its own file with the settings of supported currencies and other data. Here is an example:

site_name: NewSite
site_url: Newsite.com
continent: Continent
country_code: AO
currencies: { 'AOA': 'AOA', 'USD': 'USD' }
emails:
  contact_email: info@newsite.com
  from_email: info@newsite.com
socials:
  facebook: https://www.facebook.com/newsite/
  google_plus: https://plus.google.com/u/0/+Newsite
  instagram: https://www.instagram.com/newsite/
  linkedin: http://www.linkedin.com/company/new-site
  twitter: https://www.twitter.com/newsite
  youtube: https://www.youtube.com/channel/newsite
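As a sketch of how such a settings file might be consumed, here the example above is inlined as a string and parsed into a plain hash. This loading code is illustrative only, not the project’s actual implementation:

```ruby
require 'yaml'

# Illustrative only: parse one site's settings (inlined here instead of
# being read from the per-site .yml file) into a plain Ruby hash.
SITE_YAML = <<~YML
  site_name: NewSite
  country_code: AO
  currencies: { 'AOA': 'AOA', 'USD': 'USD' }
YML

settings = YAML.safe_load(SITE_YAML)
settings['site_name']       # => "NewSite"
settings['currencies'].keys # => ["AOA", "USD"]
```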

Additionally, all the sites can be managed at once. There is a main HTML/CSS template common to all the websites. For example, if we change the markup in this main template, the change propagates to all the sites.

Below is an example of the main template’s CSS file. If we change the variables in this file, the colour scheme of all the websites is updated automatically. This option, of course, simplifies configuration adjustments.

$color-1: #0052b4; 
$color-1-hover: darken($color-1, 20%);
$color-2: #cb032b;

Localization

Languages

When running the platform in multiple countries, the language issue can be challenging. This covers creating translations as well as updating and managing them.

Moreover, if several languages are used in each country, localization becomes even more challenging, as you should satisfy user preferences in all of these languages.

Our goal was to consider all the relevant languages for a particular region and implement them for the users’ convenience. To achieve this, our developers created a fully-fledged localization system that supports any number of countries and multiple languages within those countries.

Challenge 1. Provide fast and easy localization

Each platform website supports several languages, including English and local ones. To implement them, we used Ruby gems, which solve many problems in a short period of time and simplify the development process.

We adopted the i18n gem, which ships with Rails by default, to provide an internationalization solution for the platform. To date, we have added support for 15 languages. The i18n gem also lets us easily set a default language for any website.

We have implemented four main languages (English, Spanish, Portuguese, and French) and additional local ones: Swahili, Kinyarwanda, Malagasy, Amharic, Yoruba, Hausa, Shona, Arabic, Wolof, Oromo, Tswana, and Russian.

Moreover, this gem lets us localize data such as dates and times, validation notifications, and other important information.
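Conceptually, the lookup that i18n performs can be pictured as a nested dictionary keyed by locale, with a fallback to the default language when a translation is missing. A stdlib-only sketch of that idea (not the gem’s actual implementation; keys and values are illustrative):

```ruby
# Conceptual sketch of per-locale translation lookup with fallback to a
# default locale. Not the i18n gem itself -- just the core idea.
TRANSLATIONS = {
  'en' => { 'listing.price' => 'Price' },
  'fr' => { 'listing.price' => 'Prix' },
  'sw' => {} # this key has no Swahili translation yet
}.freeze
DEFAULT_LOCALE = 'en'

def translate(key, locale:)
  TRANSLATIONS.fetch(locale, {})[key] || TRANSLATIONS[DEFAULT_LOCALE][key]
end

translate('listing.price', locale: 'fr') # => "Prix"
translate('listing.price', locale: 'sw') # => "Price" (falls back to English)
```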

Challenge 2. Find and manage missing and unused translations

When managing such a huge platform setup, it is easy to miss some translations or keep unused ones. This issue is important because it directly influences the user experience, so a setup with 89 websites requires close attention to all the translation files.

However, it is pretty tricky to manage this manually, as the data volume is extensive. Therefore, we used the i18n-tasks gem, which automates these processes.

For the platform development, our team used several of the main i18n-tasks features listed below:

  • An option to add missing keys with placeholders (base value or humanized key), which saves much time on localization management:
    $ i18n-tasks add-missing
  • Management of multiple locale files that saves much development and maintenance time.

With the help of i18n-tasks, we manage multiple translation files and read translations from other gems.

The screenshot below shows the process of finding missing and unused translations, which greatly simplifies localization maintenance.

Challenge 3. Monitor and manage multiple translation files

It was critically important for the customer to easily manage the translation files.

To solve this issue, we created an import and export function using the gems mentioned above (i18n, i18n-tasks) and custom CSV import and export tasks.

Here’s how it works:

  • The customer adds translations to a Google Document.
  • This document syncs with the .yml files in our Git repository, which updates our locale files.

And vice versa: we can update that Google Doc by updating the data in the code base.

Here you can see a Google Doc screenshot with the ‘key’ value from the code base and its corresponding values in different languages.
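The data exchanged with the Google Doc follows a simple tabular shape: one row per translation key and one column per locale. A small sketch of that round trip with Ruby’s standard CSV library (the keys and values are illustrative, not the project’s real data):

```ruby
require 'csv'

# Illustrative: the CSV shape exchanged with the Google Doc --
# a `key` column plus one column per locale.
rows = [
  ['home.title', 'Buy and sell vehicles', 'Achetez et vendez des véhicules'],
  ['home.cta',   'Post an ad',            'Publiez une annonce']
]

csv = CSV.generate do |out|
  out << %w[key en fr] # header row: the key column plus the locales
  rows.each { |r| out << r }
end

parsed = CSV.parse(csv, headers: true)
parsed.first['fr'] # => "Achetez et vendez des véhicules"
```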

The main benefits of this tool:

  • The client cannot change the default language (in this case, English) or the locale key. This prevents a potential crash of the translation files configuration.
  • The developer cannot export files to Google Drive if the two files differ from each other. This ensures new translations are not deleted by mistake.
  • It provides an option of email notification about changes in translation files.

Moving forward, a new issue emerged: to manage the whole platform’s localization, we needed to adjust the locale files quite often.

For these purposes, we changed and improved the custom CSV import and export tasks mentioned above. Now they allow us to write data to separate files at once, which gives a boost to localization management.

Let’s get acquainted with the mechanism:

def csv_export(_opts = {})
  translations_by_path = {}
  router = I18n::Tasks::Data::Router::PatternRouter.new(nil,
    write: i18n.config['csv']['export'])
  create_locales_file(i18n.locales)
  all_locales = i18n.locales << i18n.base_locale
  all_locales.each do |locale|
    router.route(locale, i18n.data_forest) do |path, nodes|
      translations_by_path[path] ||= {}
      translations_by_path[path][locale] ||= {}
      nodes.leaves do |node|
        translations_by_path[path][locale][node.full_key(root: false)] = node.value
      end
    end
    import_locale_in_csv(locale, translations_by_path)
  end
end

Afterwards, we can monitor and check all the files via the Google Drive API and export locale files to Google Drive, where we can edit and manage the translations smoothly and conveniently.

When we receive an email notification, we run an import function to get the data from Google Drive using this method:

def csv_import_from_google
  translations = []
  file_links.each do |_local, token|
    csv = File.read(URI.open(path_to_doc))
    import_locale(csv, translations)
  end
  i18n.data.merge!(I18n::Tasks::Data::Tree::Siblings.from_flat_pairs(translations))
end

def import_locale(csv, translations)
  CSV.parse(csv, headers: true) do |row|
    key = row['key']
    next unless key
    get_locale_from_files(row).each do |locale|
      raise "Locale missing for key #{key}!" unless row.key?(locale)
      translations << [[locale, key].join('.'), row[locale]] if row[locale].present?
    end
  end
end

Currencies

Challenge: Create a conversion system for multiple currencies

Given the large number of locations and currencies, it was crucially important to provide the marketplace users with a convenient currency conversion system. This would help us ensure the best user experience.

Users needed the ability to set prices in their local currency or USD, and easily convert them.

For example:

  • User 1 wants to sell a vehicle, and they set the price in their local currency.
  • User 2 from another country wants to buy this vehicle, and they want to see the price in their own local currency, which is, of course, more convenient.

This task is challenging, as the number of sites and currencies is quite large.

For each website, we set several currencies, including USD and the local ones. A user is able to set the price in any listed currency, as well as convert it into any other available one.

To implement efficient currency conversion on the platform, we adopted the Money gem and used the following features:

  • It provides a currency conversion API that lets us send requests and exchange money from one currency to another.
  • Currencies are represented as instances of Money::Currency, so we can get important information about a currency, such as its ISO code, full name, ISO numeric code, and subunits. This saves plenty of time on retrieving data for further implementation.
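The core idea behind such conversion is to route every exchange through a base currency. A stdlib-only sketch of that idea (the rates below are made up for illustration and are not real exchange rates; the actual platform uses the Money gem):

```ruby
# Illustrative: convert between two currencies by routing through a base
# currency (USD). The rates here are invented for the example.
RATES_TO_USD = { 'USD' => 1.0, 'AOA' => 0.0012, 'KES' => 0.0077 }.freeze

def convert(amount, from:, to:)
  usd = amount * RATES_TO_USD.fetch(from)
  (usd / RATES_TO_USD.fetch(to)).round(2)
end

convert(100, from: 'USD', to: 'USD')   # => 100.0
convert(1000, from: 'AOA', to: 'USD')  # => 1.2
```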

Another issue we faced is that currency rates change rapidly.

To solve this, we created an automated service that updates currency rates hourly, using cron jobs on our server, a standard tool for scheduling recurring actions in a Rails application.

We configured the job to send a request to the exchange rate API every hour, get the current conversion rates, parse the data, and apply it across the platform.

Here is how it works:

class UpdateCurrenciesRate < BaseService
  def call
    base_rate = ExchangeRate.base_currency
    Settings.currencies.to_h.values.each do |currency|
      next if currency == base_rate
      update_rate(base_rate, currency)
    end
  end

  private

  def update_rate(base, target)
    currency_rate = JSON.parse(open(api_link(base, target)).read)
    save_rate(base, target, currency_rate.values.first)
  rescue StandardError => error
    puts error
  end

  def save_rate(base, target, rate)
    Money.default_bank.add_rate(base, target, rate)
  end
end
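The hourly schedule itself can be wired up in several ways; one common option in Rails projects is the whenever gem, which generates the crontab entry from a Ruby DSL. A hypothetical config/schedule.rb assuming that gem (the article only says cron is used, not how it is configured):

```ruby
# config/schedule.rb -- hypothetical sketch, assuming the `whenever` gem
every 1.hour do
  # `runner` executes the given Ruby code inside the Rails app context
  runner 'UpdateCurrenciesRate.new.call'
end
```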

Testing

Challenge: Provide high-quality and timely testing of 89 websites

The number of platform websites was constantly growing, so our only option was automated testing. It is time-efficient and easy to maintain, as we have a single code base.

Initially, we had no documentation for site maintenance, so creating test cases was the first step towards automated testing. Further on, these test cases would save much support time.

However, test cases alone could not cover all the project requirements, so we adopted the following tools and practices to cover all the sites with automated tests.

CI configured for the repository

The first step was to find a way to monitor the created tests and their execution. For this task, we set up CI (Continuous Integration) on our GitLab repository. The benefits of Continuous Integration are huge when automation plays an integral part in your workflow.

As a result, each pipeline run returns a pass or fail verdict on the submitted code, which lets us handle time-consuming checks automatically and avoid spending much time on recurring operations.

Ruby Gems

RSpec is a Behaviour-Driven Development framework for Ruby. This gem helps us describe and verify user-facing behaviour on the site: it’s important to see what is working and what is not. We test different languages and currencies, which lets us monitor the platform’s working capacity and check the setup’s performance.

Capybara is an acceptance test framework for web applications. It runs flawlessly in tandem with RSpec as its add-on. With this gem, we execute integration tests that imitate users’ actions in a browser. We can describe an authorization scenario in a few lines (go to the homepage, enter a login and password, click the Login button), and the gem also provides various convenient methods for debugging tests.

Here is the usage of Capybara with RSpec:

describe "the sign-in process", type: :feature do
  before :each do
    User.make(email: 'user@example.com', password: 'password')
  end

  it "signs me in" do
    visit '/sessions/new'
    within("#session") do
      fill_in 'Email', with: 'user@example.com'
      fill_in 'Password', with: 'password'
    end
    click_button 'Sign in'
    expect(page).to have_content 'Success'
  end
end

With the help of these gems, we can run a suite of 120 automated tests in 5 minutes. Of course, this greatly helps us catch errors while developing new functionality.

Deployment

Challenge: Time-efficient deployment of the multi-site setup

To save time and effort, the best solution is to build automated deployment scripts. However, as the number of sites grew, we needed to keep improving the deployment process. Let’s see how we solved this problem.

Step 1. 15 websites: Capistrano

Initially, we used the Capistrano gem, a framework for building automated deployment scripts. Meanwhile, the platform grew to 12–15 websites, and deploying the sites one at a time became too time-consuming, so we arrived at the idea of using Mina.

Step 2. 40 websites: Mina

When the number of sites grew to 40, each site’s deployment took 2–3 minutes, and deploying the whole platform took about 2 hours. This spurred us to move to Mina, a deployer and server automation tool.

Using this tool, we spent only 40–90 seconds per site on deployment. Meanwhile, the number of sites grew to 80+, so we decided to parallelize the deployment into several streams to keep it time-efficient.

Step 3. 80+ websites: Mina-multideploy

Finally, we created our own extension called mina-multideploy, a tool for parallel deployment that runs Mina against multiple servers.

Currently, deployment is executed in several streams with the help of the Parallel gem, which lets us deploy all 89 websites in parallel.
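The idea behind the parallel streams can be sketched with plain Ruby threads: instead of deploying sites one at a time, each deployment runs concurrently. The host list and the deploy_site body below are illustrative stand-ins for the real Mina invocations:

```ruby
# Stdlib-only sketch: fan site deployments out over threads instead of
# running them one-at-a-time. Hosts and deploy_site are illustrative.
HOSTS = %w[site1.example.com site2.example.com site3.example.com].freeze

def deploy_site(host)
  # in the real setup this would shell out to Mina for the given host
  "deployed #{host}"
end

# one thread per host; `value` joins the thread and returns its result
results = HOSTS.map { |h| Thread.new { deploy_site(h) } }.map(&:value)
results.size # => 3
```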

As a result, the whole platform deployment takes approximately 10 minutes, which is good performance.

Let’s take a look at some numbers that show how we managed to shorten the average deployment time.

Adding new websites

Challenge: Design an easy process of websites addition

The platform is constantly growing, which means we add new sites very often.

However, adding or deleting a single site is not nearly as time-consuming as doing the same for over eighty sites.

So our main challenge was to design a process that would let us quickly add new websites when needed. Two steps helped us accomplish this goal.

Step 1. Preparation of a fresh DB

We created a script that generates a new website database and fills it with seed data.

Step 2. Create an easy website addition process

The next step was to simplify this process, so we created a Rails generator: a script containing templates that generates a new website’s configuration. Using this generator, we save plenty of time: adding a new site takes only 1–2 hours instead of 1–2 days.

The script generates:

  • Locale files
  • Configs (for example, newsite.yml)
  • CSS files with automatic colour matching based on the logo colours
  • Images placed into the right folder structure
  • A watermark created by repainting the logo in white
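The core trick of such a generator is plain template rendering: take a settings template, substitute the new site’s values, and write the result out. A minimal illustrative sketch with ERB (the template and variable names are made up, not the project’s actual generator):

```ruby
require 'erb'

# Illustrative: render a per-site settings file from a template.
template = ERB.new(<<~YML)
  site_name: <%= name %>
  site_url: <%= url %>
YML

name = 'NewSite'
url  = 'newsite.com'
config = template.result(binding)
# config now contains "site_name: NewSite\nsite_url: newsite.com\n"
```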

Now we only need to adjust third-party services like AWS and Mailchimp and add the website URL to the Facebook whitelist.

If we need to delete a website (e.g. we generated a site with incorrect data), we run rails destroy site *hostname*.

Monitoring

Challenge: Implement automated monitoring

It is important to monitor the working capacity of such a big platform and check it for bugs, as the number of websites is quite large. The main challenge for our team was to organize monitoring in the most time-efficient way: at first, we checked the websites manually every day.

As the platform grew, our developers decided to use Uptime Robot, an automated tool that provides four monitoring types:

  • HTTP(S): perfect for website monitoring. The service regularly sends requests to the URL (the same as if a visitor were browsing your website) and decides whether it is up or down based on the HTTP status returned (200 = success, 404 = not found, etc.).
  • ping: good for monitoring a server. Ping (ICMP) requests are sent, and up/down status is decided according to whether responses are received. Ping is not a good fit for monitoring websites, as a site can be down while the server hosting it still responds to ping requests.
  • keyword: checks if a keyword exists on a web page.
  • port: good for monitoring services like SMTP, DNS, and POP, as these services run on specific ports, and Uptime Robot decides their status by whether they respond to requests.
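The HTTP(S) check boils down to a simple decision rule; a stdlib-only sketch of that rule (Uptime Robot performs the actual request and this classification for you):

```ruby
# Illustrative: a site counts as "up" when the returned HTTP status is
# in the 2xx success range; anything else (404, 500, a failed request
# represented as nil, ...) counts as "down".
def site_up?(status_code)
  (200..299).cover?(status_code)
end

site_up?(200) # => true
site_up?(404) # => false
site_up?(nil) # => false
```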

There are many options to receive notifications from Uptime Robot.

Since we use Slack at work, we set up alerts that go to a Slack channel. The developer responsible for platform support receives a notification immediately if any website goes down:

So, our team is always up-to-date and ready to quickly solve any issue.

To summarize

To sum up, a multi-site setup is a great solution for entrepreneurs aiming to build a large marketplace that covers numerous regions.

Such a solution is both time- and cost-effective, as you can manage all the sites smoothly and easily. The configuration system lets you improve a single template and see the results across the whole platform.

We are proud of the extensive platform that our team has created:

  • 89 local vehicle marketplaces successfully operating (and the number is still growing!)
  • 15 languages and 72 currencies are implemented for localization
  • Mobile-first approach implemented
  • Typical server response time is 100–150 ms
  • Average page load is 2–3 seconds
  • High stability and security of the platform
  • Reduced effort — our developers have found or created tools for fast and easy localization, configuration, testing, deployment, monitoring, and adding new websites.

Learn more about this project details in the marketplace case study, and see how we created a simple and user-friendly design on our Behance profile.

We hope the project details and the tools that we used for the marketplace development will help you with your product creation.

Apart from the vehicle marketplace, Codica team has developed other great solutions for marketplaces in such areas as finance, travelling, and e-commerce.

Have an idea for a marketplace? We would love to bring it to life. Let’s get in touch!

Originally published at www.codica.com.

Software development consultancy. We are passionate about innovations and create great online marketplaces with Ruby on Rails, React, Vue and Angular.