My journey from NoSQL to SQL

Meenakshi Kumari
4 min read · Nov 21, 2018

My project is a B2B (business-to-business) website where vendors can sell their products to shopkeepers directly or through agents, and vice versa. When I started on this project, it used a MongoDB database (NoSQL) with the Rails (v4.1.1) framework, written in Ruby (v2.1.0).

MongoDB is a fast NoSQL database. Unfortunately, it wasn’t the cure for all our performance troubles. Many issues about our site’s unavailability were reported, caused by MongoDB’s slow querying over associations and indexes. One particular case: we were exporting reports from our site, which retrieved data from many associated and embedded documents in the database, and this was a very slow process. Data update and create tasks also took more time because of complex transactions over highly associated data. MongoDB is not ACID compliant, and consistency and availability are incompatible in Mongo due to the CAP theorem. (tip: MongoDB ACID compliance, NoSQL vs SQL)

So we wanted to switch to a database that had transaction support and was ACID compliant. To improve our website’s speed and availability, our team decided to migrate the project database to PostgreSQL. It is an object-relational database management system (ORDBMS) with an emphasis on extensibility, and it also supports NoSQL features. Along with the database migration we also upgraded our Ruby on Rails version.

After this successful migration, our site’s availability and resilience improved: PostgreSQL performed much better on indexes and joins, and our service became faster and snappier as a result. Our database size also shrank, since PostgreSQL stores information more efficiently.

I’ll be sharing my experience of the project migration in the following series of blogs:

  1. Gem changes and Preparation of schema for PostgreSQL database.
  2. Data transfer from MongoDB to PostgreSQL database, without any inconsistency.
  3. Problems faced before launching updated project.

In this blog I’ll explain how to update the Gemfile and how to design the PostgreSQL schema from the MongoDB collections.

So come along with me on my journey of this migration.

Gem changes and Preparation of schema for PostgreSQL database.

NOTE: We had two separate branches for the MongoDB and PostgreSQL code in the same GitHub project repository.

We had to update our project Gemfile for PostgreSQL by replacing all Mongo-related gems with their PostgreSQL counterparts, e.g.:

  • mongoid with pg
  • mongoid_tree with ltree_hierarchy
  • mongoid_search with pg_search
  • mongoid_observers with rails_observers
  • mongoid_audit with audited
  • carrierwave_mongoid with carrierwave
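In the Gemfile this amounts to removing the Mongo gems and adding their replacements. A minimal sketch (exact gem versions are omitted since they depend on your Rails upgrade, and gem names should be double-checked against RubyGems):

```
# Gemfile (sketch)
# gem 'mongoid'             # removed, along with the other Mongo gems

gem 'pg'              # PostgreSQL adapter for ActiveRecord
gem 'ltree_hierarchy' # tree structures, replacing mongoid_tree
gem 'pg_search'       # full-text search, replacing mongoid_search
gem 'rails-observers' # observer pattern, replacing mongoid_observers
gem 'audited'         # audit trail, replacing mongoid_audit
gem 'carrierwave'     # file uploads, without the Mongoid extension
```

Run `bundle install` afterwards to resolve the new dependency set.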

The next step was preparing a schema for our PostgreSQL database from the MongoDB collections. Replace the mongoid.yml file with a database.yml file and create the database using the rake db:create command. We had to make several changes to data types, relations, etc. in the PostgreSQL database; some of them are as follows:

  1. Symbol-type fields of MongoDB documents were changed to string, and while retrieving the data from the DB they had to be converted with to_sym explicitly.
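For example, a plain-Ruby sketch of that round trip (no database needed; the value is made up):

```ruby
# The former Symbol field comes back from the PostgreSQL string column
# as a String, so the code reading it converts explicitly.
stored = "maharashtra"   # value as read back from the string column
state  = stored.to_sym   # explicit conversion the app code now has to do
```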
  2. MongoDB has 'embeds_one' and 'embeds_many' relations, which were converted to 'has_one' and 'has_many' relations in PostgreSQL. For example:
##### MONGODB CODE #####
class Company
  include Mongoid::Document
  embeds_many :addresses
end

##### POSTGRESQL CODE #####
class Company < ApplicationRecord
  has_many :addresses, dependent: :destroy
end

3. For the has_and_belongs_to_many relation, a third (join) table was created in the PostgreSQL schema, where both tables' ids are stored. For example:

##### MONGODB CODE #####
class Company
  include Mongoid::Document
  has_and_belongs_to_many :users
end

class User
  include Mongoid::Document
  has_and_belongs_to_many :companies
end

##### POSTGRESQL CODE #####
class Company < ApplicationRecord
  has_and_belongs_to_many :users, association_foreign_key: 'user_id',
                                  join_table: 'companies_users'
end

class User < ApplicationRecord
  has_and_belongs_to_many :companies, join_table: 'companies_users'
end

##### POSTGRESQL SCHEMA FOR COMPANIES_USERS TABLE #####
create_table "companies_users", force: :cascade do |t|
  t.bigint "company_id"
  t.bigint "user_id"
  t.index ["company_id"], name: "index_companies_users_on_company_id"
  t.index ["user_id"], name: "index_companies_users_on_user_id"
end
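The companies_users table itself comes from a migration along these lines (a sketch matching the schema dump above; as ActiveRecord framework code it is shown for orientation, not as standalone runnable Ruby):

```
class CreateCompaniesUsers < ActiveRecord::Migration[5.2]
  def change
    create_table :companies_users do |t|
      t.bigint :company_id
      t.bigint :user_id
    end
    # index both foreign keys, since joins go in either direction
    add_index :companies_users, :company_id
    add_index :companies_users, :user_id
  end
end
```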

NOTE: From Rails 5.1, the default primary key type in PostgreSQL changed from Integer to BIGINT.

TIP: To check that the correct data was imported from MongoDB to PostgreSQL, we stored the mongo_id of each imported Mongo record in a string field named mongo_id, and the mongo_ids of all the tables associated with that record in a field called relation_ids, of type hstore. So if anything goes wrong during the data-transfer task, it can be handled by checking these fields, and they also allow cross-checking the consistency of the data records transferred from MongoDB to PostgreSQL. Both fields are for future reference and can be removed later, once you are sure about the imported data.
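To illustrate the kind of cross-check these fields make possible, here is a plain-Ruby sketch (no database involved; arrays stand in for the exported Mongo ids and the imported mongo_id column values, and the ids are made up):

```ruby
mongo_ids    = ["5bf3a1", "5bf3a2", "5bf3a3"]  # ids exported from MongoDB
pg_mongo_ids = ["5bf3a2", "5bf3a1", "5bf3a3"]  # mongo_id column values in PostgreSQL

# every exported record must appear exactly once among the imported rows
missing    = mongo_ids - pg_mongo_ids                         # records that never arrived
duplicates = pg_mongo_ids.tally.select { |_, n| n > 1 }.keys  # records imported twice
```

If either list is non-empty, the transfer task can re-import or clean up just those records by looking them up through the mongo_id field.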

Likewise, we had to write migrations for each table in order to prepare our schema. A sample migration, with the corresponding MongoDB model of Address, is as follows:

##### MONGODB MODEL #####
class Address
  include Mongoid::Document
  field :flat_no, type: Integer
  field :pincode, type: Symbol
  field :city, type: String
  field :state, type: String, default: ''
  belongs_to :company
end

##### POSTGRESQL MIGRATION #####
class CreateAddresses < ActiveRecord::Migration[5.2]
  def change
    # NOTE: t.hstore requires the hstore extension (enable_extension 'hstore')
    create_table :addresses do |t|
      t.integer :flat_no
      t.string :pincode
      t.string :city
      t.string :state, default: ''
      ## newly introduced fields ##
      t.string :mongo_id, default: ''
      t.hstore :relation_ids, default: {}
      t.timestamps
      t.belongs_to :company, index: true
    end
  end
end

Now our schema was ready and we were all geared up for the next phase of this procedure, the data transfer from MongoDB to PostgreSQL, which I’ll explain in the next part of this blog.

Do read my next blog here.
