How to back up our MongoDB database and push the file to AWS S3 on Ubuntu
In some projects we use MongoDB to handle the data, and since data is very important, we were looking for a way to back it up and push it to another location to protect the info.
One option is to use a third-party service to host the database and handle the backups. Another is to do it ourselves on our own server; this article focuses on the second option.
Obviously we need MongoDB. We can find the steps in the official docs; the following is a summary of them:
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 9DA31620334BD75D9DCB49F368818C72E52529D4
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.0.list
sudo apt-get update
sudo apt-get install -y mongodb-org
Also, we need the AWS client installed on the server; once again, the steps are located in the official docs. The only step (leaving aside prerequisites such as Python and/or pip) is:
pip install awscli --upgrade --user
We need to configure the AWS client; detailed info is located here. The easy way is running the following command, which prompts for four values:
aws configure
AWS Access Key ID [None]: KEY
AWS Secret Access Key [None]: SECRET
Default region name [None]: region
Default output format [None]: json
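As an alternative to the interactive prompts, the same credentials can be supplied through environment variables, which the AWS CLI also reads; this is handy when provisioning servers from a script (KEY, SECRET, and the region value are the same placeholders as above):

```shell
# the AWS CLI picks these up automatically, no config file needed
export AWS_ACCESS_KEY_ID=KEY
export AWS_SECRET_ACCESS_KEY=SECRET
export AWS_DEFAULT_REGION=region
```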
OK, I have the requirements. Now, how can I generate the backup?
It is a piece of cake: the following command will back up all our MongoDB databases and save them in the ~/backups folder:
mongodump --out ~/backups
Detailed info about mongodump options is located here.
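For instance, mongodump can target a single database instead of dumping everything; a small sketch, wrapped in a function so nothing runs until we call it (mydb is a hypothetical database name):

```shell
# dump only one database; without --db, mongodump exports every database
backup_one_db() {
  mongodump --db mydb --out ~/backups
}
```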
Now we can package the folder for convenience; we can use any archiving tool, such as tar:
tar czf db_backup.tar.gz ~/backups
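Before uploading, we can sanity-check the archive by listing its contents; a quick sketch using a throwaway folder in /tmp (all paths here are just for illustration):

```shell
# a sample folder standing in for the mongodump output
mkdir -p /tmp/backup_demo/mydb
echo "sample" > /tmp/backup_demo/mydb/data.bson

# package it the same way as the real backup
tar czf /tmp/db_backup_demo.tar.gz -C /tmp backup_demo

# list the files inside the archive without extracting it
tar tzf /tmp/db_backup_demo.tar.gz
```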
So… How can I push it into AWS S3?
The following command does the magic:
aws s3 cp db_backup.tar.gz s3://my_bucket/db_backups/
Putting it all together
Now it is a matter of making some minor changes to join all the steps into a script that can be executed automatically every day.
The first change is to set a unique filename for the backup file; this allows us to keep multiple backups without overwriting anything. We can use the current date and time to do it, for example:
backup_name=~/backups/backup_$(date +%Y-%m-%d_%H-%M)
Then we should generate the script file; we can put it at ~/bin/db-backups.sh (the same path used in the cron entry). This is an example of the content of the file:
#!/bin/bash
backup_name=~/backups/backup_$(date +%Y-%m-%d_%H-%M)
mongodump --out $backup_name
tar czf $backup_name.tar.gz $backup_name
aws s3 cp $backup_name.tar.gz s3://my_bucket/db_backups/
rm -rf $backup_name
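A slightly more defensive version of the same script could stop on the first error and clean up the local archive after the upload; this is a sketch, assuming the same placeholder bucket and a hypothetical timestamp format:

```shell
#!/bin/bash
# abort on the first failing command, unset variable, or broken pipe
set -euo pipefail

# timestamped name so runs never overwrite each other (format is an example)
backup_name="$HOME/backups/backup_$(date +%Y-%m-%d_%H%M)"

run_backup() {
  mongodump --out "$backup_name"
  tar czf "$backup_name.tar.gz" "$backup_name"
  aws s3 cp "$backup_name.tar.gz" s3://my_bucket/db_backups/
  # remove both the raw dump and the local archive once uploaded
  rm -rf "$backup_name" "$backup_name.tar.gz"
}

# only run when the required tools are actually installed
if command -v mongodump > /dev/null && command -v aws > /dev/null; then
  run_backup
fi
```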
The last step is to configure cron to run the script every day. First, we should open the cron file:
crontab -e
And add something like the following in the file:
0 0 * * * /bin/bash ~/bin/db-backups.sh >> ~/logs/db_backups.log 2>&1
The previous line asks cron to run the script every day at midnight.
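The five fields before the command are minute, hour, day of month, month, and day of week, so the schedule is easy to adjust; here are a couple of alternative crontab entries, as a sketch:

```
# every day at 2:00 AM, when traffic is usually low
0 2 * * * /bin/bash ~/bin/db-backups.sh >> ~/logs/db_backups.log 2>&1

# once a week, on Sunday at midnight
0 0 * * 0 /bin/bash ~/bin/db-backups.sh >> ~/logs/db_backups.log 2>&1
```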
To test it, we can run the command by hand:
bash ~/bin/db-backups.sh
Alternatively, we can wait until the next day and verify in our Amazon Web Services account that the file was created.
As you can see, it is an easy process: after following a few steps we will have our backups in an external service. Obviously, a good practice is to verify that the backups actually work by restoring them in a local environment or on another server, so we can rely on them when we face a real contingency on the production servers.
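The rough shape of such a restore rehearsal is: download an archive, unpack it, and feed the resulting folder to mongorestore. A sketch wrapped in a function so nothing runs by itself; the filename and extracted folder are hypothetical, and it assumes a local mongod plus the aws and mongorestore tools:

```shell
restore_latest_backup() {
  # download one archive from the bucket (filename is an example)
  aws s3 cp s3://my_bucket/db_backups/db_backup.tar.gz .
  tar xzf db_backup.tar.gz
  # --drop replaces any existing collections with the dumped ones
  mongorestore --drop backups
}
```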
If you like this story, clap as many times as you want, and to see similar stories about technology, check our publications and leave us a comment if you have any questions.
If you need a team that can help you implement your ideas in React JS or React Native, feel free to ping us through our website by filling out the contact form.