Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 3: How to Build It)
So in the previous posts, I explained how to use and how to install the custom log shipping. Check them out first:
Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 1: How to Use It)
Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 2: How to Install It)
Log shipping is the process of backing up transaction log files from a primary server to a shared folder/location, from which they are distributed to other servers. In this process, the database restored on the secondary server is in Standby/Read-Only mode, so it can still be accessed, but with read access only.
Now I will explain how to build simple custom log shipping. In this part, I will show you how to build the application step by step.
Flowchart
This is my custom log shipping flowchart. The log is shipped automatically by the Task Scheduler at the time we have decided. When the schedule in the backup script is triggered (by time, or by whatever trigger you want to customize), the log backup (.bak) is sent to Google Drive and its name is saved so it can be shown on the website. That way, the user can know the name of the backup log through the website. The restore is triggered by the Task Scheduler on the secondary server side: the restore script downloads the log from Google Drive, gets the name of the log from the website, and restores it into the secondary server database.
How to Build It
Prerequisites:
- Microsoft SQL Server 2019 Express
- Anaconda3 (Python 3.8)
- MongoDB 5.0.4
- Robo 3T 1.4.4
- Azure Data Studio
- Visual Studio Code
- Google Drive
1. Build a second server
When you install SQL Server Express 2019, a server instance is created automatically. In log shipping, we need to move the log from the primary server to a secondary server, and the secondary server is a Standby/Read-Only server. So let's say the instance installed by default is the primary server. Now let's build the secondary server.
- Start > open SQL Server Installation Center. Click New SQL Server stand-alone installation or add features to an existing installation.
- The directory would be in C:\SQL2019\Developer_ENU.
- Just click Next until you reach this page, then choose Perform a new installation of SQL Server 2019.
- Specify a free edition, and click Express, then Next.
- Check I accept the license terms and Privacy Statement, then click Next.
- Select Database Engine Services and leave the default directories as they are.
- Name the instance; the Instance ID will be filled in automatically.
- Change the SQL Server Database Engine Startup Type to Manual. This means that every time you shut down the server, you have to start it manually through SQL Server Configuration Manager.
- In this step, choose Mixed Mode (SQL Server authentication and Windows authentication). Set your password and confirm it, add the current user, then click Next until the installation runs.
- The secondary server is built successfully.
2. Get API for Google Drive
- Make a project in Google Cloud Platform
- Create a new project
- Click left panel and search APIs & Services > Library > Google Drive API > Enable
- Open Credentials then select Create Credentials > OAuth client ID (Initial Process)
- Set the product name and consent screen by clicking ‘Configure consent screen’. There are no strict rules about how to fill in the form, but let me guide you through how I filled it.
User Type: External
OAuth consent screen
App name: fill in your app name
User support email: you can use your email
App logo: you can upload your logo, but this part is optional
Developer contact information: you can use your email
Scopes: (left empty)
Test Users: (left empty)
- Open Credentials then select Create Credentials > OAuth client ID
- Select ‘Application type’ to be Web application
- Enter a name
- Input http://localhost:8080 for ‘Authorized JavaScript origins’
- Input http://localhost:8080/ for ‘Authorized redirect URIs’
- Click Save
- Click Download JSON on the right side of the client ID to get client_secret_<really long ID>.json
- Rename the file to “client_secrets.json” and place it in the working directory
- Publish the app
3. Backup script
I use Python for the backup script.
- Import libraries
These are the libraries you need to import. Check my previous post on how to install them.
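Here is a minimal sketch of the imports, assuming pyodbc for SQL Server, pymongo for MongoDB, and PyDrive for Google Drive; swap in whichever libraries you installed in Part 2:

```python
from datetime import datetime

import pyodbc                          # SQL Server connection (assumed driver)
from pymongo import MongoClient        # save the log name into MongoDB
from pydrive.auth import GoogleAuth    # Google Drive authentication
from pydrive.drive import GoogleDrive  # Google Drive upload
```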
- Open a connection to SQL Server
I censored the password; replace it with your own. The server here is the primary server, which uses port 1433.
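A sketch of the connection, assuming pyodbc with ODBC Driver 17; the server name, driver, and credentials are placeholders you should replace with your own:

```python
# Primary server: default instance on port 1433.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost,1433;"
    "DATABASE=master;"
    "UID=sa;"
    "PWD=<your password>;",
    autocommit=True,  # BACKUP/RESTORE statements cannot run inside a transaction
)
cursor = conn.cursor()
```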
- Back up the log automatically
Use a unique identifier for naming the log. I prefer to use the time the backup runs as the unique name.
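A minimal sketch of the backup step, using the current timestamp as the unique name; the database name and backup directory are placeholders:

```python
# Build a unique file name from the current timestamp, e.g. log_20211215_093000.bak
log_name = "log_" + datetime.now().strftime("%Y%m%d_%H%M%S") + ".bak"
backup_path = "C:\\fp_tbd\\backup\\" + log_name

# Back up the transaction log (the database must use the FULL recovery model
# and have at least one full backup already).
cursor.execute(f"BACKUP LOG [YourDatabase] TO DISK = N'{backup_path}'")
while cursor.nextset():  # drain any remaining results so the backup finishes cleanly
    pass
```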
- Close connection
- Save log name into MongoDB
This is how to save the log name into MongoDB.
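A sketch with pymongo, assuming a local MongoDB on the default port; the database and collection names are placeholders:

```python
# Store the log name so the website and the restore script can look it up later.
client = MongoClient("mongodb://localhost:27017/")
collection = client["logshipping"]["backup_logs"]
collection.insert_one({"log_name": log_name, "created_at": datetime.now()})
```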
- Upload the file to Google Drive
You can get the unique folder ID from the Google Drive folder link.
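A sketch of the upload with PyDrive, reading client_secrets.json from the working directory (see step 2); the folder ID is a placeholder taken from your folder’s link:

```python
# Authenticate against Google Drive (opens a browser window on the first run,
# using the http://localhost:8080 redirect configured in step 2).
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)

# Upload the .bak file into the shared backup folder.
folder_id = "<your Google Drive folder ID>"
backup_file = drive.CreateFile({"title": log_name, "parents": [{"id": folder_id}]})
backup_file.SetContentFile(backup_path)
backup_file.Upload()
```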
4. Restore script
- Import libraries
- Download the backup file from Google Drive
- Read the file name from MongoDB
- Open connection for SQL Server
Use the port of the secondary server; in this case, the port is 1432. (A combined sketch of all these restore steps is shown after the list.)
- Run the restore automatically
- Close connection
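Under the same assumptions as the backup script (pymongo, PyDrive, pyodbc, and placeholder names, paths, and credentials), here is a combined sketch of the restore steps; the log name is read from MongoDB first so only the matching file is downloaded:

```python
from pymongo import MongoClient
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
import pyodbc

# Get the name of the latest backup log saved by the backup script.
client = MongoClient("mongodb://localhost:27017/")
collection = client["logshipping"]["backup_logs"]
latest = collection.find_one(sort=[("created_at", -1)])
log_name = latest["log_name"]

# Download that file from the Google Drive backup folder.
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
folder_id = "<your Google Drive folder ID>"
query = f"'{folder_id}' in parents and title = '{log_name}' and trashed = false"
backup_file = drive.ListFile({"q": query}).GetList()[0]
local_path = "C:\\fp_tbd\\restore\\" + log_name   # make sure this folder exists
backup_file.GetContentFile(local_path)

# Connect to the secondary server (port 1432 in this setup) and restore the log
# WITH STANDBY so the database stays in Standby/Read-Only between restores.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost,1432;"
    "DATABASE=master;"
    "UID=sa;"
    "PWD=<your password>;",
    autocommit=True,
)
cursor = conn.cursor()
cursor.execute(
    f"RESTORE LOG [YourDatabase] FROM DISK = N'{local_path}' "
    "WITH STANDBY = N'C:\\fp_tbd\\restore\\undo_YourDatabase.dat'"
)
while cursor.nextset():  # drain any remaining results so the restore finishes cleanly
    pass
conn.close()
```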
5. Setting up the task scheduler
With the Task Scheduler, the log will be backed up and restored automatically at the times we have set.
- Create a new task (right panel)
- Fill in name and description. Example: backuptask, restoretask
- Check ‘Run with Highest Privileges’
- In the Triggers tab, click New and set it to Weekly. In this part you need to choose the day the trigger starts.
- In the Actions tab, click New and set
Program/script: C:\Users\acer\anaconda3\python.exe
Add arguments (optional): “C:\fp_tbd\backup.py”
(don’t forget the quotation mark)
Do the same with the restore script.
6. Build Website using Streamlit
Streamlit is a Python framework for building machine learning and data web apps. Since I have a background in data science, this is a good platform for us to use. I’ll show you the main code, but you can adjust it, for example by adding a title or pictures, by reading the documentation here: https://docs.streamlit.io/
So this is the code to retrieve the data from MongoDB and show it with Streamlit.
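A minimal sketch of web_ui.py under the same MongoDB assumptions as the backup script (adjust the database and collection names to your own):

```python
# web_ui.py -- list the saved backup log names on a simple Streamlit page.
import pandas as pd
import streamlit as st
from pymongo import MongoClient

st.title("Custom Log Shipping - Backup Logs")

client = MongoClient("mongodb://localhost:27017/")
collection = client["logshipping"]["backup_logs"]

# Newest logs first; drop the internal _id field before displaying.
records = list(collection.find({}, {"_id": 0}).sort("created_at", -1))
if records:
    st.dataframe(pd.DataFrame(records))
else:
    st.write("No backup logs have been saved yet.")
```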
I run the script using the Anaconda Prompt. So I open the environment that has Pymongo and Streamlit installed (check my previous post), then I run:
streamlit run web_ui.py
This is what the website looks like.
Pretty simple, right? You can add some text or pictures using your creativity. Since I lack skills in web UI, I’m too shy to share what the final version looks like. But this is a great platform, and it’s the first time I’ve used Streamlit. If I have some time, I’ll learn more about Streamlit and try to share some insights with it.
Thank you for reading!