Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 1: How to Use It)

Katarina Nimas Kusumawati
4 min read · Dec 17, 2021


Log shipping is the process of backing up transaction log files from a primary server into a shared folder or location, from which they are distributed to other servers. In this process, the database restored on the secondary server is left in Standby/Read-Only mode, so it can still be accessed, but only for reads.

In this post, I will show you how to use a simple custom log shipping setup.

Flowchart

This is my custom log shipping flowchart. The log is shipped automatically by Task Scheduler at the time we decide. When the backup script is triggered (by a schedule, or by whatever trigger you want to customize), the log backup (.bak) is sent to Google Drive and the name of the log file is saved so it can be shown on the website. That way, the user can know the name of the backup log through the website. For the restore, Task Scheduler on the secondary server triggers the restore script, which downloads the log from Google Drive, reads the name of the log from the website's database, and restores it into the secondary server's database.

How to use it

So first we have a database, and it needs to be backed up once as the initial step. For this initial phase, the backup and restore are run using Azure Data Studio, or you can run them directly on Microsoft SQL Server. The queries below are just examples that you can use.

I’ll show you how to connect with Azure Data Studio (since it’s a little bit tricky for beginners like me; you can skip this part if you already know how).

Connect Azure Data Studio

Go to SQL Server Configuration Manager, choose the secondary server instance, and set TCP/IP to Enabled.

Open the IP Addresses tab, then adjust IPAll > TCP Dynamic Ports & TCP Port. You can use these values or configure your own port.

Now for the primary server, go to the IP Addresses tab, then IPAll > TCP Dynamic Ports & TCP Port, and adjust the port. After that, restart the primary & secondary server instances.

Create a new connection for primary server

Open Azure Data Studio and create a new connection. The server we will use is 127.0.0.1. Use SQL Login as the authentication type, enter your primary database username and password (it’s better to tick Remember password too), then connect.

Create a new connection for secondary server

In this part, click New Connection and go to Advanced. Set the port to the secondary server's port that you configured in SQL Server Configuration Manager. Fill in the same login details as for the primary server, give the connection a different name, then click Connect.
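The same connection details will be reused later by the Python scripts. Just as a reference, a pyodbc connection to each instance could look like the sketch below; the port 1435, user names, and passwords are only placeholders for whatever you configured above.

# A minimal sketch, assuming SQL logins on both instances. All credentials are placeholders.
import pyodbc

# Primary instance: default port 1433, so no port suffix is needed.
primary = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=127.0.0.1;DATABASE=programming-in-db;"
    "UID=primary_user;PWD=primary_password"
)

# Secondary instance: the custom static port goes after the host, separated by a comma.
secondary = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=127.0.0.1,1435;DATABASE=master;"
    "UID=secondary_user;PWD=secondary_password"
)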

Now try to back up and restore the data using these queries.

Backup query

BACKUP DATABASE [database] TO DISK = N'C:\kaniku_backup\programming-in-db.bak'

Restore query

RESTORE DATABASE [database] FROM DISK = N'C:\kaniku_backup\programming-in-db.bak'
WITH MOVE 'programming-in-db'
TO 'C:\Program Files\Microsoft SQL Server\MSSQL15.SQLEXPRESS\MSSQL\DATA\database.mdf',
MOVE 'programming-in-db_log'
TO 'C:\Program Files\Microsoft SQL Server\MSSQL15.SQLEXPRESS\MSSQL\DATA\database.ldf',
STANDBY = 'C:\Program Files\Microsoft SQL Server\MSSQL15.SQLEXPRESS\MSSQL\DATA\database_rollback_undo.bak'

Do you see what the code means? The STANDBY option leaves the secondary server's database in standby mode, so you can only read from it.

So now we try to do log shipping!

Log Shipping

We run the log backup from the backup script. The Python environment I use is Anaconda3. I run the script with Task Scheduler, so Task Scheduler triggers the backup script to perform the log backup.
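If you prefer the command line over the Task Scheduler UI, a task like this can also be registered with schtasks; the interval, Python path, and script path below are just placeholders.

schtasks /create /tn "LogShippingBackup" /sc minute /mo 15 /tr "C:\Anaconda3\python.exe C:\logshipping\backup_script.py"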

When you run the backup script, you will be directed to your default browser for Google authentication. If you see the warning page, click Advanced and continue to the project.

After the script runs, the name of the log file is saved into MongoDB. MongoDB is a NoSQL database. Now the user can see the name of the log file via the website. I found a platform for building a website that is familiar to Python users: Streamlit.
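Putting the pieces together (back up the log, upload the .bak to Google Drive, save the name in MongoDB), a very small version of the backup script could look like the sketch below. The library choices (pyodbc, PyDrive, pymongo), paths, credentials, and collection names are assumptions for illustration; the real script is covered in Part 3.

# A minimal sketch of the backup side. All names, paths, and credentials are placeholders.
import datetime
import pyodbc
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from pymongo import MongoClient

log_name = f"programming-in-db_{datetime.datetime.now():%Y%m%d_%H%M%S}.bak"
local_path = rf"C:\kaniku_backup\{log_name}"

# 1. Back up the transaction log. autocommit=True because BACKUP cannot run
#    inside a user transaction.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=127.0.0.1;"
    "DATABASE=master;UID=primary_user;PWD=primary_password",
    autocommit=True)
cursor = conn.cursor()
cursor.execute(f"BACKUP LOG [programming-in-db] TO DISK = N'{local_path}'")
while cursor.nextset():  # drain informational messages so the backup finishes
    pass
conn.close()

# 2. Upload the .bak file to Google Drive (this is the step that opens the
#    default browser for authentication).
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
bak_file = drive.CreateFile({"title": log_name})
bak_file.SetContentFile(local_path)
bak_file.Upload()

# 3. Save the log file name in MongoDB so the website can show it.
client = MongoClient("mongodb://localhost:27017/")
client["logshipping"]["backup_logs"].insert_one(
    {"file_name": log_name, "created_at": datetime.datetime.utcnow()})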

To check the website, run the command below in Anaconda Prompt. Open Environments in Anaconda, click the play button, then choose Open Terminal.

Use this command

streamlit run web_ui.py

Note: web_ui.py is the website script.
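For illustration, a tiny version of web_ui.py might look something like this; the MongoDB database and collection names are the same assumptions as in the backup sketch above.

# web_ui.py - a minimal sketch of the page that lists the backup log names.
import streamlit as st
from pymongo import MongoClient

st.title("Log Shipping - Backup Logs")

# Read every log name the backup script has recorded, newest first.
client = MongoClient("mongodb://localhost:27017/")
logs = client["logshipping"]["backup_logs"].find({}, {"_id": 0}).sort("created_at", -1)
st.table(list(logs))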

So the website is where users check the log names. Then Task Scheduler on the secondary server runs the restore script at the time that has been set.
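On the secondary side, the restore script roughly mirrors the backup script: it looks up the newest log name, downloads that file from Google Drive, and restores it with STANDBY. Here is a minimal sketch, again with assumed library choices, credentials, port, and paths.

# A minimal sketch of the restore side. All names, paths, and credentials are placeholders.
import pyodbc
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from pymongo import MongoClient

# 1. Get the newest log file name that the backup script recorded.
client = MongoClient("mongodb://localhost:27017/")
latest = client["logshipping"]["backup_logs"].find_one(sort=[("created_at", -1)])
log_name = latest["file_name"]

# 2. Download that .bak file from Google Drive.
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
bak_file = drive.ListFile({"q": f"title = '{log_name}' and trashed = false"}).GetList()[0]
local_path = rf"C:\kaniku_restore\{log_name}"
bak_file.GetContentFile(local_path)

# 3. Restore the log on the secondary server, keeping the database in standby mode.
standby_path = (r"C:\Program Files\Microsoft SQL Server\MSSQL15.SQLEXPRESS"
                r"\MSSQL\DATA\database_rollback_undo.bak")
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=127.0.0.1,1435;"
    "DATABASE=master;UID=secondary_user;PWD=secondary_password",
    autocommit=True)
cursor = conn.cursor()
cursor.execute(
    f"RESTORE LOG [database] FROM DISK = N'{local_path}' "
    f"WITH STANDBY = N'{standby_path}'")
while cursor.nextset():  # drain informational messages so the restore finishes
    pass
conn.close()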

So that is the flow of this simple custom log shipping. Go check these for the complete story!

Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 2: How to Install It)

Custom Log Shipping using Microsoft SQL Server, Google Drive, and Python (Part 3: How to Build It)

Video: How to Use It (in Indonesian)
