Migrate from Azure SQL Database using BCP
Although most people want to migrate from SQL Server to Azure SQL Database (Managed Instance or Single Database), there is always the question of how to go back and move out of Azure to an on-premises environment if that becomes necessary. There are several options for migrating from Azure SQL Database to on-premises SQL Server:
- Transactional replication can copy data from your Managed Instance to any other SQL Server. I believe this is the most convenient approach for migrating away from Managed Instance. However, it is not available if you are migrating from Azure SQL Database-Singleton; it can be used only on Managed Instance.
- BCP in/out is a tool that exports tables to files so you can import them elsewhere. Under the hood it uses a similar approach to the initial snapshot of transactional replication, and it is applicable both to Managed Instance and Single Database. You can also use this approach to migrate from Azure SQL Database-Singleton to Managed Instance (and vice versa).
- Backup/restore (Managed Instance only): officially, backup/restore from Managed Instance to on-premises SQL Server doesn't work, because Managed Instance always runs the latest version of the database engine and you cannot restore a backup taken on a higher version to an earlier version. However, if you download the latest CTP version of SQL Server, there is a high chance that it will be on the same engine version as Managed Instance. If you are planning to put your databases on the latest version of SQL Server that will come in the future (SQL Server 2019 at the time of writing this article), backup/restore might work for you.
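If you want to test whether the backup/restore path is open, you can compare the engine versions on both sides before taking a backup. A minimal check, to be run on both the Managed Instance and the target SQL Server:

```sql
-- Run on both the source (Managed Instance) and the target (SQL Server):
-- the restore can only succeed if the target version is >= the source version.
SELECT SERVERPROPERTY('ProductVersion') AS engine_version,
       SERVERPROPERTY('Edition')        AS edition;
```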
In this article, I will explain how to export all tables to files using the BCP command and import them into your on-premises SQL Server database. This is a platform-independent approach because BCP can always transfer content between tables and files (even plain CSV files that can be opened in Excel), regardless of the Database Engine version.
BCP is a command-line tool that can export a table to a file or import the content of a file into a table. The following example exports the table Warehouse.Colors from the WideWorldImporters database into the Warehouse_Colors.bcp file:
bcp WideWorldImporters.Warehouse.Colors out C:\temp\bcp\Warehouse_Colors.bcp -S"sabanbajramovic-gp-01-pilot.wcus17662feb9ce98.database.windows.net" -n -U"***" -P"***"
In the following sections you will see how to leverage this tool to export and import data during migration.
Generate script for data export/import
Although you could write the BCP command above for every table in your database, it is much easier to generate the BCP commands using the following script, which runs in SQLCMD mode in SQL Server Management Studio (just replace the username/password and connection info in the variables):
:setvar path "C:\temp\bcp\"
:setvar username cloudAdmin
:setvar password MyMiPassword
:setvar target_server "MDCS-JOVANPOP10"
:setvar target_username sqlServerAdmin
:setvar target_password MySqlServerPassword
use WideWorldImporters;
select export = concat('bcp ', DB_NAME(), '.', SCHEMA_NAME(schema_id), '.', name,
                       ' out $(path)', SCHEMA_NAME(schema_id), '_', name, '.bcp ',
                       ' -S"', @@servername, '"',
                       ' -n -U"$(username)" -P"$(password)"')
from sys.tables
where type = 'U' and is_ms_shipped = 0 and temporal_type in (0,2);
select import = concat('bcp ', DB_NAME(), '.', SCHEMA_NAME(schema_id), '.', name,
                       ' in $(path)', SCHEMA_NAME(schema_id), '_', name, '.bcp ',
                       ' -S"$(target_server)', '"',
                       ' -n -U"$(target_username)" -P"$(target_password)"')
from sys.tables
where type = 'U' and is_ms_shipped = 0 and temporal_type in (0,2);
For simplicity, I'm not exporting temporal history tables, because I would need to break the history relationship between the current and history tables on the target database, BCP in the data, and then establish the history relationship again. Not too complicated, but it is a digression from the main topic of this article. I'm also excluding Microsoft-shipped objects. You can add other criteria for selecting objects (for example, export only tables from certain schemas). Once you run this script, you will get results like the ones shown in the following picture:
In the result you will get two sets of BCP commands: one that exports all tables from your database, and another that takes every exported file and imports it into your target database.
Once you copy, paste, and run all commands from the first result set, you will get all tables exported into a folder:
HINT: if you want a parallel export, you can change the script to prefix each command with start bcp, so every export runs in its own process:
start bcp WideWorldImporters.Warehouse.Colors out C:\temp\bcp\Warehouse_Colors.bcp -S"sabanbajramovic-gp-01-pilot.wcus17662feb9ce98.database.windows.net" -n -U"***" -P"***"
Preparing the database
Now you need to create and prepare the target database into which you will import the files. There are several ways to prepare the database:
- Script all objects in the database using SQL Server Management Studio and run the script on your target SQL Server instance. This approach is faster; however, you might need to slightly alter the scripts, because a script generated on Azure SQL Database might not be fully compatible. One example is the ADD FILE T-SQL command, which on Managed Instance doesn't accept a file path, while a file path is required on SQL Server. The differences are not big and you should be able to resolve them easily. Make sure that you do resolve them; due to one issue with a file in the USERDATA filegroup of the WideWorldImporters database, I was initially unable to load the files.
- Extract a Data-tier application (.dacpac file) using SSMS, SSDT, or SqlPackage, which contains the structure of your database and its objects, and then deploy the Data-tier application on the target instance. The extract/deploy process resolves all differences in syntax, so this would be my preferred approach.
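If you take the Data-tier application route, the same steps can be scripted with SqlPackage from the command line. A sketch, assuming hypothetical server names and credentials (replace them with your own): the Extract action captures the schema into a .dacpac file, and the Publish action deploys it to the target instance.

```shell
REM Extract the schema from the Azure SQL Database source into a .dacpac
SqlPackage /Action:Extract /TargetFile:C:\temp\WideWorldImporters.dacpac ^
  /SourceServerName:myserver.database.windows.net /SourceDatabaseName:WideWorldImporters ^
  /SourceUser:cloudAdmin /SourcePassword:***

REM Deploy (publish) the .dacpac to the on-premises SQL Server target
SqlPackage /Action:Publish /SourceFile:C:\temp\WideWorldImporters.dacpac ^
  /TargetServerName:MDCS-JOVANPOP10 /TargetDatabaseName:WideWorldImporters ^
  /TargetUser:sqlServerAdmin /TargetPassword:***
```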
Once you create the database with the matching structure, you need to disable all table constraints:
EXEC sp_msforeachtable "ALTER TABLE ? NOCHECK CONSTRAINT all"
The reason for this step is that the generated import commands do not follow any topological order based on foreign key relationships, where parent tables would be imported before child tables. You can always re-arrange the BCP commands in the generated list, but if you have a lot of commands this can be a tedious task. It is easier to disable all constraints and enable them again once you finish the load.
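If you prefer to keep the constraints enabled and order the imports manually instead, a simple query against sys.foreign_keys shows which tables depend on which (a sketch; load the referenced tables before the referencing ones):

```sql
-- Each row is a dependency: the referencing (child) table must be
-- loaded after the referenced (parent) table.
SELECT OBJECT_SCHEMA_NAME(fk.parent_object_id) + '.' + OBJECT_NAME(fk.parent_object_id) AS referencing_table,
       OBJECT_SCHEMA_NAME(fk.referenced_object_id) + '.' + OBJECT_NAME(fk.referenced_object_id) AS referenced_table
FROM sys.foreign_keys AS fk
ORDER BY referenced_table, referencing_table;
```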
Now you can run the second set of generated commands to import the data from the folder into the target database.
If you have successfully created target tables with the same structure, this process should be straightforward, and all your tables will be imported.
However, in some cases you might get errors caused by the QUOTED_IDENTIFIER option, which breaks the load/insert. To resolve this issue, add the -q flag to the failed BCP command and the load will succeed:
bcp WideWorldImporters.Application.People in C:\temp\bcp\Application_People.bcp -S"MDCS-JOVANPOP10" -n -U"sa" -P"***" -q
In the WideWorldImporters database, I had to add this flag for the Application.People and Sales.Invoices tables.
As a final step, you should re-enable all constraints in your target database:
EXEC sp_msforeachtable "ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all"
If you have successfully imported all tables (and fixed any issues with the failing bulk import commands), this command will execute without errors or warnings, and you will have a copy of your database on your on-premises SQL Server.
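If you want to double-check that the re-enabled constraints are actually trusted again (i.e., that WITH CHECK validated all of the imported data), you can query the catalog views:

```sql
-- Any rows returned here are constraints that were re-enabled
-- without validation and are still marked as not trusted.
SELECT OBJECT_NAME(parent_object_id) AS table_name, name AS constraint_name
FROM sys.foreign_keys
WHERE is_not_trusted = 1
UNION ALL
SELECT OBJECT_NAME(parent_object_id), name
FROM sys.check_constraints
WHERE is_not_trusted = 1;
```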
BCP is a tool that can help you migrate your data out of Azure SQL Database into SQL Server, or even into another Azure SQL Database (Singleton or Managed Instance). It is a cross-platform tool because it simply exports the data to a file and imports that file into a table with a matching schema. This is a fast way to migrate your data, and relatively easy to set up if you have these scripts.
If you are on Managed Instance, in most cases I would prefer Transactional Replication over this approach, because it uses the same mechanism to migrate data during the initial snapshot, and if you know how to set up and monitor Transactional Replication, it is the better option.
I would prefer the BCP approach over Transactional Replication in cases where you expect issues with the initial snapshot that you want to control yourself. If a table import/export fails because the connection breaks or gets stuck, it might be harder to fix in Transactional Replication; with BCP, you can export tables partially and retry only the failed ones. Also, if you have huge tables, BCP lets you export a table in parts and import those parts into the destination. Generally, the BCP approach is good if you are not familiar with Transactional Replication or you need full control over the export/import process.