TFS to Azure DevOps Services

Vinay Sandela
Jul 12, 2023


The data migration tool for Azure DevOps provides a high-fidelity way to migrate collection databases from Azure DevOps Server to Azure DevOps Services.

The workflow looks like this:

Validation > Preparation > DACPAC Generation > Importing

Prerequisites:

Download the Migration Tool and Guide from Microsoft Docs

The data migration tool for Azure DevOps supports the two latest releases of Azure DevOps Server. As of this writing, the supported versions are:

  • Azure DevOps Server 2022.0.1
  • Azure DevOps Server 2022
  • Azure DevOps Server 2020.1.2

Why Migrate?

Some of the benefits you get from migration:

  • Getting automatic updates from Microsoft
  • Optimizing financial costs and reducing operational time
  • Complying with certifications (ISO 27001, SOC 1, SOC 2, and HIPAA BAA)
  • Accessing Azure DevOps from anywhere, on any device
  • Getting a 99.9% uptime guarantee
  • Getting support directly from Microsoft

Download Migration Tool

Download the migration tool on the TFS application server and unzip it. The tool can also be downloaded and used on another server, as long as that server has connectivity to the database server.

The tool offers help and guidance on the command line:

.\Migrator.exe /help

Validate Collection

.\Migrator.exe validate /collection:<<TFS Instance>>/<<Collection Name>> /tenantDomainName:<<Domain Name>>

If you are running the validation from a server other than the application server, include the connection string parameter:

.\Migrator.exe validate /collection:<<TFS Instance>>/<<Collection Name>> /tenantDomainName:<<Domain Name>> /connectionString:"Data Source=<<SQL Server Name>>;Initial Catalog=Tfs_Configuration;Integrated Security=True"
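For illustration, a validation run against a hypothetical on-premises instance and Azure AD tenant might look like the following; the server, collection, and domain names here are placeholders, not values from this article:

.\Migrator.exe validate /collection:http://tfsserver:8080/tfs/DefaultCollection /tenantDomainName:contoso.onmicrosoft.com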

Check the logs after validation completes and fix any errors. The most common errors involve custom process templates with custom fields or states.

If the collection is huge and its largest table is more than 33 GB, a different approach is required for the import: the SQL Azure VM method (covered under Large Collections below).

If validation passes, move to the next step.

Prepare Collection

The prepare command assists with generating the required import files. Essentially, this command scans the collection to find a list of all users to populate the identity map log, IdentityMapLog.csv, and then tries to connect to Azure AD to find each identity's match.

.\Migrator.exe prepare /collection:{collection URL} /tenantDomainName:{name} /region:{region}

When the data migration tool runs the prepare command, it runs a complete validation to ensure that nothing has changed with your collection since the last full validation. Sign in with your Azure AD credentials when prompted.
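As a hedged example with placeholder values (the /region value is the Azure region code of the target region, such as CUS for Central US; check the migration guide for the current list), a prepare run might look like:

.\Migrator.exe prepare /collection:http://tfsserver:8080/tfs/DefaultCollection /tenantDomainName:contoso.onmicrosoft.com /region:CUS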

When you run the prepare command successfully in the data migration tool, the results window displays a set of logs and two import files. In the log directory, you'll find a logs folder and two files:

  • import.json is the import specification file. We recommend that you take time to fill it out. It captures all the required details about the collection to be imported: the location of the DACPAC, the type of import, what the target organization name will be, and so on.
  • IdentityMapLog.csv contains the generated mapping of Active Directory to Azure AD identities. Review it for completeness before you kick off an import; this mapping is what carries user permissions across to the right groups.

Backup TFS Collection

Stop the collection and take a full backup before proceeding with the migration. The backup lets you restore the collection if anything goes wrong.
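As a minimal T-SQL sketch, assuming the default Tfs_<Collection> database naming (the database name and backup path below are placeholders):

BACKUP DATABASE [Tfs_DefaultCollection]
TO DISK = N'I:\Backup\Tfs_DefaultCollection.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;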

Generate DACPAC

DACPACs offer a fast and relatively easy method for moving collections into Azure DevOps Services. A DACPAC file packages the collection database so that its data can be brought in during the import.

Stop the collection and detach it from TFS before generating the DACPAC. The collection size limit for the DACPAC method is 150 GB. If the collection is larger than that, follow the SQL Azure VM method.
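If you prefer the command line over the administration console, the collection can be detached with TfsConfig, run from the Tools folder of the server installation (the collection name here is a placeholder):

TfsConfig collection /detach /collectionName:"<<Collection Name>>"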

DACPAC is a feature of SQL Server that allows database changes to be packaged into a single file and deployed to other instances of SQL Server. Multiple versions of the SqlPackage.exe tool are installed with SSDT. The versions are stored in folders with names such as 120, 130, and 140. When you use SqlPackage.exe, it's important to use the right version to prepare the DACPAC.

SqlPackage.exe can be found on the SQL Server at C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130\

The command below generates the DACPAC (the collection must already be stopped and detached, as described above). Provide the SQL Server name and the collection database name, and make sure you have enough storage at the DACPAC file location.

.\SqlPackage.exe /sourceconnectionstring:"Data Source=<<SQL Server Name>>;Initial Catalog=Tfs_<<Collection Name>>;Integrated Security=True" /targetFile:"I:\DACPAC\<<Collection Name>>.dacpac" /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory /p:CommandTimeout=0

  • Data Source: The SQL Server instance that hosts your Azure DevOps Server collection database.
  • Initial Catalog: The name of the collection database.
  • targetFile: The location on the disk and the DACPAC file name.

Upload DACPAC

The DACPAC must be placed in an Azure storage container. This can be an existing container or one created specifically for your migration effort. It's important to ensure that your container is created in the right region. Create a blob container from the Azure portal. After you've created the container, upload the collection DACPAC file. You can use tools such as AzCopy (a sketch follows the steps below) or a storage explorer tool such as Azure Storage Explorer.

Azure Storage Explorer:

  • Connect to your Azure subscription.
  • It lists all the storage accounts in your subscription.
  • Upload the DACPAC to one of the blob containers. Upload time depends on the size of the DACPAC.
  • Generate a SAS token for the blob container.
  • Select Read and List as permissions, with an expiry time of more than 7 days.
  • Copy the URL and save it for use in import.json.
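If you'd rather script the upload, a minimal AzCopy (v10) sketch looks like this; the storage account, container, and SAS token are placeholders:

azcopy copy "I:\DACPAC\<<Collection Name>>.dacpac" "https://<<Storage Account>>.blob.core.windows.net/<<Container>>?<<SAS Token>>"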

Update import.json

With all the required details in hand, update the import.json that was generated by the prepare command.

Update the SAS key in the Location field.

Enter the name of the uploaded DACPAC in the Dacpac field.

Target Name: the desired Azure DevOps organization name.

Import Type: the type of import, DryRun or ProductionRun.

  • DryRun: used for test and validation purposes. Dry-run organizations have a limited existence and are automatically deleted after a set period of time.
  • ProductionRun: the production run, performed after validation on the dry run is complete. A ProductionRun can be done only once for a TFS collection.

It's always recommended that you complete a dry-run import first. With those fields filled in, the file looks roughly like the sketch below.
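This is a hedged reconstruction based on the fields described above, not a verbatim copy of the generated file; the prepare command also emits a ValidationData section that should be left untouched, and exact field names can vary by tool version:

{
  "Source": {
    "Location": "<<SAS URL for the blob container>>",
    "Files": {
      "Dacpac": "<<Collection Name>>.dacpac"
    }
  },
  "Target": {
    "Name": "<<Desired organization name>>"
  },
  "Properties": {
    "ImportType": "DryRun"
  }
}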

Import to Azure DevOps Services

You start an import by using the data migration tool’s import command. The import command takes an import specification file as input. It parses the file to ensure that the provided values are valid and, if successful, it queues an import to Azure DevOps Services.

.\Migrator.exe import /importFile:"<<Import.JSON file location>>\import.json"

Once the migration starts, you get an email confirming that the import process has started; the import then runs in stages.

Large Collections:

For larger collections, configure an Azure SQL VM with all the required configuration.

Restore the full backup of the TFS collection on this Azure VM, then prepare the database and a SQL login for the import:

-- Switch the restored database to the simple recovery model
ALTER DATABASE [<Database name>] SET RECOVERY SIMPLE;
USE [<Database name>]
-- Create a SQL login and user for the import to connect with
CREATE LOGIN <pick a username> WITH PASSWORD = '<pick a password>'
CREATE USER <username> FOR LOGIN <username> WITH DEFAULT_SCHEMA=[dbo]
-- Grant the TFSEXECROLE role required by the import
EXEC sp_addrolemember @rolename='TFSEXECROLE', @membername='<username>'

Copy the folder produced by the prepare command to this Azure SQL VM.

Modify import.json: remove the DACPAC parameter and update the connection string.

"Properties":
{
"ConnectionString": "Data Source={SQL Azure VM Public IP};Initial Catalog={Database Name};Integrated Security=False;User ID={SQL Login Username};Password={SQL Login Password};Encrypt=True;TrustServerCertificate=True"
}

Proceed with the import command:

.\Migrator.exe import /importFile:"<<Import.JSON file location>>\import.json"

Sources:

https://learn.microsoft.com/en-us/azure/devops/migrate/migration-import?view=azure-devops#validate-a-collection
