Terraform and Azure DevOps

DataFairy
3 min read · Aug 13, 2023


How to deploy your Terraform infrastructure code with Azure DevOps tasks using a service principal.

Terraform

Terraform is an open-source infrastructure as code software tool that provides a consistent CLI workflow to manage hundreds of cloud services. Terraform codifies cloud APIs into declarative configuration files.
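
To make that concrete, here is a minimal, hypothetical Terraform configuration for Azure using the azurerm provider. It declares a single resource group; the names and location are placeholders, not values used later in this article.

# Minimal, hypothetical Terraform configuration for the azurerm provider.
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

# Authentication is handled outside the code (for example by the Azure DevOps
# task shown later), so no credentials are hard-coded here.
provider "azurerm" {
  features {}
}

# One declaratively defined resource: an Azure resource group.
resource "azurerm_resource_group" "example" {
  name     = "rg-terraform-demo"
  location = "westeurope"
}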

Azure DevOps

Azure DevOps supports a collaborative culture and set of processes that bring together developers, project managers, and contributors to develop software. It allows organizations to create and improve products at a faster pace than they can with traditional software development approaches.

Azure DevOps combines several features, including DevOps and Scrum tools.

Prerequisites:

  • Azure DevOps project
  • Service principal/connection with the correct authorization
  • Storage account
  • File path for terraform.tfstate
  • Terraform configuration files

When you don’t have the proper authorization to deploy resources yourself, you might want to take advantage of a service connection based on a service principal. You can create it yourself (if you have the rights to do so) or have the IT team provide it for your project. With this service connection you can assume another identity that has elevated rights in Azure and is authorized to deploy resources.

To use this service principal in Azure DevOps you need to create a service connection and reference it in an Azure DevOps pipeline.

Once this is set up, you can use the following code to run the Terraform workflow: init, plan and apply, with an optional destroy step at the end.

In this case the Terraform state file is stored in a storage account that you have access to and that is shared within your project. This way your entire team can use the same state file for future deployments.
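
On the Terraform side this only needs the azurerm backend to be declared; a minimal sketch, assuming the concrete values (resource group, storage account, container and state file key) are passed in by the init task of the pipeline below:

# Remote state in an Azure storage account.
# The block can stay empty (a partial backend configuration) because the
# pipeline's init task supplies the resource group, storage account,
# container and state file key.
terraform {
  backend "azurerm" {}
}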

In the following example the Terraform configuration files are located in a directory called Terraform.

Here is the code:

# Terraform deployment example
# Parameters are Service Connection, Resource Group, Storage Account, Storage Account Container, Terraform State File


trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:

# Equivalent to terraform init, but with the azurerm backend configured
- task: TerraformTaskV2@2
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform'
    backendServiceArm: 'Service-Principal-Based-ServiceConnection'
    backendAzureRmResourceGroupName: 'infra-resource-group'
    backendAzureRmStorageAccountName: 'infrastorageaccount'
    backendAzureRmContainerName: 'terraform'
    backendAzureRmKey: 'tf/terraform.tfstate'

# Equivalent to terraform plan -out=main.tfplan
- task: TerraformTaskV2@2
  inputs:
    provider: 'azurerm'
    command: 'plan'
    workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform'
    commandOptions: '-out=main.tfplan -var="subscriptionId=$(subscriptionId)" -var="service_principal_id=$(service_principal_id)"'
    environmentServiceNameAzureRM: 'Service-Principal-Based-ServiceConnection'

# Equivalent to terraform apply
- task: TerraformTaskV2@2
  inputs:
    provider: 'azurerm'
    command: 'apply'
    workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform'
    commandOptions: '-var="subscriptionId=$(subscriptionId)" -var="service_principal_id=$(service_principal_id)"'
    environmentServiceNameAzureRM: 'Service-Principal-Based-ServiceConnection'

# Equivalent to terraform apply -destroy.
# You might not want to run this last step right away.
- task: TerraformTaskV2@2
  inputs:
    provider: 'azurerm'
    command: 'destroy'
    workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform'
    commandOptions: '-var="subscriptionId=$(subscriptionId)" -var="service_principal_id=$(service_principal_id)"'
    environmentServiceNameAzureRM: 'Service-Principal-Based-ServiceConnection'
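
The -var flags in the plan, apply and destroy steps assume that matching input variables are declared in the Terraform code; a minimal sketch of what that could look like (the descriptions are illustrative):

# Input variables populated by the pipeline via -var flags.
variable "subscriptionId" {
  type        = string
  description = "Azure subscription to deploy into."
}

variable "service_principal_id" {
  type        = string
  description = "ID of the service principal used for the deployment."
}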

The pipeline above can serve as the base for your Azure DevOps deployments. You could trigger it whenever there is an update to your infrastructure code.

GitHub: datafairy-azure/terraform: Terraform scripts to work with Azure (github.com)

If you found this article useful, please follow me.


DataFairy

Senior Data Engineer, Azure Warrior, PhD in Theoretical Physics, The Netherlands. I write about Data Engineering, Machine Learning and DevOps on Azure.