Terraform Plan With Azure DevOps YAML Pipelines — Part 2

Matt Mencel
Jan 7, 2019


[EDIT (2/13/2019): Since originally publishing this article I have changed my approach slightly. Instead of depending on the third-party plugin to run Terraform, I am implementing the commands myself. I’ve found I have better success that way.]


Terraform Plan YAML pipeline steps in Azure DevOps

In Part 1 I covered the basics of the YAML schema and walked through the azure-pipelines.yml used in the Terraform directory. Now I’ll introduce the template file that does the actual Terraform Plan.

I’m following the DRY (Don’t Repeat Yourself) principle by creating YAML pipeline template files in a central repository, which can then be included in any other YAML pipeline by using the resources attribute. This central repo is where the Terraform Plan template lives.
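As a sketch, here’s roughly what referencing a shared template from a central repo looks like (the repository and file names below are placeholders, not my actual ones):

resources:
  repositories:
  - repository: templates            # alias used when referencing template files
    type: git
    name: MyProject/yaml-templates   # central repo that holds the shared templates

steps:
- template: terraform-plan.yml@templates   # include the Terraform Plan template from that repo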

Third-Party Plugins

If you use third-party plugins in your YAML pipelines, you’ll likely first have to create and configure an example pipeline using that plugin in the UI and use the View YAML link to find the syntax for using that plugin in your YAML pipeline.

Terraform Plan Template

The terraform plan template is fairly short. I’ll break down each of its tasks below.

Terraform Tool Installer

- task: JamiePhillips.Terraform.TerraformTool.TerraformTool@0
  displayName: 'Use Terraform $(terraform.version)'
  inputs:
    version: '$(terraform.version)'

Terraform Tool Installer is a third-party Azure DevOps plugin that ensures the desired version of Terraform is installed in the tool cache on the agent. The $(terraform.version) variable is declared in the AzDO Variable Group so that a consistent version of Terraform runs across all builds.
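For reference, pulling a Variable Group into the pipeline is a one-liner in the YAML (the group name here is a placeholder):

variables:
- group: terraform-build   # AzDO Variable Group holding terraform.version and friends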

AZ Login and Set ACCESS_KEY

- script: |
    az login --service-principal -u $(SPN-ID) -p $(SPN-SECRET) --tenant $(TENANT-ID)
    ACCESS_KEY=`az storage account keys list -n $(STORAGE_ACCT) -o json | jq -r '.[0].value'`
    echo "##vso[task.setvariable variable=ACCESS_KEY]$ACCESS_KEY"
  displayName: 'AZ Login and Set ACCESS_KEY'

This script task does an az login using the service principal, grabs the primary access key from the storage account that contains the state file, and then uses the ##vso[task.setvariable] logging command to turn that key into an AzDO variable we can use in later tasks. The variables used in this task are stored in an Azure Key Vault-backed Variable Group.

Create tfvars File

- script: |
    cat << EOT >> terraform.tfvars
    access_key = "$(ACCESS_KEY)"
    tenant_id = "$(TENANT-ID)"
    subscription_id = "$(SUBSCRIPTION-ID)"
    client_id = "$(SPN-ID)"
    client_secret = "$(SPN-SECRET)"
    EOT
  workingDirectory: '$(terraform.path)'
  displayName: 'Create terraform.tfvars'

This script task generates the tfvars file containing the secret variables Terraform needs. The variables used in this task are stored in an Azure Key Vault-backed Variable Group.

Terraform Validate

- script: |
    terraform validate -check-variables=false
  workingDirectory: '$(terraform.path)'
  displayName: 'Terraform Validate'

A script task that runs terraform validate to check for errors in the Terraform code.

Git URL Redirect for Modules

- script: |
    git config --global --list | grep url. | awk -F '.instead' '{print $1}' | while read line
    do
      echo $line
      git config --global --remove-section $line
    done
    git config --global url."https://ado:$(System.AccessToken)@foo".insteadOf https://foo
    git init
  displayName: 'Redirect foo git URLs to use the Access Token So Modules Can Be Pulled In'

Pipelines cannot source Terraform modules from other repositories in Azure DevOps without running into permission problems. To get around this, I take advantage of the git url.<base>.insteadOf feature to rewrite the git URLs with the $(System.AccessToken) variable. The loop at the top clears out any url sections left over from a previous run before the new rule is added.
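If you want to confirm the rewrite rule actually took effect before terraform init runs, a quick sanity-check step you could add is to list the active url sections in the global git config:

- script: |
    # List any url.<base>.insteadOf rules currently in the global git config
    git config --global --get-regexp '^url\.' || true
  displayName: 'Show Active git URL Rewrite Rules'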

Terraform Plan

- script: |
    terraform init \
      -backend-config=resource_group_name=$(RES_GRP) \
      -backend-config=storage_account_name=$(STORAGE_ACCT) \
      -backend-config=container_name=tf-statefiles \
      -backend-config=key=$(state.key) \
      -backend-config=access_key=$(ACCESS_KEY) \
      -no-color -input=false
    terraform plan -out=tfplan -no-color -input=false
  displayName: 'Terraform Plan'
  workingDirectory: '$(terraform.path)'

[EDIT: Rather than use the third party plugin for doing terraform plan, I’ve recently switched to running these tasks natively and have updated this article accordingly.]

In this task you’ll see two commands: terraform init with all the necessary backend parameters, and terraform plan, which writes the plan out to the tfplan file.

Remove git URL Redirect

- script: |
    git config --global --remove-section url.https://ado:$(System.AccessToken)@foo
  displayName: 'Remove git URL Redirect'
  continueOnError: true
  condition: always()

This final task removes the git URL redirect created earlier. Our custom agents could be used for other build or release tasks that may not want to have the git URL redirect in place, so it’s removed here after the plan task has finished.

By setting condition: always(), we’re telling the pipeline that this task should always run, even if the job is cancelled or a previous task fails, and continueOnError: true keeps a failure here from failing the whole job.

Publish Terraform Artifact Template

This is the other template I want to address in this post.

Artifact Publishing Delays

One thing I’ve discovered is that publishing the Terraform artifact can take a really long time. Once you start taking advantage of modules, your .terraform directory grows and can eventually contain thousands of files (one of mine has over 11K) that need to be uploaded in the artifact publish step. I think AzDO publishes metadata along with the files, and when it has thousands of small files to process it takes a long time: 20 minutes or more in some cases.

To combat this, you can compress your Terraform directory into a single file, upload that, and then in the release step simply uncompress the archive before terraform apply runs.
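On the release side, that unpacking step can be as simple as the built-in ExtractFiles task. A minimal sketch (the paths here are illustrative, not my actual ones):

- task: ExtractFiles@1
  displayName: 'Extract Terraform Artifact'
  inputs:
    # The published tar.gz lands under the agent's artifact download directory
    archiveFilePatterns: '$(System.DefaultWorkingDirectory)/**/*.tar.gz'
    destinationFolder: '$(System.DefaultWorkingDirectory)/terraform'
    cleanDestinationFolder: true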

Here’s the template I have for the build-side compress and publish.

The compress step creates a single tar.gz file in the default build agent directory from the terraform.path directory, naming it with the state.key variable.

This tar.gz file is then published as the artifact. My example above with 11K files, which originally took over 20 minutes to publish, now takes no more than 15 seconds!
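For reference, a minimal sketch of such a compress-and-publish template, assuming the built-in ArchiveFiles and PublishBuildArtifacts tasks (the artifact name is a placeholder):

steps:
- task: ArchiveFiles@2
  displayName: 'Compress Terraform Directory'
  inputs:
    rootFolderOrFile: '$(terraform.path)'    # the Terraform directory to archive
    includeRootFolder: false
    archiveType: 'tar'
    tarCompression: 'gz'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(state.key).tar.gz'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Terraform Artifact'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/$(state.key).tar.gz'
    ArtifactName: 'terraform'                # placeholder artifact name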

Conclusion

That’s it. The remaining steps are handled in the main azure-pipelines.yml I described in Part 1, or in one of the other templates, which I’ll describe in future posts.

Links

You can go back to Part 1 here, where I describe some of the YAML schema being used in more detail.

Matt Mencel

Cloud Automation Engineer @10thMagnitude. My views are my own.