How to pass variables in Azure Pipelines YAML tasks

Passing variables between steps, jobs, and stages: explained

Alessandro Segala
Aug 5, 2019 · 5 min read

This is a quick reference on passing variables between multiple tasks in Azure Pipelines, a popular CI/CD platform. Azure Pipelines recently gained support for multi-stage pipelines defined in YAML documents, allowing the creation of both build and release (CI and CD) pipelines in a single azure-pipelines.yaml file. This is very powerful: it lets developers define pipelines that continuously build and deploy their apps using a declarative syntax, with the YAML document stored and versioned in the same repo as the code.

One recurrent question is: how do you pass variables between tasks? While passing variables from one step to another within the same job is relatively easy, sharing state and variables with tasks in other jobs or even other stages isn’t as straightforward.

The examples below are about using multi-stage pipelines within YAML documents. I’ll focus on pipelines running on Linux, and all examples show bash scripts. The same concepts would apply to developers working with PowerShell or Batch scripts, although the syntax of the commands will be slightly different. The work below is based on the official documentation, adding some examples and explaining how to pass variables between stages.
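As an illustration of how little the syntax changes, here is a hedged sketch of the same logging command printed from both bash and PowerShell Core (using the pwsh step shortcut); the logging command itself is identical, only the print command differs:

```yaml
steps:
# Bash: print the logging command to STDOUT
- bash: echo "##vso[task.setvariable variable=FOO]some value"

# PowerShell Core: same logging command, printed with Write-Host
- pwsh: Write-Host "##vso[task.setvariable variable=FOO]some value"
```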

Passing variables between tasks in the same job

This is the easiest one. In a script task, you print a specially-formatted string (a "logging command") to STDOUT, which Azure Pipelines captures and uses to set the variable.

For example, to pass the variable FOO between scripts:

  1. Set the value with the command echo "##vso[task.setvariable variable=FOO]some value"
  2. In the following tasks, read the variable with the macro syntax $(FOO), which Azure Pipelines expands before running the script, or as the environment variable $FOO within bash.

Full pipeline example:

steps:

# Sets FOO to be "some value" in this script and the next ones
- bash: |
    FOO="some value"
    echo "##vso[task.setvariable variable=FOO]$FOO"

# Using the $() syntax, the value is replaced inside Azure Pipelines before being submitted to the script task
- bash: |
    echo "$(FOO)"

# The same variable is also present as an environment variable in scripts; here the variable expansion happens within bash
- bash: |
    echo "$FOO"

You can also use the $(FOO) syntax inside task definitions. For example, these steps copy files to a folder whose name is defined as variable:

pool:
  vmImage: 'Ubuntu-16.04'

steps:
- bash: |
    echo "##vso[task.setvariable variable=TARGET_FOLDER]$(Pipeline.Workspace)/target"

- task: CopyFiles@2
  inputs:
    sourceFolder: $(Build.SourcesDirectory)
    # Note the use of the variable TARGET_FOLDER
    targetFolder: $(TARGET_FOLDER)/myfolder

Wondering why the vso label? That's a legacy identifier from when Azure Pipelines used to be part of Visual Studio Online, before being rebranded Visual Studio Team Services, and finally Azure DevOps!

Passing variables between jobs

Passing variables between jobs in the same stage is a bit more complex, as it requires working with output variables.

Similarly to the example above, to pass the FOO variable:

  1. Make sure you give a name to the job, for example job: firstjob
  2. Give a name to the step that sets the variable too, for example name: mystep, and mark the variable as an output: echo "##vso[task.setvariable variable=FOO;isOutput=true]$FOO"
  3. In the second job, declare the dependency on the first one with dependsOn: firstjob, then map the output into a local variable: FOO: $[ dependencies.firstjob.outputs['mystep.FOO'] ]

A full example:

jobs:

- job: firstjob
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:

  # Sets FOO to "some value", then marks it as an output variable
  - bash: |
      FOO="some value"
      echo "##vso[task.setvariable variable=FOO;isOutput=true]$FOO"
    name: mystep

  # Show the output variable in the same job
  - bash: |
      echo "$(mystep.FOO)"

- job: secondjob
  # Need to explicitly mark the dependency
  dependsOn: firstjob
  variables:
    # Define the variable FOO from the previous job
    # Note the use of single quotes!
    FOO: $[ dependencies.firstjob.outputs['mystep.FOO'] ]
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:

  # The variable is now available for expansion within the job
  - bash: |
      echo "$(FOO)"

  # To send the variable to the script as an environment variable, it needs to be set in the env dictionary
  - bash: |
      echo "$FOO"
    env:
      FOO: $(FOO)

Passing variables between stages

At this time, it’s not possible to pass variables between different stages. There is, however, a workaround that involves writing the variable to disk and then passing it as a file, leveraging pipeline artifacts.

To pass the variable FOO from a job to another one in a different stage:

  1. Create a folder that will contain all the variables you want to pass; any folder could work, but something like mkdir -p $(Pipeline.Workspace)/variables might be a good idea.
  2. Write each variable into its own file in that folder, for example echo "$FOO" > $(Pipeline.Workspace)/variables/FOO, then publish the folder as a pipeline artifact.
  3. In the job of the following stage, download the artifact, read the file back, and expose its content with the task.setvariable logging command.

Example:

stages:

- stage: firststage
  jobs:

  - job: firstjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:

    # To pass the variable FOO, write it to a file
    # While the file name doesn't matter, naming it like the variable and putting it inside the $(Pipeline.Workspace)/variables folder could be a good pattern
    - bash: |
        FOO="some value"
        mkdir -p $(Pipeline.Workspace)/variables
        echo "$FOO" > $(Pipeline.Workspace)/variables/FOO

    # Publish the folder as a pipeline artifact
    - publish: $(Pipeline.Workspace)/variables
      artifact: variables

- stage: secondstage
  jobs:

  - job: secondjob
    pool:
      vmImage: 'Ubuntu-16.04'
    steps:

    # Download the artifacts
    - download: current
      artifact: variables

    # Read the variable from the file, then expose it in the job
    - bash: |
        FOO=$(cat $(Pipeline.Workspace)/variables/FOO)
        echo "##vso[task.setvariable variable=FOO]$FOO"

    # Just like in the first example, we can expand the variable within Azure Pipelines itself
    - bash: |
        echo "$(FOO)"

    # Or we can expand it within bash, reading it as an environment variable
    - bash: |
        echo "$FOO"

When the pipeline runs, note how in the second stage both bash scripts print some value. The difference is in the command being executed: in the first case, the variable was expanded inside Azure Pipelines (so the script became echo "some value"), while in the second one bash reads an environment variable (the script remains echo "$FOO").

If you want to pass more than one variable, you can create multiple files within the $(Pipeline.Workspace)/variables folder (e.g. for a variable named MYVAR, write it inside $(Pipeline.Workspace)/variables/MYVAR), then read all the variables back in the second stage.
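If you follow this pattern, the read step in the second stage can loop over every file in the folder instead of reading them one by one, using each file name as the variable name and its content as the value. This is a sketch of the bash logic only, run against a local temporary folder as a stand-in for $(Pipeline.Workspace)/variables (the variable names and values here are made up for illustration):

```shell
#!/bin/sh
set -eu

# Stand-in for $(Pipeline.Workspace)/variables: a folder with one file per variable
VARS_DIR=$(mktemp -d)
echo "some value" > "$VARS_DIR/FOO"
echo "another value" > "$VARS_DIR/MYVAR"

# Emit one task.setvariable logging command per file
output=$(for f in "$VARS_DIR"/*; do
  printf '##vso[task.setvariable variable=%s]%s\n' "$(basename "$f")" "$(cat "$f")"
done)
echo "$output"
```

Inside a real pipeline step, the shell command substitutions like $(basename "$f") are left untouched by Azure Pipelines, since macro syntax only replaces references to defined pipeline variables.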

Originally published at https://withblue.ink on August 5, 2019.
