Local Build Agents for Azure DevOps

Colin Domoney
May 13, 2019 · 7 min read

Why you would even want to do this, and the things that will trip you up

A primer in Azure DevOps build agent technology

Azure DevOps Pipelines is a cloud-hosted service that allows CI (continuous integration) and CD (continuous deployment) tasks to be executed in a coordinated fashion (described, for example, in YAML configuration) when triggered by events in the lifecycle of the application (for example, a successful merge request). The heavy lifting of executing the pipeline tasks, such as compilation and packaging, is performed by a build agent (also known as a runner).
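
For orientation, a minimal pipeline definition looks something like the sketch below; the trigger, agent image and build commands are purely illustrative and assume a simple Node project:

    # azure-pipelines.yml (illustrative sketch)
    trigger:
      - master                     # run the pipeline on every push to master

    pool:
      vmImage: 'ubuntu-16.04'      # a Microsoft-hosted agent from the built-in pool

    steps:
      - script: npm install        # the build agent does the heavy lifting:
        displayName: 'Install dependencies'
      - script: npm run build      # compiling, packaging, etc.
        displayName: 'Build'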

Since executing a build agent task consumes compute cycles, there is a hosting cost associated with build agent activity, and this is why Microsoft restricts concurrent execution: on the free tier of Azure DevOps, only one build agent task may execute at a time.

Internally on Azure DevOps, the build agents are decoupled from the core Azure DevOps fabric by an (undocumented) API. This allows build agents to be hosted on Azure DevOps itself (the built-in Hosted agent pools), on other cloud compute resources, or indeed on local compute such as a VM or a bare-metal machine.

Benefits of a local build agent

The immediate question is why you would want to run your own build agent rather than simply pay for additional concurrent Microsoft-hosted Azure DevOps agents. The first consideration is cost: you may be able to provide compute at a lower cost than that bundled with Azure DevOps, and simply want to use Azure DevOps Pipelines as the orchestrator while supplying the compute power yourself. That's perfectly fine; all you have to do is purchase the additional parallel-job (agent concurrency) capacity via Azure DevOps.

The other two reasons relate to performance, and they are why I initially investigated the local build agent capability.

  • Queuing time: I frequently received a message about waiting for agent pool allocation before a build would commence, despite having no other tasks executing. This is a simple consequence of using a pooled resource without a guaranteed allocation of agents. Although the delays were never lengthy (a minute or two at most), they were an annoyance when trying to achieve rapid turnarounds on debug cycles (my YAML sometimes lets me down).
  • (Lack of) caching: A key tenet of building robust pipelines is zero assumptions about the pipeline environment (the so-called hermetic pipeline); all prerequisites and tools should be installed as part of the pipeline setup process. This is perfectly understandable, since on Azure DevOps Pipelines a build agent is provisioned on demand for a job and is not dedicated to a particular pipeline, project or customer. However, it results in a large overhead of seemingly repetitive installation of tools. Having a dedicated local agent allows tools to be cached, since the agent context persists between invocations (the tool cache is stored in the local agent working folder). For large installations (such as those needed for a kernel compilation) such a cache can result in significant time savings.

These two advantages, although seemingly minor, can produce significantly faster turnarounds on iterative builds; they are covered in more detail in the FAQ of the Azure DevOps build agent documentation.

Installing the local build agent

The installation of the build agent is very well documented and proved to be trivial on both Ubuntu 18.04 LTS and macOS Mojave 10.14.4. The process is documented here for Ubuntu and here for macOS. The only minor issues I had were getting the correct folder ownership for the agent folders and adding the agent user to the docker group to allow the use of Docker.

I assigned the agents to the ‘Default’ pool (I could perhaps have created a custom pool), and on the ‘Agent Pools’ tab I was immediately able to see the installed agent, online and ready for scheduling.
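
Directing a pipeline at these agents is then simply a matter of naming the pool in the YAML rather than requesting a Microsoft-hosted image; a minimal sketch, assuming the ‘Default’ pool used here:

    # Target the self-hosted agents instead of a Microsoft-hosted image
    pool:
      name: Default                # the pool the local agent was registered into

    steps:
      - script: echo "Running on my own hardware"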

On the Ubuntu host, I installed the agent as a service to ensure that it would automatically start on reboot; on my Mac I simply run the agent manually on demand.

Installing custom tools on your agent

Whilst the installation of the agent was a simple process, the next step proved particularly troublesome: installing toolchains onto the agents. By default, a fresh install of an agent contains no tools at all and can only offer those of the host operating system. Microsoft has sought to address this problem with a feature called ‘tool installers’, which allows various tool dependencies to be installed on demand from within a pipeline task; these are documented here and available on GitHub here.

The tool installer is simple to use, as shown below; here a requirement for Node 11.13 is explicitly inserted as the first step in the pipeline.

Installing a specific Node
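
In YAML the step amounts to something like the following sketch; the NodeTool task takes a versionSpec input, and the exact spec string here is illustrative:

    steps:
      - task: NodeTool@0
        inputs:
          versionSpec: '11.13.x'   # install (or reuse from the tool cache) a matching Node
      - script: node --version     # subsequent steps now see the requested Node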

As soon as the pipeline is invoked, a local tool cache is created in a subfolder on the agent named _work/_tool. Examining the folder shows the newly installed Node folder.

On subsequent executions, the tool installer will use the cached tool without re-installing it.

Challenges with Python in particular

Although the tool installer capability of Azure DevOps Pipelines allowed me to install an appropriate version of Node easily enough, the same was not true when I came to install Python. I specified a requirement for Python 3.6.5 as shown below; however, it was never installed in the way that Node was earlier.

Installing a specific Python
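
The step itself looks much like the Node one, using the UsePythonVersion task; a sketch of the form I used:

    steps:
      # Unlike the Node installer, this task selects a matching Python from the agent's
      # tool cache; on a self-hosted agent it will not download Python for you, which is
      # the crux of the problem described below.
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '3.6.5'
          addToPath: true          # put the selected interpreter on PATH for later steps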

It appeared that the tool installer behaves differently for Python than for Node, which was somewhat frustrating. A Google search revealed that I was not the only person to have struggled with this.

I decided to install Python from source directly into the _tool folder using the instructions here; the important detail is to specify the --prefix option to avoid overwriting the machine installation! The installation resulted in the layout within the _work/_tool folder shown below:

Tool folder initial install
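
For reference, the build commands amount to roughly the following, shown wrapped in a one-off pipeline script step for illustration (I ran the equivalent directly in a shell on the agent host; the version and the tool-cache paths are the ones described in this article):

    steps:
      - script: |
          # Build CPython into the agent's tool cache rather than over the system Python.
          curl -LO https://www.python.org/ftp/python/3.6.5/Python-3.6.5.tgz
          tar -xzf Python-3.6.5.tgz
          cd Python-3.6.5
          ./configure --prefix="$(Agent.ToolsDirectory)/Python/3.6.5/x64"
          make
          make install
        displayName: 'One-off: build Python 3.6.5 into the agent tool cache'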

I tested the installation from the install folder on the agent and confirmed that version 3.6.5 was invoked as expected. However, when I attempted to use that version in a pipeline (by running python -V), I was disappointed to find that the Python invoked was the system Python 2.7, which was certainly not the desired outcome!

Some searching and debugging revealed another missing trick: it is necessary to create a ‘marker’ folder called ‘x64.complete’ in the root of the Python folder, as shown below:

Tool folder with the marker folder

After creating the directory, I could confirm that python3 -V reported version 3.6.5; however, python -V still reported 2.7. The solution was simple: in the bin folder, create a symlink from python to the python3 executable, as shown:

Symlink to python3
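
Taken together, the marker folder and the symlink amount to the commands below, shown as a one-off script step for convenience (I created them by hand in a shell on the agent host; the paths assume the tool-cache layout described above):

    steps:
      - script: |
          # Mark the cached Python as complete so the tool installer will pick it up,
          # then make plain 'python' resolve to the freshly built python3.
          mkdir -p "$(Agent.ToolsDirectory)/Python/3.6.5/x64.complete"
          ln -sf python3 "$(Agent.ToolsDirectory)/Python/3.6.5/x64/bin/python"
        displayName: 'One-off: add tool-cache marker and python symlink'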

Finally, testing this within a pipeline as shown below reported version 3.6.5 as expected. Job done!

Testing the Python version
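
The verification is simply the UsePythonVersion step followed by a version check, roughly:

    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '3.6.5'     # now resolves to the hand-built copy in the tool cache
      - script: python -V          # reports Python 3.6.5 rather than the system 2.7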

One footnote on building Python from source: on macOS Mojave I was unable to get a successful build despite much hacking, tweaking and Googling. I gave up building from source and simply ‘cloned’ my Python folder from my Homebrew Cellar location. Although crude, this worked perfectly.

The further possibilities of running locally

One unintended consequence of running an agent locally is that, given sufficient permissions, the build agent can access local system resources. In my case, I wanted to use a USB serial-port programmer to re-flash a NodeMCU image onto an ESP8266 board, and I was able to use a Bash script within a pipeline task to drive the programmer directly and perform the operation. Quite a neat little abuse of a so-called cloud-hosted solution!
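
As a sketch of what such a step can look like: the flashing tool, serial device, offset and firmware image below are illustrative, and the agent user also needs permission on the serial device (for example via membership of the dialout group):

    steps:
      - script: |
          # Flash a NodeMCU firmware image over the locally attached USB programmer.
          esptool.py --port /dev/ttyUSB0 --baud 115200 write_flash 0x00000 nodemcu-firmware.bin
        displayName: 'Flash ESP8266 over the local USB programmer'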

What next for build agents?

This article has hopefully inspired you to go and set up your own local build agents so you can enjoy the benefits of low-latency builds (minimal queuing) and cached environments (no unnecessary re-installation of tools). There are some obstacles to overcome to get a fully working solution, but once you know how, it's a simple process.

Microsoft is in the process of further simplifying the task of installing build agents by publicly releasing the container images that it hosts on the Azure platform, so rather than installing an agent on bare metal it is possible to run one as a container. Further information can be found on Docker Hub and on GitHub.

Learn even more

If you’d like to learn more about Azure DevOps, and specifically the security aspects thereof, please be sure to view my recent Veracode webinar on the topic of “Securing the Sugar out of Azure DevOps”.

The other top resource for finding out more about all things Azure is the fantastic Cloud Advocates team; they're spread across the globe and cover all aspects of getting going with Azure.


Colin Domoney

DevSecOps ‘whisperer’, currently helping organisations undergo their transformation to DevOps in a secure manner.