Repeatable Builds using YAML in Azure Pipelines

Christopher Woolum
Dec 3, 2018 · 4 min read

I’ve been using VSTS as my main ALM tool for years and I’ve generally been happy with the way it has been laid out. It’s pretty easy to drop tasks into the editor and configure them, and service connections make it easy to connect to external resources.

Where I really felt the tooling was lacking was when we started developing microservices. We were creating 2–3 new build definitions a week, and when we tested them, there were a lot of missing pieces that took time to debug.

Anyone in DevOps will tell you that repeatability is king, and our process was nothing like that. You might be thinking, “Hey Chris, just use task groups!” Well, we did try that, but in my opinion they are too loosely defined to work for a large number of builds, and the UI doesn’t provide much support for managing or deprecating old versions of them.

In the last few years I have also worked a bit with Travis CI and Jenkins, so I was already familiar with the configuration-as-code approach, and I was extremely excited when Azure Pipelines announced that it would support YAML.

For my personal projects, I have two different types of builds: core libraries, and APIs in the form of Docker containers. They could probably be consolidated, but this is a first pass and I was just trying to get a feel for how these builds work.

I’m only going to go over my Docker build in this article because I expect all my builds will end up running in a similar fashion. Let me start by pasting my common template first, and then I’ll go into each of its parts.
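
What follows is a representative sketch of that template rather than the exact file; the variable names, the docker-build.yml file name, and the k8s folder are placeholders I’m using for illustration.

```yaml
# docker-build.yml - shared steps template (file, variable, and folder names are placeholders)
parameters:
  registry: ''    # e.g. myregistry.azurecr.io
  services: []    # each item has a tagName, version, and dockerfileName

steps:
  # Log in to the Azure Container Registry once so the docker commands below can push to it
  - bash: |
      docker login ${{ parameters.registry }} -u $(acrUsername) -p "$ACR_PASSWORD"
    displayName: Login to ACR
    env:
      ACR_PASSWORD: $(acrPassword)    # secret variables have to be mapped into scripts explicitly

  # Build, tag, and push an image for every service the caller passes in
  - ${{ each service in parameters.services }}:
    - bash: |
        # Pull the previously published image so its layers can seed the build cache
        docker pull ${{ parameters.registry }}/${{ service.tagName }}:latest || true
        docker build \
          --cache-from ${{ parameters.registry }}/${{ service.tagName }}:latest \
          -f ${{ service.dockerfileName }} \
          -t ${{ parameters.registry }}/${{ service.tagName }}:${{ service.version }} \
          .
        docker push ${{ parameters.registry }}/${{ service.tagName }}:${{ service.version }}
      displayName: Build, tag, and push image (${{ service.tagName }})

  # Publish the Kubernetes templates so the release pipeline can deploy with them later
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: 'k8s'
      ArtifactName: 'kubernetes-templates'
```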

As you can see in the template, most of the work is actually done by a simple bash task using the Docker CLI. This is exactly what the documentation recommends, and I really like this approach because when you need help from the community, you’re using the same approach as people on other DevOps tools, so it’s easier to find solutions.

This common build does a few things. The only parameter is a collection of services, which are the things that need to be built. Each object it iterates through has three properties: the tagName, version, and dockerfileName. The build first authenticates with the Azure Container Registry (ACR) so that all of the following docker commands will work. After login occurs, each service is built and pushed to the ACR. Finally, all of the Kubernetes templates are saved as artifacts to be used in my release later. The additional docker pull commands are for caching previously built images from the ACR; if you want more info on this, check out this article by Andrew Locke. We had previously used Docker Compose to manage building all of our images in the solution, but it seemed silly to keep that dependency around just for that reason.

I have a separate repository that is used to store the common templates. You can tag them, branch them, etc. so that you have very granular control over how your templates are rolled out.

Now, let’s take a look at the project-specific implementation of this build.
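
The build definition that lives in the service’s own repository then looks roughly like this (the repository, registry, and image names below are made up for the example):

```yaml
# azure-pipelines.yml in the service's repository (names are placeholders)
resources:
  repositories:
    - repository: templates
      type: git
      name: MyProject/build-templates    # the shared templates repo
      ref: refs/tags/v1.0                # pin to a tagged release of the templates

jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
      - template: docker-build.yml@templates
        parameters:
          registry: 'myregistry.azurecr.io'
          services:
            - tagName: 'orders-api'
              version: '1.0.$(Build.BuildId)'
              dockerfileName: 'Orders.Api/Dockerfile'
            - tagName: 'payments-api'
              version: '1.0.$(Build.BuildId)'
              dockerfileName: 'Payments.Api/Dockerfile'
```

Because the ref points at a tag of the templates repository, each project decides for itself when it moves to a newer version of the common build.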

Configuration here is super simple. At the top, we reference the repository where the common build templates are located. In our jobs, we reference the template by name and pass in our collection of services.

The best part about this is that as the solution grows, developers can update the build to cover more projects while the core pieces defined by DevOps are still maintained. This takes a lot of load off of DevOps, and they can focus on larger initiatives instead of making small changes to build configuration as each project grows. Say that an important new feature like code security scanning needs to occur during every build; now this change can be rolled out to the common build template with absolutely no change to the individual build definitions.
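
As a sketch, adding one step at the end of the shared docker-build.yml is all it takes; every pipeline that consumes the template picks it up on its next run (the scan script below is just a stand-in for whatever scanner you pick):

```yaml
  # Appended to the steps in docker-build.yml; every pipeline using the template gets it automatically
  - bash: |
      ./scripts/run-security-scan.sh    # placeholder for your actual scanning tool
    displayName: Scan images for vulnerabilities
```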

The logs also end up being dynamic, based on the compiled template. You can see here that we have two “Build, tag, and push image” steps because of the parameters defined above. This makes it very easy to debug and understand your builds.

I’m very happy with where these builds are so far and am very excited that Microsoft has included this as part of Azure Pipelines. Next, I’m probably going to improve this further by adding in unit and functional test runs.

I hope this helps anyone having trouble managing their microservices builds! Let me know if you have any questions or if I missed anything.
