HashiCorp Vault Azure Secrets Engine — Secure Your Azure Resources

Sam Gabrail
Dec 17, 2020 · 6 min read

This post first appeared on TeKanAid’s blog.

Overview

In this blog post, we talk about the HashiCorp Vault Azure Secrets Engine. This is the first blog post in a new blog post series called End-to-End Infrastructure and Application Deployment.

The goal of this series is to learn best practices around the automation of infrastructure provisioning and application deployment.

We cover the concepts of Infrastructure as Code, CI/CD, secrets management, dynamic secrets, the secret zero problem, service mesh, and more. Our cloud of choice is Azure for this series. Our focus for this blog post is on the first step and that is to create a Vault Azure Secrets Engine. The purpose is to deliver Azure credentials dynamically for provisioning resources in Azure.

tl;dr you can find the code for this part, which is the Vault admin tasks configuration, in the Azure infrastructure GitLab repo. You can also view the GitHub repo for this new blog series here. Moreover, below is a video explanation.

Vault Azure Secrets Engine — Part 1 of End-to-End Infra and App Deployment

Video Chapters

You can skip to the relevant chapters below:

  • 00:00 — Introduction
  • 01:22 — End-to-End Infrastructure and Application Deployment
  • 03:18 — Agenda
  • 03:48 — Overall Goal
  • 06:50 — Topics To Learn
  • 09:07 — Vault Azure Secrets Engine
  • 12:09 — Demo Starts
  • 13:00 — Terraform Config Walkthrough
  • 21:01 — Request Azure Creds from Vault
  • 25:24 — Conclusion

Pre-requisites

The following is required to follow along:

  • An Azure subscription
  • A running Vault cluster (we re-use the one from our previous Webblog series)
  • The Terraform CLI and the Vault CLI

Overview of the End-to-End Infrastructure and Deployment Blog Series

Let’s take a look at what this blog series has to offer.

The Big Picture

Below is an overview diagram of this 4 part blog series.

Break-Up of the Blog Series

We’ve broken up this blog series into 4 parts:

Part 1: HashiCorp Vault Azure Secrets Engine

This is the topic of this blog post and it's the first step in securing our pipeline. The purpose here is to create dynamic, short-lived credentials for Azure. We will then use these credentials to provision the Jenkins VM and app VMs in Azure. The credentials are valid for only 1 day, after which they expire automatically.

Part 2: HashiCorp Packer, Terraform, and Ansible to Set Up Jenkins

In this part, we use a few tools to build a Jenkins VM that will be used as our CI/CD pipeline. Below are the high-level steps:

  1. Use Packer to create an Azure image that has Docker installed.
  2. Build a Docker container image that contains Jenkins, Vault, Terraform, and Ansible.
  3. Use HashiCorp Vault to retrieve Azure credentials with a 1 day TTL for Terraform to use.
  4. Run Terraform to build a VM in Azure, based on the Packer image, that will host our Jenkins pipeline.
  5. Use Ansible to configure the Azure VM to:
  • Add the necessary packages
  • Pull the Jenkins Docker image
  • Start the Jenkins container
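As a preview of step 1, a Packer template could look roughly like the HCL2 sketch below. The image name, resource group, base image, and provisioning script here are illustrative placeholders, not the exact configuration we'll use in Part 2:

```hcl
# Hypothetical Packer template: Ubuntu image with Docker pre-installed
source "azure-arm" "docker_host" {
  # Credentials supplied by Vault's Azure secrets engine (see Part 1)
  client_id       = var.client_id
  client_secret   = var.client_secret
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id

  managed_image_name                = "jenkins-docker-image"
  managed_image_resource_group_name = "my-resource-group"
  os_type                           = "Linux"
  image_publisher                   = "Canonical"
  image_offer                       = "UbuntuServer"
  image_sku                         = "18.04-LTS"
  location                          = "East US"
  vm_size                           = "Standard_B2s"
}

build {
  sources = ["source.azure-arm.docker_host"]

  # Install Docker inside the image before it's captured
  provisioner "shell" {
    inline = ["curl -fsSL https://get.docker.com | sh"]
  }
}
```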

Part 3: The Secret Zero Problem Solved for HashiCorp Vault

Here we discuss the secret zero problem and how to solve it. This is often referred to as Vault secure introduction. The issue is that we need to provide the Vault authentication token to our Jenkins pipeline and to our application. Once we have the token, then we can access secrets in Vault. The challenge is how to deliver this Vault token securely. We address secure introduction by using Vault AppRoles, response wrapping, and the Vault-agent.
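As a rough preview of Part 3, the AppRole and response-wrapping flow looks something like the Vault CLI sketch below; the role name, policy, and TTLs are illustrative, not the exact values we'll use:

```shell
# Enable the AppRole auth method (Vault admin task)
vault auth enable approle

# Create an AppRole for the Jenkins pipeline (illustrative name and policy)
vault write auth/approle/role/jenkins token_policies="jenkins" token_ttl=1h

# Generate a response-wrapped secret ID; only the wrapping token is handed out
vault write -wrap-ttl=60s -f auth/approle/role/jenkins/secret-id

# The consumer unwraps it exactly once to obtain the real secret ID
vault unwrap <wrapping_token>
```

Because a wrapping token can be unwrapped only once, any interception of the secret ID in transit becomes detectable.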

Part 4: Jenkins, Vault, Terraform, Ansible, and Consul End-to-End CI/CD Pipeline

Finally, we put everything together in this part. Now that we have the Jenkins VM running and we’ve addressed the secret zero problem, we can finally run the pipeline to build our application. Below is the workflow:

  1. A developer commits and pushes code into GitHub
  2. The Jenkins pipeline automatically starts due to a webhook from GitHub to Jenkins
  3. Jenkins retrieves Azure credentials from Vault
  4. Jenkins runs Terraform with these credentials
  5. Terraform builds 3 VMs: a Consul server, the Python Webblog app server and a MongoDB server
  6. Terraform completes the provisioning and passes the 3 VMs’ fully qualified domain names (FQDNs) to Ansible
  7. Ansible configures these VMs to do the following:
  • Download and install the Consul and Envoy binaries for the service mesh
  • Pull the MongoDB Docker image and start the container
  • Download the Python dependencies for the Webblog app and start the application

Some Tools Used in this Series

  • HashiCorp Packer
  • HashiCorp Terraform*
  • HashiCorp Vault*
  • HashiCorp Consul
  • Jenkins
  • Ansible
  • Microsoft Azure

*Featured in this post

Topics to Learn in this Blog Series

  1. Vault Azure Secrets Engine*
  2. Packer Images in Azure
  3. Terraform Building VMs in Azure based on Packer Images
  4. Ansible to Configure an Azure VM
  5. The Secret Zero Problem and Vault Secure Introduction
  6. Vault App Role
  7. Vault Dynamic Database Secrets for MongoDB
  8. Vault Transit Secrets Engine
  9. Advanced CI/CD Pipeline Workflow using: GitHub(VCS), Jenkins(CI/CD), Terraform(IaC), Ansible(Config Mgmt), Vault(Secrets Mgmt)
  10. Consul Service Mesh

*Featured in this post

Vault Azure Secrets Engine Explanation

Now let’s focus on part 1 of this series and discuss how to create dynamic Azure credentials using HashiCorp Vault. Take a look at the workflow diagram below:

Vault Azure Secrets Engine Diagram

There are 2 personas involved in this workflow:

  1. Vault admin
  2. DevOps engineer or an app

The Vault admin is responsible for the following:

  • Enabling the Azure secrets engine
  • Configuring the engine
  • Creating Vault roles

The DevOps engineer or the app is the consumer of the Azure secrets returned by Vault.

Next, we’ll take a look at the configuration that the Vault admin needs to create.

Azure Secrets Engine Terraform Configuration

A Vault admin is required to run the steps below. This can be done using the Vault CLI or API. My preferred way of doing it is via the Vault provider in Terraform. We are re-using our existing Vault cluster from our previous Webblog series. The Vault admin configuration is located in the main.tf file

We’ve included the relevant Terraform configuration for enabling the Azure secrets engine in Vault below:

resource "azurerm_resource_group" "myresourcegroup" {
  name     = "${var.prefix}-jenkins"
  location = var.location
  tags     = local.common_tags
}

resource "vault_azure_secret_backend" "azure" {
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id
  client_secret   = var.client_secret
  client_id       = var.client_id
}

resource "vault_azure_secret_backend_role" "jenkins" {
  backend = vault_azure_secret_backend.azure.path
  role    = "jenkins"
  ttl     = "24h"
  max_ttl = "48h"

  azure_roles {
    role_name = "Contributor"
    scope     = "/subscriptions/${var.subscription_id}/resourceGroups/${azurerm_resource_group.myresourcegroup.name}"
  }
}

One thing to note is that the scope is tied to a resource group in Azure. This means that the credentials returned from Azure will allow a user to provision resources within that resource group only.
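For reference, the same configuration could be applied with the Vault CLI along these lines; the environment variables and the resource group name are placeholders for your own values:

```shell
# Enable the Azure secrets engine at the default path "azure"
vault secrets enable azure

# Configure it with the service principal Vault uses to manage credentials
vault write azure/config \
    subscription_id="$AZURE_SUBSCRIPTION_ID" \
    tenant_id="$AZURE_TENANT_ID" \
    client_id="$AZURE_CLIENT_ID" \
    client_secret="$AZURE_CLIENT_SECRET"

# Create the "jenkins" role scoped to a single resource group
vault write azure/roles/jenkins \
    ttl=24h \
    max_ttl=48h \
    azure_roles=-<<EOF
[{
  "role_name": "Contributor",
  "scope": "/subscriptions/$AZURE_SUBSCRIPTION_ID/resourceGroups/<resource-group-name>"
}]
EOF
```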

Retrieve Azure Credentials from Vault

Now that the Vault admin created the necessary configuration in Vault, a DevOps engineer or an app can create Azure credentials. This is done by running the command below after logging into Vault.

vault read azure/creds/jenkins

Your output would look something like this:

Key                Value
---                -----
lease_id           azure/creds/jenkins/PH6H0V0COZdW6BQrWEbSflR2
lease_duration     24h
lease_renewable    true
client_id          25280ec5-d598-4997-a323-387ead4bbfac
client_secret      a9dfc2ae-582a-6020-0572-7b289cdf7c53

Once the credentials are retrieved, you can use them as Terraform variables to be used to provision the Jenkins VM and the application VMs that we will build later. We will see how this is done in the next 3 blog posts.
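As a hint of how that plugs in, the azurerm provider block could consume the returned values as Terraform variables along these lines; this is a sketch, not the exact configuration from the upcoming posts:

```hcl
provider "azurerm" {
  features {}

  # Values retrieved from `vault read azure/creds/jenkins`
  client_id     = var.client_id
  client_secret = var.client_secret

  # Static identifiers for the target subscription and tenant
  subscription_id = var.subscription_id
  tenant_id       = var.tenant_id
}
```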

Note: A token with a suitable Vault policy is required to run the configuration commands above. We use the root token in this demo for simplicity; however, using the root token in a production setting is not recommended.

Conclusion

In this blog post, we introduced the new End-to-End Infrastructure and Deployment blog series. We also covered the first part of the series: creating dynamic secrets for Azure with the HashiCorp Vault Azure secrets engine. These secrets are dynamic in nature and have a TTL of 1 day. This enables a DevOps engineer to carry out the following steps:

  1. Log into Vault each day
  2. Retrieve new Azure credentials for the day
  3. Use these credentials in conjunction with Terraform to provision resources in a defined resource group in Azure
  4. Let the credentials expire automatically at the end of the day

This will also allow our Jenkins pipeline to securely provision the app VMs in an automated fashion. You're now ready to move on to part 2 to set up Jenkins using Packer, Terraform, and Ansible.

References

Originally published on TeKanAid’s blog on December 17, 2020.

HashiCorp Solutions Engineering Blog

A Community Blog by the Solutions Engineers of HashiCorp…
