MultiCloud DevOps with AWS, Azure and Terraform Part I: .NET Core Backend and SQL Database in Azure with Terraform and Azure DevOps

Philipp Klautke
Published in MAKONIS
May 12, 2020 · 10 min read

Introduction

In this blog series I want to describe how to build the DevOps setup behind a multicloud application with Terraform. The main goal is to show how Terraform, in combination with Azure DevOps (my personal favourite DevOps tooling), can help create multicloud applications. The first part of this series describes the deployment and modularization of a basic .NET Core web application with a database on Azure using Terraform. The second part will look into creating the resources for static website hosting of an Angular application on AWS and integrating them into the DevOps pipelines created here (see here).

Introduction to Backend Application

For the backend I used Visual Studio to scaffold a simple project with dependency injection, Entity Framework Core, and a simple controller that manages blog entries through a REST API: you can create, delete and modify blog posts. It also includes Swagger, so we can inspect the endpoints directly. Since building this application is not the main concern of this blog post, we will consider it as given. You can view the source code for this application here (Link to GitHub).

Introduction to Terraform

Terraform is a tool to create and provision infrastructure. It does so by describing infrastructure as code with a high-level configuration syntax. Because the infrastructure is expressed as code, it can be versioned and reused; in this context, a Terraform module can be reused multiple times in different environments.
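To give a feel for that syntax, here is a minimal resource block (a sketch with illustrative names; it assumes the azurerm provider is configured) that declares an Azure resource group for Terraform to create and manage:

resource "azurerm_resource_group" "example" {
  # Terraform will create this resource group and track it in its state
  name     = "rg-blog-example"
  location = "West Europe"
}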

Setting up the tooling to work with Terraform

To be able to plan and apply Terraform modules, we need to install the required binaries and add them to our PATH variable. I used Chocolatey as a Windows package manager. (To install Chocolatey, visit https://chocolatey.org/install)

choco install terraform

Setting up Service Principal in Azure Environment

Prerequisites:

  • An Azure account with an active subscription (the free tier will do)
  • The required permissions to create service principals in your Azure tenant

The service principal is used by Terraform so that it is authorized in your Azure tenant to create the requested resources.

Creation of the Service Principal with Azure CLI

az login
az account set --subscription="<your subscription id>"
az ad sp create-for-rbac --role="Contributor" --scopes="/subscriptions/<your subscription id>"

You can install the Azure CLI by following the steps here.

Now we have to take note of the following to use in further steps:

appId    --> Service Principal Client Id
password --> Service Principal Client Secret
tenant   --> Tenant Id
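Terraform can consume these values either through the ARM_CLIENT_ID, ARM_CLIENT_SECRET, ARM_TENANT_ID and ARM_SUBSCRIPTION_ID environment variables or directly in the provider block. A minimal sketch of the latter, assuming azurerm provider 2.x (placeholders to be replaced with the values noted above):

provider "azurerm" {
  features {}

  # Service principal credentials from the az ad sp create-for-rbac output
  subscription_id = "<your subscription id>"
  client_id       = "<appId>"
  client_secret   = "<password>"
  tenant_id       = "<tenant>"
}

In a pipeline, the environment-variable route is usually preferable so that no secrets end up in the configuration files.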

Creating Terraform module files

Now we can start to work on our Terraform module files. We will create a main module which references submodules. These submodules can be reused and, for example, stored in a shared module repository. Our final folder structure will be as follows:

$ tree Terraform
Terraform
|-- LICENSE
|-- main.tf
|-- outputs.tf
|-- README.md
|-- vars.tf
|-- modules/
| |-- azure-webapp-with-plan/
| | |-- README.md
| | |-- vars.tf
| | |-- main.tf
| | |-- outputs.tf
| | |-- LICENSE
| |-- azure-mssql-database-with-server/
| | |-- README.md
| | |-- vars.tf
| | |-- main.tf
| | |-- outputs.tf
| | |-- LICENSE

In the main folder, the two most important files are main.tf and vars.tf.

In the modules folder we have our modules. One module creates an Azure Web App with a corresponding App Service plan. The other creates an Azure SQL database together with its database server.

Created modules can be checked in to a shared repository and used later by other developers. For infrastructure that is needed frequently, this can save a lot of time. A module should therefore contain everything it needs to also run standalone.
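Such a shared module can then be referenced directly from its repository instead of a local path. A hedged sketch (the repository URL is illustrative):

module "webapp" {
  # "//" selects the subdirectory within the repository, "ref" pins a version tag;
  # the module's input variables would be passed here as well
  source = "git::https://example.com/shared/terraform-modules.git//azure-webapp-with-plan?ref=v1.0.0"
}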

./modules/azure-webapp-with-plan/main.tf
./modules/azure-webapp-with-plan/vars.tf

Within this module we first create a resource group. Inside this resource group we create a Linux App Service plan, and within this plan a Web App, which we can use later. To get all this working we only need a few variables, as seen in vars.tf. A minimal sketch of what these two files might contain follows below.
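This sketch assumes azurerm provider 2.x; the resource and variable names are illustrative, not necessarily the ones used in the repository:

# main.tf
resource "azurerm_resource_group" "rg" {
  name     = var.resourceGroupName
  location = var.location
}

resource "azurerm_app_service_plan" "plan" {
  name                = var.appServicePlanName
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  kind                = "Linux"
  reserved            = true # required for Linux plans

  sku {
    tier = "Basic"
    size = "B1"
  }
}

resource "azurerm_app_service" "app" {
  name                = var.appName
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  app_service_plan_id = azurerm_app_service_plan.plan.id
}

# vars.tf
variable "resourceGroupName" {}
variable "location" {}
variable "appServicePlanName" {}
variable "appName" {}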

./modules/azure-mssql-database-with-server/main.tf
./modules/azure-mssql-database-with-server/vars.tf

In this module we also create a resource group first, since Azure is heavily based on the concept of resource groups. After that we create the database server, a firewall rule for the server, and the database itself. Again, a minimal sketch follows below.
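As before, this is a sketch assuming azurerm provider 2.x with illustrative names:

# main.tf
resource "azurerm_resource_group" "rg" {
  name     = var.resourceGroupName
  location = var.location
}

resource "azurerm_sql_server" "server" {
  name                         = var.databaseServerName
  resource_group_name          = azurerm_resource_group.rg.name
  location                     = azurerm_resource_group.rg.location
  version                      = "12.0"
  administrator_login          = var.sqlAdminLoginName
  administrator_login_password = var.sqlAdminLoginPassword
}

# The special 0.0.0.0 range allows other Azure services (e.g. our Web App)
# to reach the server
resource "azurerm_sql_firewall_rule" "allow_azure" {
  name                = "AllowAzureServices"
  resource_group_name = azurerm_resource_group.rg.name
  server_name         = azurerm_sql_server.server.name
  start_ip_address    = "0.0.0.0"
  end_ip_address      = "0.0.0.0"
}

resource "azurerm_sql_database" "db" {
  name                = var.databaseName
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  server_name         = azurerm_sql_server.server.name
}

# vars.tf
variable "resourceGroupName" {}
variable "location" {}
variable "databaseServerName" {}
variable "databaseName" {}
variable "sqlAdminLoginName" {}
variable "sqlAdminLoginPassword" {}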

To use these two modules in conjunction we have a main.tf and a vars.tf in the main folder. Here we tie the two just-created modules together and pass in all needed variables; a sketch follows after the file references below.

./main.tf
./vars.tf
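A sketch of the root configuration, again with the illustrative variable names from the module sketches above:

# main.tf
provider "azurerm" {
  features {}
}

module "webapp" {
  source             = "./modules/azure-webapp-with-plan"
  resourceGroupName  = var.webAppResourceGroupName
  location           = var.location
  appServicePlanName = var.appServicePlanName
  appName            = var.appName
}

module "database" {
  source                = "./modules/azure-mssql-database-with-server"
  resourceGroupName     = var.databaseResourceGroupName
  location              = var.location
  databaseServerName    = var.databaseServerName
  databaseName          = var.databaseName
  sqlAdminLoginName     = var.sqlAdminLoginName
  sqlAdminLoginPassword = var.sqlAdminLoginPassword
}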

Creating Azure Pipeline

Prerequisites:

  • Azure DevOps for your Account (the free version will be ok)
  • AWS Toolkit for Azure DevOps installed (Link)
  • File Patch Build and Release Tasks installed (Link)

Step 1: Creating a Build Pipeline for .NET Core Backend

In Azure Pipelines we have the possibility to structure our pipelines with “Run on Agent” tabs. I personally like this approach, since the pipeline gets more structure than just a random sequence of tasks one after another.

Configuration of Agent for Backend Build

.NET Core Backend Build Tasks

The following steps are used:

Task .NET Core: Build

This task is used to build our project.

  • Command: build
  • Path to project(s): **/*.csproj
  • Arguments: --configuration $(BuildConfiguration)
  • Pipeline Variables: BuildConfiguration: release

Task .NET Core: Publish API

This task is used to publish all files to the Artifact Staging Directory needed to run the backend.

  • Command: publish
  • Path to project(s): **/BlogApi.csproj
  • Arguments: --configuration $(BuildConfiguration) --output $(build.artifactsstagingdirectory)
  • Tick Zip Published Projects
  • Tick Add project's folder name to publish path

Task Archive files: Zip Source

This task is used to be able to do an entity framework migration later on in the release pipeline.

  • Root folder or file to archive: TestBlog
  • Archive type: zip
  • Archive file to create: $(Build.ArtifactStagingDirectory)/source.zip

Task Publish build artifacts: Publish Artifact: backend software

This task publishes the artifact containing the backend software so that we can use it in a release pipeline.

  • Path to publish: $(Build.ArtifactStagingDirectory)
  • Artifact name: backendsoftware
  • Artifact publish location: Azure Pipelines

Task Publish build artifacts: Publish Artifact: source

This task publishes our source zip so that we can run Entity Framework migrations in a release pipeline.

  • Path to publish: $(Build.ArtifactStagingDirectory)/source.zip
  • Artifact name: source
  • Artifact publish location: Azure Pipelines

Configuration of Agent for Terraform files

Creating Artifact for Terraform files

Task Publish build artifacts: Publish Artifact: terraform

This task is used to pack up our Terraform files for further handling in our release pipeline.

  • Path to publish: Terraform
  • Artifact name: terraform
  • Artifact publish location: Azure Pipelines

Step 2: Creating the Release Pipeline Part

First, we will create a new Release Pipeline with the following artifacts.

As you can see, we imported 3 artifacts. This is done by selecting “+Add” and configuring the settings for each artifact we published in the previous build pipeline:

  • Source (build pipeline): <your build pipeline name>
  • Default version: latest
  • Source Alias: <artifact name from build pipeline>

For the artifacts we can configure the continuous deployment trigger via the lightning symbol. This will start the pipeline every time a new artifact is available. Since our artifacts are all generated by one pipeline, we should set only one artifact here to be triggered by the continuous deployment trigger.

In the Stages section, we add a new, empty stage and give it the title Prod. For our showcase we only add one stage to demonstrate the principle. The stage itself has a look and feel very similar to the build pipeline. Here I also recommend using different agents to structure your pipeline. In the following I will show the configuration of the three agents I included.

Configuration of Agent for Infrastructure

The infrastructure part invokes our Terraform modules with the right configuration. First we have to install the Terraform tooling on the agent. After that, just as on our local machine, we need to initialize Terraform. Finally, we apply our Terraform modules to create our resources.

Task Terraform tool installer: Install Terraform 0.12.24

  • Version: 0.12.24

Task Terraform CLI: terraform init

In this task we need to configure some advanced settings. By default, Terraform saves its state locally in the folder where you execute it. Since the release pipeline gets a “fresh” agent for every run, we need to configure remote storage for the Terraform state. In this example I save this information in an Azure Storage Account I created beforehand. For this task to work we also need a so-called service connection, which is a connection to another tool; in this case we need a service connection to our Azure subscription. It can easily be created by clicking the small “Manage” link.

  • Command: init
  • Configuration Directory: $(System.DefaultWorkingDirectory)/<path to your artifact with the terraform files>
  • Backend Type: azurerm
  • Backend Azure Subscription: <Name of your Azure Subscriptions Service Connection>
  • Resource Group Name: <Name of the Resource Group where the Storage is deployed to>
  • Storage Account Name: <Name of the Storage Account>
  • Container Name: <Name of the Container within the Storage Account>
  • Key: <Name of the Key to save the State info in>
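Note that the root configuration must also declare the azurerm backend for this to take effect. An empty partial configuration suffices, assuming the init task injects the storage details configured above:

# Added to the root main.tf; resource group, storage account, container and
# key are supplied by the terraform init task at release time
terraform {
  backend "azurerm" {}
}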

Task Terraform CLI: terraform apply

In this task we finally apply our Terraform modules. For this to work we want to pass some variables to the Terraform CLI. This is done by adding -var="<varname>=<varvalue>" to the Command Options section for every variable; to pass multiple variables, just delimit them with a single space (see the example after this list). Additionally, a service connection is again used to save and check the state of the Terraform environment.

  • Command: apply
  • Configuration Directory: $(System.DefaultWorkingDirectory)/<path to your artifact with the terraform files>
  • Environment Azure Subscription: <Name of your Azure Subscriptions Service Connection>
  • Command Options: <variables as explained above; see the example below>
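For example, with the illustrative variable names from the module sketches above, the Command Options could look like this:

-var="appName=myblogapp" -var="location=westeurope" -var="databaseServerName=myblogsqlserver" -var="databaseName=blogdb" -var="sqlAdminLoginName=$(SQLAdminLoginName)" -var="sqlAdminLoginPassword=$(SQLAdminLoginPassword)"

Here the $(…) entries reference pipeline variables, so secrets can stay out of the pipeline definition itself.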

Configuration of Agent for Database Migration

In this agent we will create a dedicated database user for our application and run an Entity Framework Migration.

Task Azure SQL Database deployment: Create Database User

Since we created our database server and our database in Azure, we can use this task. A service connection is required because the firewall settings needed for this task are created and removed automatically as part of the deployment. Most of the pipeline variables in this task are reused from the variables we used to create the resources.

  • Azure Subscription: <Your Azure Service Connections Name>
  • Authentication Type: SQL Server Authentication
  • Azure SQL Server: $(databaseServerName).database.windows.net
  • Database: $(databaseName)
  • Login: $(SQLAdminLoginName)
  • Password: $(SQLAdminLoginPassword)
  • Deploy type: Inline SQL Script
  • Inline SQL Script:
IF NOT EXISTS (SELECT principal_id FROM sys.database_principals WHERE name = '$(DatabaseUserName)')
BEGIN
    CREATE USER $(DatabaseUserName) WITH PASSWORD = '$(DatabaseUserPassword)'
    EXEC sp_addrolemember 'db_owner', '$(DatabaseUserName)'
END
GO
  • Pipeline Variables:
databaseServerName: <Database Server Name as in Infrastructure>
databaseName: <Database Name as in Infrastructure>
SQLAdminLoginName: <SQL Admin Login as in Infrastructure>
SQLAdminLoginPassword: <SQL Admin Password as in Infrastructure>
DatabaseUserName: <Your application Database User Name>
DatabaseUserPassword: <Your application Database User Password>

Task Extract Files: Extract Files

Here we need to extract the source.zip to be able to do the Entity Framework migrations.

  • Archive file patterns: **/source.zip
  • Destination folder: $(System.DefaultWorkingDirectory)/sourcefiles

Task Patch JSON Files: Patch file appsettings.json

In this Task we will override the Database Connection String in our appsettings.json file.

  • Syntax Type: Slick Syntax
  • Patch working directory: $(System.DefaultWorkingDirectory)/sourcefiles/TestBlog/BlogApi
  • Target files: appsettings.json
  • Patch Content:
= /ConnectionStrings/BlogDatabase => "Server=tcp:$(databaseServerName).database.windows.net,1433;Initial Catalog=$(databaseName);Persist Security Info=False;User ID=$(DatabaseUserName);Password=$(DatabaseUserPassword);MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

Task Azure CLI: Azure CLI

Here we invoke the actual Entity Framework migration.

  • Azure Subscription: <Your Azure Service Connection Name>
  • Script Location: Inline script
  • Inline Script:
dotnet tool install -g dotnet-ef --version 3.1.0
dotnet ef database update --project "<path to csproj File for Project including Migrations within source.zip>" --startup-project "<path to csproj File for Startup Project within source.zip>" --verbose

Configuration of Agent for Deployment of Files

In this agent we finally deploy our source code to the created resources.

Task Azure App Service deploy: Deploy Files to Azure App Service

  • Azure Subscription: <Your Azure Service Connection>
  • App Service Type: Web App on Windows
  • App Service name: <your appname from Infrastructure>
  • Package or folder: <Path to the Software Zip file>

Task Azure App Service Settings: Set Connection String in Azure Web App

  • Azure Subscription: <Your Azure Service Connection>
  • App Service name: <your appname from Infrastructure>
  • Resource group: <your Resource Group name from Infrastructure>
  • Connection Strings:
[
  {
    "name": "BlogDatabase",
    "value": "Server=tcp:$(databaseServerName).database.windows.net,1433;Initial Catalog=$(databaseName);Persist Security Info=False;User ID=$(DatabaseUserName);Password=$(DatabaseUserPassword);MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;",
    "type": "SQLAzure"
  }
]

Conclusion

After running the pipeline you should be able to visit the Swagger UI of your newly deployed web application on Azure, provisioned entirely through Terraform modules. As you can see, there is a simple way to configure Azure with Terraform without even touching the Azure Portal. This is by far the most valuable takeaway: there should be no production environment in the cloud where resources are created or configured manually, as that leads to hours and hours of searching for one tiny misconfiguration. With Terraform we also have tooling that helps us version our infrastructure just as we do our codebase.

I hope you could follow along on this topic. If you have questions, feel free to send me feedback or leave a clap. ;-)

Philipp Klautke
Senior Cloud Software Engineer @ MAKONIS GmbH