MultiCloud DevOps with AWS, Azure and Terraform Part II: Angular Frontend with Terraform and AWS
Introduction
This is Part II of my blog series about building a multicloud application with Terraform. As described in Part I (see here), I will continue to use Terraform; an introduction to Terraform itself can be found in the first story.
Introduction to Frontend Application
For the frontend, I used Visual Studio Code to create a simple application that consumes the endpoints of the backend. Since building the frontend is not the main concern here, I will not go into detail. You can find the source code of the application here (link to GitHub).
After setting up our tooling for Terraform (see the first part), we need to create credentials for AWS, in the same manner as we created the Service Principal in the Azure environment.
Setting up Access Key and Access Secret for AWS Environment
Prerequisites:
- An active AWS Account
- The required Permissions to create Access Keys and Access Secrets
This key pair serves the same purpose as the Service Principal in Azure: Terraform needs it to create the requested resources.
Creation of the Keyset in AWS
First, log in to AWS and click on your account name in the top right corner.
Then go to “My Security Credentials” and open the “Access keys” tab. Here you can create a new access key.
You can either copy the Access Key ID and the Secret Access Key or download your Key File.
Creating Terraform module files
Now that the configuration of our access key and the Terraform CLI is finished, we can start creating our Terraform module files. We will add a new submodule to our existing module structure.
$ tree Terraform
.
|-- LICENSE
|-- main.tf
|-- outputs.tf
|-- README.md
|-- vars.tf
|-- modules/
| |-- azure-webapp-with-plan/
| | |-- README.md
| | |-- vars.tf
| | |-- main.tf
| | |-- outputs.tf
| | |-- LICENSE
| |-- azure-mssql-database-with-server/
| | |-- README.md
| | |-- vars.tf
| | |-- main.tf
| | |-- outputs.tf
| | |-- LICENSE
| |-- aws-s3-static-website/
| | |-- README.md
| | |-- vars.tf
| | |-- main.tf
| | |-- outputs.tf
| | |-- LICENSE
In the new submodule, we will create an S3 bucket and an S3 bucket policy that makes the bucket's contents publicly readable.
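A minimal sketch of what the submodule's main.tf could look like. Resource names are illustrative assumptions, and the inline website/acl syntax follows the older 3.x AWS provider that was current when this setup was built:

```hcl
# modules/aws-s3-static-website/main.tf -- illustrative sketch

# Bucket configured for static website hosting
resource "aws_s3_bucket" "frontend" {
  bucket = var.site_name
  acl    = "public-read"

  website {
    index_document = "index.html"
    error_document = "index.html"
  }
}

# Policy allowing anonymous reads of every object in the bucket
resource "aws_s3_bucket_policy" "frontend" {
  bucket = aws_s3_bucket.frontend.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid       = "PublicReadGetObject"
      Effect    = "Allow"
      Principal = "*"
      Action    = "s3:GetObject"
      Resource  = "${aws_s3_bucket.frontend.arn}/*"
    }]
  })
}
```

The `var.site_name` value is the same variable the release pipeline later passes in as `$(site_name)`.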
We also create the corresponding vars.tf file. This file also contains the Terraform provider configuration for AWS.
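The submodule's vars.tf could look roughly like this; the variable names and the default region are assumptions for illustration:

```hcl
# modules/aws-s3-static-website/vars.tf -- illustrative sketch

variable "site_name" {
  description = "Globally unique name of the S3 bucket hosting the frontend"
}

variable "aws_region" {
  description = "AWS region the bucket is created in"
  default     = "eu-central-1"
}

variable "aws_access_key" {
  description = "Access Key ID created in the AWS console"
}

variable "aws_secret_key" {
  description = "Secret Access Key created in the AWS console"
}

# AWS provider configured with the key pair from the earlier step
provider "aws" {
  region     = var.aws_region
  access_key = var.aws_access_key
  secret_key = var.aws_secret_key
}
```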
To reference these files from the main.tf of our root module, we need to add a few lines of code there that link the newly created submodule.
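The link is a module block pointing at the submodule's path, passing through the variables the submodule expects (names here match the sketch above and are assumptions):

```hcl
# main.tf (root module) -- illustrative sketch

module "aws-s3-static-website" {
  source = "./modules/aws-s3-static-website"

  site_name      = var.site_name
  aws_region     = var.aws_region
  aws_access_key = var.aws_access_key
  aws_secret_key = var.aws_secret_key
}
```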
We also need to add the newly used variables to the vars.tf file of the root module.
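In the root vars.tf these are plain declarations; their values are supplied later by the release pipeline (variable names are the same illustrative assumptions as above):

```hcl
# vars.tf (root module) -- illustrative sketch

variable "site_name" {}
variable "aws_region" {
  default = "eu-central-1"
}
variable "aws_access_key" {}
variable "aws_secret_key" {}
```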
Now we have our Terraform files in place and can move on to adding the required parts to our build pipeline.
Additions to Azure Pipeline
Prerequisites:
- Azure DevOps for your Account (the free version will be ok)
- AWS Toolkit for Azure DevOps installed (link)
- File Patch Build and Release Task installed (link)
Step 1: Additions to our Build pipeline
As mentioned in the last story, I like to use separate “Run on agent” jobs for different purposes. Our purpose here is to build the Angular application.
Configuration of Agent for Frontend Build
The following steps are used:
Task npm: Install npm dependencies
Since we get a “fresh” agent every time we start the pipeline, we need to install the npm dependencies required by our project.
- Command:
install
- Working folder that contains package.json:
<Your folder path to frontend Application>
Task npm: Build Angular Web Project
With this task, we will build the Angular application. This generates the files needed for our deployment.
- Command:
custom
- Working folder that contains package.json:
<Your folder path to frontend Application>
- Command and arguments:
run build
Task Publish build artifacts: Publish Artifact: frontendsoftware
This task will publish our built application as an artifact, which will then be accessible in our release pipeline.
- Path to publish:
<Your folder path to frontend Application>/dist/<Frontend Application Name>
- Artifact name:
frontendsoftware
- Artifact publish location:
Azure Pipelines
Step 2: Additions to Release Pipeline
In the last story, we already created a release pipeline with some artifacts. We will now add the frontend artifact we created in our build pipeline as well.
In the picture above you can see all artifacts, now including the frontendsoftware artifact.
Since we only extended our existing Terraform module files, we can simply add the variables needed by our AWS module to the terraform apply task of the infrastructure agent. Just add the variables as described in the last story.
The last additions we have to make are an upload task that pushes our Angular files to the AWS S3 bucket, and a patch task that updates our appsettings.json file with the corresponding values.
Task Patch JSON Files: Patch files appsettings.json
In this task, we will patch the appsettings.json configuration file of the Angular application with the backend URL. This has to be done so that the Angular frontend can communicate with the correct backend.
- Syntax Type:
Slick Syntax
- Patch working directory:
<Your directory path to the appsettings.json>
- Target files:
appsettings.json
- Patch Content:
= /apiUrl => "<Your Url for your Backend>"
Task Amazon S3 Upload: Upload Angular Dist Folder
In this task, we will upload the required files for our Angular application. For this to work, we need a Service Connection to AWS. We can reuse the access key and secret we created in the first steps of this story. To create the Service Connection, click the “+ New” button and paste in the required information.
- Bucket Name:
$(site_name)
- Source Folder:
<Your path to the Source folder of the Angular build>
- Filename Patterns:
*
- Access Control (ACL):
public read
After running the pipeline, we will be able to see our deployed frontend application. When we try it out, we can create blog entries that are saved by our Azure backend application.
Conclusion
As you can see, the additional work needed to add another cloud provider with its resources to our existing Terraform modules is quite small. Many of these steps are very similar to what you would do when deploying to only one cloud provider. As already mentioned in the first story: no production infrastructure should ever be managed by manual configuration. Infrastructure as Code is key to a reliable and manageable cloud infrastructure.
I hope you could follow along again on this topic. If you have questions, feel free to send me feedback or leave a clap. ;-)