Terraform Cloud Project Bootcamp with Andrew Brown — Week 2 Live Streaming (Multiverse of Madness Edition)

Gwen Leigh
7 min read · Oct 12, 2023


This article is part of my Terraform journey with Terraform Bootcamp by Andrew Brown and Andrew Bayko with Chris Williams and Shala.

My wild career adventure into Cloud Computing continues with the Andrews, and I highly, highly recommend you join us if you are interested. Check out the Andrews’ free YouTube learning content. You can also buy their paid courses to support their good cause.

Agenda

Video here: Terraform Beginner Bootcamp Week-2

Goals

So what are we doing through this live stream? We practice using Terraform Cloud in three execution modes:

  • Run locally
  • Run remotely via Terraform Cloud
  • Run remotely via Terraform Cloud using VCS (Version Control System)

In Andrew’s words:

In week 2 live-stream we are just doing click-ops in AWS of what infrastructure we are setting up. It’s just to help people understand the resources we are creating via Terraform.

  • ✅ 1) Use two Execution Modes on Terraform Cloud: remote & local
  • ✅ 2) Make changes to the local-execution code so it runs remotely via TC
  • ✅ 3) Repeat 1–2

Workflow

My diagram work was featured during livestream!!! 😍❤️❤️

📑 Notes

Current Week-2 work setup

When we run Terraform locally (in a Gitpod workspace, in case you are following the Andrews’ bootcamp), this is how things work under the hood:

  • Terraform generates the state file (terraform.tfstate) locally.
  • This state file will be gone once you destroy the Gitpod workspace.
  • To persist it, we can have Terraform Cloud manage the state file.
  • In this case, however, it is still the Gitpod workspace that executes terraform plan & terraform apply.
  • The advantage of this setup is that execution is fast, as Terraform Cloud doesn’t have to spin up its own compute containers to run Terraform.
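This remote-state, local-execution setup is declared with a `cloud` block in the root module. A minimal sketch (the organization and workspace names below are placeholders, not the bootcamp’s actual values):

```hcl
terraform {
  # State lives in Terraform Cloud; with the workspace's Execution Mode
  # set to "Local", plan/apply still run in the Gitpod workspace.
  cloud {
    organization = "example-org" # placeholder organization name

    workspaces {
      name = "terraform-cloud" # placeholder workspace name
    }
  }
}
```

After adding this block, a `terraform login` followed by `terraform init` migrates the local state up to the Terraform Cloud workspace.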

1. Local execution test

We branch out to a new feature branch called terraform-cloud for this Week-2 live stream:

git checkout -b terraform-cloud

We are testing terraform with minimal AWS resource configuration. We comment out the following first:

  • The entire resource-cdn.tf in modules/terrahouse_aws
  • The Condition block inside bucket_policy in modules/terrahouse_aws/resource-storage.tf:
resource "aws_s3_bucket_policy" "bucket_policy" {
  bucket = aws_s3_bucket.website_bucket.bucket
  # policy = data.aws_iam_policy_document.allow_access_from_another_account.json
  policy = jsonencode({
    "Version" = "2012-10-17",
    "Statement" = {
      "Sid" : "AllowCloudFrontServicePrincipalReadOnly",
      "Effect" : "Allow",
      "Principal" = {
        "Service" : "cloudfront.amazonaws.com"
      },
      "Action" : "s3:GetObject",
      "Resource" : "arn:aws:s3:::${aws_s3_bucket.website_bucket.id}/*",
      # "Condition" = {
      #   "StringEquals" = {
      #     "AWS:SourceArn" : "arn:aws:cloudfront::${data.aws_caller_identity.current.account_id}:distribution/${aws_cloudfront_distribution.s3_distribution.id}" # .current.arn
      #   }
      # }
    }
  })
}

Then run:

terraform init
terraform plan
terraform apply
My comment was featured during the live stream! Andrew asked for +5 and my response was binary, btw 😃

2. Remote execution on Terraform Cloud

Execution Mode: local & remote

When executing terraform locally, we have all the environment variables stored locally and our machine knows where to find them. However, once we let Terraform Cloud (TC) run terraform for us, TC has to have them too, so it can execute terraform on its own machines (compute resources).

So we have to set variables on TC. You can find the Variables option on the left pane at the workspace-level settings. Note that the Variables option is dynamic, so it disappears once you set Execution Mode to local.

  • When we execute terraform locally, our local machine has the AWS credentials stored locally so we can safely provision resources on AWS cloud.
  • When we execute terraform remotely via Terraform Cloud (TC), TC needs the credentials to access and provision AWS resources, as terraform runs in TC’s own compute environment.

Set environment variables on Terraform Cloud

  • Set environment variables for the terraform-cloud workspace on TC.
  • Add the resource-specific local variables as terraform variables on TC.
  • Then run terraform init & terraform plan
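On the Terraform side nothing changes: the code keeps its usual variable declarations, and TC injects the values you set in the workspace. A sketch (the variable names are illustrative, not necessarily the bootcamp’s exact ones):

```hcl
# variables.tf — declared as usual; values come from the TC workspace.
# Credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) go in TC's
# "Environment variable" category; the declarations below are filled
# from the "Terraform variable" category.
variable "user_uuid" {
  description = "User UUID for the bootcamp project (illustrative name)"
  type        = string
}

variable "bucket_name" {
  description = "Name of the S3 bucket hosting the site (illustrative name)"
  type        = string
}
```

Marking a TC variable as Sensitive hides its value in the UI and in run logs.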

3. via Terraform Cloud with VCS

VCS is a Version Control System (such as GitHub, GitLab, or Bitbucket) that tracks updates and changes to your code as your project develops. We can connect Terraform Cloud to our VCS to manage version control and terraform runs in a more organised manner. Note that, once our workspace is linked to a VCS, we have to commit & push our updates to GitHub (that is, the VCS). The push triggers Terraform Cloud, which then runs to reflect the changes in the infrastructure.

We are now connecting to version control:

  • 1) Workspace > Settings > Version Control
  • 2) Click Connect to version control > Select Version control workflow
  • 3) Then select GitHub from the drop-down menu
  • 4) Select the repository (terraform-beginner-bootcamp-2023)

As for the rest of the settings, we use the default configuration. Just know that there are multiple ways to configure the remote execution environment:

  • Apply Method: Auto apply & Manual apply
  • VCS Triggers
    - Always trigger runs
    - Only trigger runs when files in specified paths change

For a visual guide, here are screenshots of the configuration steps:

Now that the remote run is set up, we want to execute it. So we make some changes to the code and run terraform apply. However, the run fails. Check the status on your TC console.

What does this mean? There are some errors in the variable configuration. They are not necessarily “errors” per se, since the values are correct for local execution. However, remote execution requires a slightly different configuration, so we have to adjust the code accordingly.

Andrew makes some changes to the code to fix it. Make sure that you commit & push to trigger Terraform Cloud to run:

resource "aws_s3_object" "index_html" {
  bucket       = aws_s3_bucket.website_bucket.bucket
  key          = "index.html"
- source       = var.index_html_filepath
+ source       = "${path.root}${var.index_html_filepath}"
  content_type = "text/html"

- etag         = filemd5("${path.root}/public/index.html")
+ etag         = filemd5("${path.root}${var.index_html_filepath}")
}

resource "aws_s3_object" "error_html" {
  bucket       = aws_s3_bucket.website_bucket.bucket
  key          = "error.html"
- source       = var.error_html_filepath
+ source       = "${path.root}${var.error_html_filepath}"
  content_type = "text/html"

- etag         = filemd5("${path.root}/public/error.html")
+ etag         = filemd5("${path.root}${var.error_html_filepath}")
}
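The fix works because the filepath variables now hold paths relative to `path.root` rather than absolute Gitpod paths, so they resolve on TC’s runners too. A sketch of what the adjusted variable might look like (the default value here is an assumption, not the bootcamp’s exact code):

```hcl
variable "index_html_filepath" {
  description = "Path to index.html relative to path.root (assumed form)"
  type        = string
  # Joined at the call site as "${path.root}${var.index_html_filepath}",
  # so the value starts with a slash and contains no machine-specific prefix.
  default = "/public/index.html"
}
```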

However, the runs still fail. On the TC console, click on the specific run that failed to see the details.

In my case, the region information was missing. I added the information and the remote run succeeded.

provider "aws" {
+ region = "us-east-1"
}

git add provider.tf
git commit -m "Added region"
git push

To wrap up, we can destroy the infrastructure using the Console too.

Now this is the end of our hands-on live stream session to practice infrastructure provisioning executions using Terraform Cloud. From the next videos on, we will work back on the Terrahouses to launch them on the TerraTowns. See you in the next one! 👋

Chris’ insights (video)

  • In a production/enterprise environment, what happens in reality is that they don’t run terraform plan & terraform apply from their desktop or local workstation.
  • Their VCS (Version Control System like Github) is connected to Terraform Cloud, and every time they commit, this commit triggers terraform plan & terraform apply.
  • The change to your IaC is propagated through GitHub, then through Terraform Cloud, and finally applied to your environment.

Good Fun Moments Gallery

One of my favourite moments during this stream. It was a good laugh :D

Resources

Bootcamp


Gwen Leigh

Cloud Engineer to be. Actively building my profile, network and experience in the cloud space.