Using LocalStack and GitHub Actions to Test Terraform AWS Deployments

Robbie Douglas
5 min read · Jan 14, 2024


Terraform is great, but sometimes I want to verify that what I'm building will interact correctly with what I've already deployed, or that it will be configured to meet certain compliance requirements. I'd like to do this without actually deploying the new resource, and ideally run these kinds of checks in a pipeline during the deployment process. I also find myself needing to test application code against AWS resources, and having local stand-ins for those resources is useful there too. In this article I'm going to show how LocalStack can help with all of this.

LocalStack (LS) creates a local AWS environment that I can deploy resources into with Terraform (TF). LS has a free Community edition and paid tiers. The Community edition has some limitations: fewer AWS services can be deployed with it, and even the paid versions of LS don't support everything in AWS. Still, the feature coverage, even in the Community edition, is pretty impressive; see https://docs.localstack.cloud/user-guide/aws/feature-coverage/.

For a simple example of how to get this to work, I’ll create a LS environment, add an S3 bucket with TF and test the bucket by adding something to it. Then I’ll add something similar to a GitHub Actions (GHA) workflow to show how I can include a test in the pipeline to verify the configuration of the bucket before it is actually deployed.

First, I’m going to make a Python virtual environment to work in.

My requirements.txt looks like this.

boto3
localstack
awscli-local

And then I’ll run this.

python3 -m venv ./venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt

With Docker running on my system, I can start LS in detached mode by running this.

localstack start -d

Once the container is up, I can confirm that it is running with this.

localstack status

Now looking at TF, LS documents how to configure the TF AWS provider to point at the local environment. I'm mostly following the Manual Configuration section here: https://docs.localstack.cloud/user-guide/integrations/terraform/#manual-configuration.
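As a reference, a minimal provider block adapted from those docs looks roughly like this. The dummy test credentials and the single s3 endpoint are just what this example needs; the docs show an endpoints entry per service, so add more for anything else you deploy.

# provider.tf (minimal sketch adapted from the LocalStack docs)
provider "aws" {
  access_key                  = "test" # LocalStack accepts dummy credentials
  secret_key                  = "test"
  region                      = "us-east-1"
  s3_use_path_style           = true # required for S3 against localhost
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  # Point the provider at LocalStack; add an entry per service you use.
  endpoints {
    s3 = "http://localhost:4566"
  }
}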

With the provider set up, I'll add a basic deployment to show how all this works together. My main.tf looks like this.

# main.tf

resource "aws_s3_bucket" "test-bucket" {
bucket = "my-bucket"
}

Then I can init, plan, and apply.
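These are the standard Terraform commands; nothing LocalStack-specific is needed once the provider points at the local endpoint.

terraform init
terraform plan
terraform apply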

I can see that my bucket exists, and use it locally, with awslocal (the LocalStack-aware AWS CLI wrapper installed by awscli-local).

awslocal s3 ls
awslocal s3 ls my-bucket
awslocal s3 cp ../README.md s3://my-bucket/
awslocal s3 ls my-bucket

I can interact with resources in LS in much the same way I would with my real AWS account. The test below is reused in the GHA example later in this article, but it also works well here to show how I can run code locally against LS resources.

I’ll create this class which I’ll use in my test.

import boto3
from botocore.exceptions import ClientError


class ClientS3:
    def __init__(self):
        # Dummy credentials, region, and endpoint matching the TF provider.
        self.session = boto3.session.Session(
            aws_access_key_id='test',
            aws_secret_access_key='test',
            region_name='us-east-1'
        )
        self.client = self.session.client(
            's3',
            endpoint_url="http://localhost:4566"
        )
        self.resource = self.session.resource(
            's3',
            endpoint_url="http://localhost:4566"
        )

    def put_object(self, body, bucket, key):
        # Write an object and return the full response so callers
        # can inspect the status code.
        response = self.client.put_object(
            Body=body,
            Bucket=bucket,
            Key=key
        )
        return response

    def upload_fileobj(self, file, bucket, key=None):
        # Stream a file-like object into the bucket under the given key.
        if not key:
            key = "blah"
        try:
            self.client.upload_fileobj(
                file,
                Bucket=bucket,
                Key=key
            )
        except ClientError as e:
            print(f'Failed to upload file: {str(e)}')

    def list_objects(self, bucket):
        # Return the listing so callers can inspect the bucket contents.
        return self.client.list_objects_v2(
            Bucket=bucket
        )

Notice the boto3 session and client setup in there: the dummy credentials, region, and endpoint URL match what I have in my TF provider.

My test looks like this.

import unittest
from demo.demo import ClientS3


class TestClientS3(unittest.TestCase):
    def test_put_object(self):
        # Put an object and assert S3 returned HTTP 200.
        s3cl = ClientS3()
        response = s3cl.put_object("blarg", "my-bucket", "file.txt")
        response_status = response['ResponseMetadata']["HTTPStatusCode"]
        self.assertEqual(response_status, 200)


if __name__ == "__main__":
    unittest.main()

I'm not doing anything fancy here, just putting an object in my bucket. But it shows that I can write tests that verify things like the bucket being named as expected, and that I can test other code that uses the bucket in its workflow. A bucket-name check could look like the sketch below.
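Here's a hedged sketch of that kind of check. The test class and name are my own illustration rather than anything from the original project; it would sit in the same test file, and list_buckets is the standard boto3 call.

import unittest
from demo.demo import ClientS3


# Illustrative sketch: assert the bucket declared in main.tf actually exists.
class TestBucketConfig(unittest.TestCase):
    def test_bucket_exists(self):
        s3cl = ClientS3()
        response = s3cl.client.list_buckets()
        names = [bucket["Name"] for bucket in response["Buckets"]]
        # The expected name matches what main.tf declared.
        self.assertIn("my-bucket", names)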

Running this in a GHA workflow is pretty easy as well: I can run tests against LS on the GHA runner the same way I did locally, which is really useful for verifying that configuration meets standards and compliance requirements before actually deploying resources. My workflow, which does everything I did above on the runner, looks like this.

name: Run Python Tests
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

jobs:
  localstack:
    runs-on: ubuntu-latest
    services:
      localstack:
        image: localstack/localstack:latest
        env:
          SERVICES: ec2, s3
          DEFAULT_REGION: us-east-1
          AWS_ACCESS_KEY_ID: test
          AWS_SECRET_ACCESS_KEY: test
        ports:
          - 4566:4566
          - 4571:4571
    steps:
      - uses: actions/checkout@v2
      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1
      - name: Terraform Format
        id: fmt
        run: terraform fmt -check
        working-directory: ./terraform
      - name: Terraform Init
        id: init
        run: terraform init
        working-directory: ./terraform
      - name: Terraform Plan
        id: plan
        if: github.event_name == 'pull_request'
        run: terraform plan -no-color
        working-directory: ./terraform
        continue-on-error: true
      - name: Terraform Plan Status
        if: steps.plan.outcome == 'failure'
        run: exit 1
      - name: Terraform Apply
        run: |
          export TF_LOG=DEBUG
          terraform apply -auto-approve
        working-directory: ./terraform
      - name: Install Python 3
        uses: actions/setup-python@v1
        with:
          python-version: 3.8
      - name: Install dependencies
        working-directory: ./python
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Run tests with python unittest
        working-directory: ./python
        run: python -m unittest

With everything in place, the workflow run succeeds.

If I want to show that the GHA workflow will catch a misconfiguration, I can change my test to try to put an object in a bucket that doesn't exist, like so.

class TestClientS3(unittest.TestCase):
    def test_put_object(self):
        s3cl = ClientS3()
        response = s3cl.put_object("blarg", "bad-bucket", "file.txt")
        ...

And the workflow run fails, catching the misconfiguration before anything would have been deployed.

This was a simple example, but it shows how powerful LocalStack can be for working with and testing AWS resources deployed with Terraform. From here I can build out test cases for many more resources, validating both the code that deploys them and the code that uses them, without ever putting anything in my AWS account. Nice!
