Setting up local testing for AWS Lambda with GitHub Actions

Brian Seel
cylussec
4 min read · Dec 26, 2022

Bringing together AWS Lambda, DynamoDB, Docker, Tox, Python, Pytest and GitHub Actions for true integration testing

The first step I always like to take with a new coding project is to get some simple tests set up. That is especially true when I am using a technology that is new to me. So when I started a new project that would be my first time using AWS Lambda and AWS DynamoDB, I knew I wanted to learn how to use them through integration tests.

Starting DynamoDB in Docker

Fortunately, there is an official Docker image with the most up-to-date version of DynamoDB Local. I used this as my docker-compose.yml file.

version: '3.8'
services:
  dynamodb-local:
    command: "-jar DynamoDBLocal.jar -sharedDb -dbPath ."
    image: "amazon/dynamodb-local:latest"
    container_name: dynamodb-local
    ports:
      - "8000:8000"
    volumes:
      - .aws/:/home/dynamodblocal/.aws:ro
    working_dir: /home/dynamodblocal

I tested that by running docker compose -f docker-compose.yml up --force-recreate --wait and saw the container come up.
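The --wait flag already blocks until the container is up, but if you ever start the container another way (or want the test suite itself to be defensive), a small polling helper can hold the tests until the port is accepting connections. This is a sketch of my own, not part of Docker or boto3:

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until something is listening on host:port, or give up after timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful TCP connect means the container is accepting traffic
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

Calling wait_for_port("localhost", 8000) at the top of the test session gives a clear failure mode ("DynamoDB never came up") instead of a pile of connection errors from boto3.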

Writing Tests

As a Python developer, my go-to testing framework is Pytest. I wanted to start with a simple test just to make sure my test suite was working, and to make sure I understood how to use the boto3 AWS library. I wrote the following Python code to create a table.

import boto3

TABLE_DEF = {
    'KeySchema': [
        {
            'AttributeName': 'trip_id',
            'KeyType': 'HASH'
        },
        {
            'AttributeName': 'trip_start_date',
            'KeyType': 'RANGE'
        },
    ],
    'AttributeDefinitions': [
        {
            'AttributeName': 'trip_id',
            'AttributeType': 'S'
        },
        {
            'AttributeName': 'trip_start_date',
            'AttributeType': 'S'
        },
    ],
    'ProvisionedThroughput': {
        'ReadCapacityUnits': 5,
        'WriteCapacityUnits': 5
    }
}


def create_table(dynamodb_resource: boto3.resources.base.ServiceResource, table_name: str):
    """
    Create a table using the trip schema defined above

    :param dynamodb_resource: a resource from boto3.resource()
    :param table_name: name of the table to create
    """
    dynamodb_resource.create_table(TableName=table_name, **TABLE_DEF)

All it does is create the table through the dynamodb_resource argument, which the test will supply from a fixture defined in conftest.py. It is a very simple function, but the test around it will only pass if everything in our test setup works.
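One thing worth knowing before running anything: DynamoDB rejects a create_table call whose KeySchema names an attribute that is missing from AttributeDefinitions, and the resulting ValidationException is not always easy to read. A small sanity check of my own (check_table_def is hypothetical, not a boto3 API) can catch that locally before the call ever goes out:

```python
def check_table_def(table_def: dict) -> None:
    """Raise ValueError if a key attribute has no matching type definition."""
    defined = {attr['AttributeName'] for attr in table_def['AttributeDefinitions']}
    keyed = {key['AttributeName'] for key in table_def['KeySchema']}
    missing = keyed - defined
    if missing:
        raise ValueError(
            f"KeySchema attributes missing from AttributeDefinitions: {missing}")
```

Running check_table_def(TABLE_DEF) at import time is a cheap way to fail fast when you are still experimenting with the schema.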

Then I define the fixtures.

import boto3
from pytest import fixture

DYNAMODB_PORT = 8000
REGION_NAME = "us-east-1"


@fixture(scope="session")
def dynamodb_resource():
    """boto3 resource to the local DynamoDB instance started by tox"""
    return boto3.resource('dynamodb', endpoint_url=f"http://localhost:{DYNAMODB_PORT}",
                          region_name=REGION_NAME)


@fixture(scope="session")
def dynamodb_client():
    """boto3 client to the local DynamoDB instance, used to list and create tables"""
    return boto3.client('dynamodb', endpoint_url=f"http://localhost:{DYNAMODB_PORT}",
                        region_name=REGION_NAME)

And finally, the actual test.

from mta_tracker import handler


def test_create_table(dynamodb_resource, dynamodb_client):
    table_name = 'test-table'

    assert table_name not in dynamodb_client.list_tables()['TableNames']
    handler.create_table(dynamodb_client, table_name)
    assert table_name in dynamodb_client.list_tables()['TableNames']

We just set up a test that looks for a local instance of DynamoDB and creates a table. Now let's use Tox to start up the Docker image and run the tests.

Tox

Tox is a very powerful tool for running repeatable test suites, and other tools (such as linters). In this case, we are going to add another step in our Tox file that starts the Docker container, and runs the tests.

[testenv:integration]
deps = pytest
commands =
    docker compose -f {toxinidir}/docker-compose.yml up --force-recreate --wait
    py.test {toxinidir}/tests
allowlist_externals = docker

There is a project that better integrates Docker and Tox, but it seems to only support Linux (or having a colon in your path confuses it, so I assume it doesn’t support Windows). This step runs the Docker compose command we ran manually, and then runs the tests. Right now it runs all of them, but we can use Pytest markers to have it only run integration tests later.

Right now, we should be able to see this working by running tox -e integration.

GitHub Actions

Finally, let's set this up to run every time we push our code, so we get a clean red or green light showing whether our tests pass. I added the following file as .github/workflows/tests.yml to set up the GitHub Actions workflow.

name: MTA Tracker post commit steps
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Setup Python
        uses: actions/setup-python@v2
      - name: Tox
        run: |
          pip install tox
          tox -e integration

At this point, we can check our code in to GitHub, and it should automatically start the tests. Mine passed.

I use PyScaffold to create my projects, but whatever scaffolding you use, these are the files I created and where they live:

.github/
├─ workflows/
│ ├─ tests.yml
src/
├─ mta_tracker/
│ ├─ __init__.py
│ ├─ handler.py
tests/
├─ conftest.py
├─ test_lambda.py
docker-compose.yml
tox.ini
