Integration Tests w/ DynamoDB Local

By: Kenny Matsudo & Nicholas Burden

Overview:

In this blog, we’ll explore the process of setting up a local Amazon DynamoDB instance for both development and testing. We can use this local instance to develop without connecting directly to Amazon DynamoDB, and run our integration tests without incurring normal Amazon DynamoDB costs.

Note: These test examples only showcase the possibilities of a locally run Amazon DynamoDB instance and do not represent actual production tests.

We’ll be using:

- DynamoDB Local (which requires a Java runtime)
- Python 3.7 with boto3 and pytest
- The AWS CLI and AWS SAM CLI
- AWS CodeCommit, CodeBuild, CodePipeline, and CloudFormation

Setting Up Our Project

First, we need to set up our repository. Although this step isn’t necessary for working locally, we’ll need it later in this blog when we integrate our Amazon DynamoDB tests into a pipeline. In this example, we’ll be using CodeCommit as our repository.

Using the CLI, we can quickly create a repository using:

aws codecommit create-repository --repository-name <REPO_NAME>

In our example, we’ll be creating a repository named ‘localdynamodb’.

We also need a local project folder with some initial resources. Let’s clone the repository we just created and scaffold the project:

> git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/localdynamodb
> cd localdynamodb
> touch template.yml buildspec.yml .gitignore
> mkdir src tests

template.yml will store the AWS CloudFormation (SAM) template that our pipeline deploys. buildspec.yml will contain the commands for CodeBuild to run. src will contain our Lambda code, and tests will contain our unit and integration tests.
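At this point, the project layout looks like this:

localdynamodb/
├── .gitignore
├── buildspec.yml
├── template.yml
├── src/
└── tests/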

We can add a few lines to our .gitignore so that we don’t push any unnecessary files.

.DS_Store
*.aws-sam
*.pytest*
*pycache*
*dynamodb_local*

Installing DynamoDB Local

We can install DynamoDB Local anywhere on our host system, but in order to run this instance in CodeBuild, we’ll place the binary files inside our tests folder. To successfully run Amazon DynamoDB locally, we’ll also need a Java runtime installed, since DynamoDB Local is a Java application.
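If you’re unsure whether Java is available on your machine, a quick check from the terminal:

java -version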

Furthermore, we can place the necessary commands into a .bash file inside our tests folder so that we don’t need to commit the dynamodb-local binaries to our repository, and so we can easily run this script from our buildspec.yml file later.

/tests/dynamo_build.bash

#!/bin/bash
mkdir ./tests/dynamodb_local
cd ./tests/dynamodb_local

curl -O "https://s3-us-west-2.amazonaws.com/dynamodb-local/dynamodb_local_latest.tar.gz"
tar -xvzf dynamodb_local_latest.tar.gz; rm dynamodb_local_latest.tar.gz

java -jar DynamoDBLocal.jar &

# Give DynamoDB Local a few seconds to finish starting up.
sleep 5

aws dynamodb create-table --table-name TestTable \
  --attribute-definitions AttributeName=pk,AttributeType=S AttributeName=sk,AttributeType=S \
  --key-schema AttributeName=pk,KeyType=HASH AttributeName=sk,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --endpoint-url http://localhost:8000

This script will:

  1. Create a dynamodb_local folder and change directories into that folder.
  2. Download the dynamodb local tar file.
  3. Extract the tar file binaries.
  4. Run the jar file (starting the local DynamoDB instance), using the ampersand “&” to run it in the background.
  5. Sleep (to allow the jar file to finish starting up).
  6. Create the DynamoDB table.

After executing this script using ./tests/dynamo_build.bash from the root directory, a local DynamoDB instance will be running on http://localhost:8000.
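We can verify that the instance is reachable (and that our table was created) by listing tables against the local endpoint:

aws dynamodb list-tables --endpoint-url http://localhost:8000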

Local Development

We’ll be using a subset of this AWS example, which builds an Amazon DynamoDB table for modeling game player data. This example only borrows the Amazon DynamoDB table schema for the purpose of showing simple integration tests.

To start local development, we need our initial table in the local instance. It’s created via the AWS CLI, using the final command from the script above:

aws dynamodb create-table --table-name TestTable \
  --attribute-definitions AttributeName=pk,AttributeType=S AttributeName=sk,AttributeType=S \
  --key-schema AttributeName=pk,KeyType=HASH AttributeName=sk,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --endpoint-url http://localhost:8000

IMPORTANT: Make note of the --endpoint-url flag; it points to our local Amazon DynamoDB instance rather than an Amazon DynamoDB table in our AWS account.

Now we can start writing Lambda functions that integrate with our local DynamoDB table. (We read the table name and an IS_LOCAL flag from environment variables so the same code works both locally and in the pipeline later on.)

touch src/utility.py

import os

import boto3

# When IS_LOCAL is set (e.g. during local development or in our CodeBuild
# environment), point boto3 at the local DynamoDB instance instead of AWS.
if os.environ.get('IS_LOCAL'):
    dynamodb = boto3.resource('dynamodb', endpoint_url='http://localhost:8000')
else:
    dynamodb = boto3.resource('dynamodb')

# Default to the TestTable we created locally; the pipeline overrides this
# via the GAME_TABLE environment variable.
table = dynamodb.Table(os.environ.get('GAME_TABLE', 'TestTable'))


def create_user(username, metadata):
    ''' Creates a user in our table. '''
    data = {
        'pk': 'USER:' + username,
        'sk': 'METADATA:' + username
    }
    data.update(metadata)

    return table.put_item(Item=data)


def create_game(game_id, metadata):
    ''' Creates a game in our table. '''
    data = {
        'pk': 'GAME:' + game_id,
        'sk': 'METADATA:' + game_id
    }
    data.update(metadata)

    return table.put_item(Item=data)


def user_game_mapping(game_id, username):
    ''' Creates a mapping between a game and a user. '''
    data = {
        'pk': 'GAME:' + game_id,
        'sk': 'USER:' + username
    }

    return table.put_item(Item=data)

This utility file includes simple functions that write to the Amazon DynamoDB table; we’ll write our test cases against these functions to ensure they work properly. IMPORTANT: Make note of the endpoint_url parameter; when IS_LOCAL is set, the client points to our local Amazon DynamoDB instance rather than an Amazon DynamoDB table in our AWS account (remember to export IS_LOCAL=true while developing locally).
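For completeness, reading an item back follows the same pattern. This get_user helper is a hypothetical addition (it isn’t part of the file above) that uses the same table resource defined in src/utility.py:

def get_user(username):
    ''' Hypothetical helper: fetches a user's metadata item, or None if absent. '''
    response = table.get_item(
        Key={
            'pk': 'USER:' + username,
            'sk': 'METADATA:' + username
        }
    )
    return response.get('Item')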

Now, we’ll write our tests.

touch tests/test_utility.py

tests/test_utility.py

import os

import boto3
import pytest

import src.utility as util

ddb_client = boto3.client('dynamodb', endpoint_url='http://localhost:8000')
ddb = boto3.resource('dynamodb', endpoint_url='http://localhost:8000')
GAME_TABLE = os.environ.get('GAME_TABLE')


class TestUtility():

    table = None

    @classmethod
    def setup_class(cls):
        ''' Creates the test table to be used in tests. '''
        ddb.create_table(
            TableName=GAME_TABLE,
            AttributeDefinitions=[
                {'AttributeName': 'pk', 'AttributeType': 'S'},
                {'AttributeName': 'sk', 'AttributeType': 'S'}
            ],
            KeySchema=[
                {'AttributeName': 'pk', 'KeyType': 'HASH'},
                {'AttributeName': 'sk', 'KeyType': 'RANGE'}
            ],
            BillingMode='PAY_PER_REQUEST'
        )

        TestUtility.table = ddb.Table(GAME_TABLE)

    @classmethod
    def teardown_class(cls):
        ''' Deletes the test table after all tests have run. '''
        ddb_client.delete_table(TableName=GAME_TABLE)

    @classmethod
    def cleanup(cls):
        ''' Deletes all the items in the DynamoDB table. '''
        all_items = TestUtility.table.scan()['Items']
        with TestUtility.table.batch_writer() as batch:
            for item in all_items:
                batch.delete_item(
                    Key={
                        'pk': item['pk'],
                        'sk': item['sk']
                    }
                )

    def test_create_user(self):
        ''' Should successfully create a user in the DynamoDB table. '''
        test_user = {
            'username': 'testuser',
            'metadata': {
                'address': '123 W. 456 St. Irvine, CA, 92614'
            }
        }

        response = util.create_user(**test_user)
        assert response['ResponseMetadata']['HTTPStatusCode'] == 200

        response = TestUtility.table.get_item(Key={
            'pk': 'USER:testuser',
            'sk': 'METADATA:testuser'
        })

        assert 'Item' in response

        TestUtility.cleanup()

    def test_create_game(self):
        ''' Should successfully create a game in the DynamoDB table. '''
        data = {
            'game_id': 'b5176e3085ea056ecc2c5dbae78c161a',
            'metadata': {
                'address': 'GameOfThrones'
            }
        }

        response = util.create_game(**data)
        assert response['ResponseMetadata']['HTTPStatusCode'] == 200

        response = TestUtility.table.get_item(Key={
            'pk': 'GAME:b5176e3085ea056ecc2c5dbae78c161a',
            'sk': 'METADATA:b5176e3085ea056ecc2c5dbae78c161a'
        })

        assert 'Item' in response

        TestUtility.cleanup()

    def test_user_game_mapping(self):
        ''' Should successfully create a mapping between user and game. '''
        data = {
            'game_id': 'b5176e3085ea056ecc2c5dbae78c161a',
            'username': 'testuser'
        }

        response = util.user_game_mapping(**data)
        assert response['ResponseMetadata']['HTTPStatusCode'] == 200

        response = TestUtility.table.get_item(Key={
            'pk': 'GAME:b5176e3085ea056ecc2c5dbae78c161a',
            'sk': 'USER:testuser'
        })

        assert 'Item' in response

        TestUtility.cleanup()

We’re creating our table during the class setup phase, using the GAME_TABLE environment variable for the table name (remember to export it, e.g. export GAME_TABLE=TEST_GAME_TABLE, before running pytest locally). If you use the same table name you created during local development (via the aws dynamodb create-table command), create_table will fail with a ResourceInUseException because the table already exists.
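If you’d rather not juggle table names, one defensive option (our own sketch, not part of the original tests) is to tolerate an existing table. This hypothetical helper uses the ddb resource defined in the test file:

from botocore.exceptions import ClientError

def create_test_table_if_missing(table_name):
    ''' Creates the pk/sk test table unless it already exists locally. '''
    try:
        ddb.create_table(
            TableName=table_name,
            AttributeDefinitions=[
                {'AttributeName': 'pk', 'AttributeType': 'S'},
                {'AttributeName': 'sk', 'AttributeType': 'S'}
            ],
            KeySchema=[
                {'AttributeName': 'pk', 'KeyType': 'HASH'},
                {'AttributeName': 'sk', 'KeyType': 'RANGE'}
            ],
            BillingMode='PAY_PER_REQUEST'
        )
    except ClientError as err:
        # DynamoDB signals an existing table with ResourceInUseException.
        if err.response['Error']['Code'] != 'ResourceInUseException':
            raise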

By this point, we should have 1) a local instance of DynamoDB running, 2) our source code that we’ll be testing, and 3) our test file.

We can go ahead and run pytest from our project root.

pytest -s
============================= test session starts ==============================
platform darwin -- Python 3.7.7, pytest-5.4.1, py-1.8.1, pluggy-0.13.1
rootdir: /Users/kennymatsudo/Dropbox/blogs/localdynamodb
collected 3 items

tests/test_utility.py ...

============================== 3 passed in 0.58s ===============================

Our tests ran successfully on a local instance of Amazon DynamoDB.

CI/CD Integration

It’s a good idea to run integration tests against Amazon DynamoDB both locally and within a CI/CD pipeline. We’ll quickly set up a simple pipeline using the repository we created above. Note that this simplified pipeline should not be used as-is in production.

CodeCommit -> Push to master -> Trigger build -> Deploy local DB instance -> Run integration tests -> Deploy resources via CloudFormation (if tests succeed).

We’ll be using a modified AWS sample CloudFormation template to create this simple pipeline:

AWSTemplateFormatVersion: "2010-09-09"
Description: CI/CD Pipeline for DynamoDB Integration Testing.
Parameters:
  ProjectName:
    Description: The name of the project.
    Type: String
  Image:
    Description: The Image you wish to use for CodeBuild.
    Type: String
    Default: aws/codebuild/standard:4.0
  ComputeType:
    Description: The Compute Type to use for AWS CodeBuild
    Type: String
    Default: "BUILD_GENERAL1_SMALL"
  RepoName:
    Type: String
  PipelineBuildSpec:
    Type: String
    Default: buildspec.yml

Resources:
  CodePipelineRole:
    Type: "AWS::IAM::Role"
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Principal:
              Service:
                - "codepipeline.amazonaws.com"
            Action:
              - "sts:AssumeRole"
      Policies:
        - PolicyName: "AWS-CodePipeline-Service-Policy"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: "Allow"
                Action:
                  - "codedeploy:*"
                  - "s3:*"
                  - "iam:PassRole"
                  - "codebuild:*"
                  - "codecommit:*"
                  - "cloudformation:*"
                Resource: "*"

  CloudFormationRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service: cloudformation.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: "AWS-CloudFormation-Service-Policy"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: "Allow"
                Action:
                  - "s3:*"
                  - "logs:*"
                  - "cloudwatch:*"
                  - "dynamodb:*"
                  - "iam:*"
                  - "lambda:*"
                  - "apigateway:*"
                  - "codepipeline:*"
                  - "codecommit:*"
                  - "codedeploy:*"
                  - "cloudformation:*"
                Resource: "*"

  CodeBuildRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service: codebuild.amazonaws.com
            Action: sts:AssumeRole
      Policies:
        - PolicyName: "AWS-CodeBuild-Service-Policy"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
              - Effect: "Allow"
                Action:
                  - "s3:Put*"
                  - "s3:Get*"
                  - "logs:*"
                  - "cloudformation:*"
                  - "codecommit:*"
                Resource: "*"

  PipelineBucket:
    DeletionPolicy: Retain
    Type: AWS::S3::Bucket

  CodeBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: !Ref ComputeType
        Image: !Ref Image
        Type: LINUX_CONTAINER
      Name: !Sub "${ProjectName}"
      ServiceRole: !GetAtt CodeBuildRole.Arn
      Source:
        BuildSpec: !Ref PipelineBuildSpec
        Type: CODEPIPELINE

  CodePipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      ArtifactStore:
        Type: S3
        Location: !Ref PipelineBucket
      RoleArn: !GetAtt CodePipelineRole.Arn
      Name: !Ref ProjectName
      Stages:
        - Name: CodeCommit
          Actions:
            - Name: TemplateSource
              ActionTypeId:
                Category: Source
                Owner: AWS
                Version: 1
                Provider: CodeCommit
              OutputArtifacts:
                - Name: "TemplateSource"
              Configuration:
                BranchName: "master"
                RepositoryName: !Ref RepoName
              RunOrder: 1
        - Name: Build
          Actions:
            - Name: Validation
              ActionTypeId:
                Category: Build
                Owner: AWS
                Version: 1
                Provider: CodeBuild
              OutputArtifacts:
                - Name: "BuildOutput"
              InputArtifacts:
                - Name: "TemplateSource"
              Configuration:
                ProjectName: !Ref CodeBuildProject
              RunOrder: 1
        - Name: Deploy
          Actions:
            - Name: Deploy
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Version: 1
                Provider: CloudFormation
              InputArtifacts:
                - Name: BuildOutput
              Configuration:
                ActionMode: CREATE_UPDATE
                StackName: !Sub "${ProjectName}-stack"
                ChangeSetName: !Sub "${ProjectName}-stack"
                TemplatePath: "BuildOutput::outputTemplate.yml"
                RoleArn: !GetAtt CloudFormationRole.Arn
                Capabilities: CAPABILITY_AUTO_EXPAND,CAPABILITY_IAM
              RunOrder: 1

We can deploy this template (saved as pipeline.yml) with the SAM CLI:

#!/bin/bash
sam deploy -t pipeline.yml --stack-name localdynamodb-pipeline --capabilities CAPABILITY_NAMED_IAM --parameter-overrides \
  ParameterKey=ProjectName,ParameterValue=localdynamodb \
  ParameterKey=RepoName,ParameterValue=localdynamodb

This template creates the CodePipeline and looks for the buildspec.yml file we created at the beginning of this project. Note that the pipeline will run and fail on its initial deployment, since we haven’t populated buildspec.yml yet.
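Once the stack is up, we can confirm the pipeline was created and inspect the state of its first (expected-to-fail) run from the CLI:

aws codepipeline get-pipeline-state --name localdynamodb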

We also need to create a simple CloudFormation template (our template.yml) to deploy the AWS Lambda function we created along with an Amazon DynamoDB table.

Transform: AWS::Serverless-2016-10-31

Globals:
  Function:
    Timeout: 60
    Runtime: python3.7

Resources:
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - "sts:AssumeRole"
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/CloudWatchFullAccess
        - arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess

  LambdaFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: utility.lambda_handler
      Role: !GetAtt LambdaRole.Arn
      Environment:
        Variables:
          GAME_TABLE: !Ref GameTable

  GameTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: pk
          AttributeType: S
        - AttributeName: sk
          AttributeType: S
      KeySchema:
        - AttributeName: pk
          KeyType: HASH
        - AttributeName: sk
          KeyType: RANGE
      ProvisionedThroughput:
        ReadCapacityUnits: 5
        WriteCapacityUnits: 5

DynamoDB Integration Tests w/ CodeBuild

Now we need to edit our buildspec.yml file to give CodeBuild its instructions. We’ll need an S3 bucket to store the artifacts that CloudFormation uses; we’ll use the bucket created by our CodePipeline template.
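Rather than copying the generated bucket name from the console, we can look it up from the pipeline stack (assuming the stack name we deployed above):

aws cloudformation describe-stack-resource --stack-name localdynamodb-pipeline \
  --logical-resource-id PipelineBucket \
  --query StackResourceDetail.PhysicalResourceId --output text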

buildspec.yml

version: 0.2

env:
  variables:
    IS_LOCAL: "true"
    ARTIFACT_BUCKET: "localdynamodb-pipeline-pipelinebucket-16pkk76t2iehg"
    GAME_TABLE: "TEST_GAME_TABLE"

phases:
  install:
    runtime-versions:
      python: 3.7
    commands:
      - echo "Installing python dependencies..."
      - pip3 install --upgrade aws-sam-cli pytest
      - echo "Installing DynamoDB locally..."
      - chmod +x ./tests/dynamo_build.bash
      - ./tests/dynamo_build.bash
  pre_build:
    commands:
      - echo "Running DynamoDB Integration tests..."
      - pytest -s
  build:
    commands:
      - echo "Building assets..."
      - sam build -t template.yml
  post_build:
    commands:
      - echo "Packaging assets..."
      - sam package --output-template-file outputTemplate.yml --s3-bucket $ARTIFACT_BUCKET --s3-prefix deployment

artifacts:
  type: zip
  files:
    - outputTemplate.yml

This buildspec sets the environment variable IS_LOCAL to “true” so that our source code knows to use the local Amazon DynamoDB endpoint. We also install aws-sam-cli and pytest as dependencies for running our tests and deployments.

We then run the script we wrote earlier, which downloads and starts our local Amazon DynamoDB instance. With all of these pieces in place, we can run pytest to execute our integration tests.

Upon merging or committing code to the master branch, our pipeline will trigger and run our CodeBuild buildspec file, spinning up our local Amazon DynamoDB instance and running our pytest suite against it. For example, committing directly to master kicks everything off:
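git add .
git commit -m "Add DynamoDB integration tests"
git push origin master

The CodeBuild logs then show DynamoDB Local starting up and our tests passing: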

Initializing DynamoDB Local with the following configuration:
Port:       8000
InMemory:   false
DbPath:     null
SharedDb:   false
shouldDelayTransientStatuses:   false
CorsParams: *

[Container] 2020/03/21 04:24:13 Phase complete: INSTALL State: SUCCEEDED
[Container] 2020/03/21 04:24:13 Phase context status code: Message:
[Container] 2020/03/21 04:24:13 Entering phase PRE_BUILD
[Container] 2020/03/21 04:24:13 Running command echo "Running DynamoDB Integration tests..."
Running DynamoDB Integration tests...
[Container] 2020/03/21 04:24:13 Running command pytest -s

============================= test session starts ==============================
platform linux -- Python 3.7.6, pytest-5.4.1, py-1.8.1, pluggy-0.13.1
rootdir: /codebuild/output/src093693060/src
collected 3 items

tests/test_utility.py ...

============================== 3 passed in 1.39s ===============================

Summary

In this blog, we walked through a simple scenario showcasing how to use a local Amazon DynamoDB instance in a Python-based AWS project. A local Amazon DynamoDB instance opens up numerous possibilities, allowing developers to work offline, save on Amazon DynamoDB costs such as storage and throughput, and run integration tests.
