AWS services integration testing with LocalStack and GitHub Actions

Edward Romero
3 min read · Aug 4, 2020

AWS services are used across many applications nowadays, so it is imperative that our applications have well-thought-out integration tests against these services prior to deployment. It is also necessary that these integration tests run as part of CI/CD, so we find bugs earlier rather than later and improve development velocity.

What are we building?

We are creating a CI workflow for integration testing applications that use AWS services. We leverage LocalStack and GitHub Actions to automate the integration tests as part of CI.

Fig 1. High-level diagram of a GitHub workflow that runs integration tests against AWS services using LocalStack within GitHub Actions

Our setup will trigger a GitHub workflow every time a developer pushes their code to GitHub. The workflow will stand up LocalStack as a GitHub Actions service container. Then, our Jest integration tests will create the necessary AWS resources and run simple tests against them.
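For example, one of those Jest tests might look roughly like the sketch below. This is illustrative rather than the exact code from the example repo; the bucket name and file name are made up, and the AWS_HOST environment variable is explained in the walkthrough later on.

```javascript
// s3.integration.test.js -- a minimal, illustrative sketch
const AWS = require('aws-sdk');

// Point the SDK at the LocalStack edge port instead of real AWS.
// AWS_HOST is the hostname of the LocalStack service container (see the walkthrough).
const s3 = new AWS.S3({
  endpoint: `http://${process.env.AWS_HOST || 'localhost'}:4566`,
  s3ForcePathStyle: true, // path-style URLs so bucket names aren't treated as subdomains
  region: 'us-east-1',
  accessKeyId: 'test', // LocalStack accepts any credentials
  secretAccessKey: 'test',
});

describe('S3 integration', () => {
  const Bucket = 'example-bucket'; // illustrative name

  beforeAll(async () => {
    // The test itself creates the AWS resource it needs.
    await s3.createBucket({ Bucket }).promise();
  });

  it('stores and retrieves an object', async () => {
    await s3.putObject({ Bucket, Key: 'greeting.txt', Body: 'hello' }).promise();
    const { Body } = await s3.getObject({ Bucket, Key: 'greeting.txt' }).promise();
    expect(Body.toString()).toBe('hello');
  });
});
```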

Main Resources

  • GitHub Actions: Hosts our CI pipeline for running our integration tests.
  • LocalStack: A fully functional local AWS cloud stack.

Other Resources

These resources can be swapped for others depending on your project; the main goal of this tutorial is to showcase the integration of the main resources above:

  • Jest: JavaScript testing framework.
  • AWS JavaScript SDK
  • Dynogels: DynamoDB data mapper for Node.js. (Only used for the purposes of this tutorial; the DynamoDB connection setup can be done through any DynamoDB ORM or the AWS SDK directly, as shown in the sketch after this list.)
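As a concrete example of that last point, here is a minimal sketch of pointing Dynogels at the LocalStack endpoint instead of real AWS. The file name, region, and credentials are illustrative, and the same endpoint override works with the plain AWS SDK clients.

```javascript
// dynamo.js -- illustrative Dynogels setup against LocalStack
const dynogels = require('dynogels');

// Build a DynamoDB client whose endpoint is the LocalStack edge port.
// AWS_HOST is provided by the GitHub workflow described below.
const dynamodb = new dynogels.AWS.DynamoDB({
  endpoint: `http://${process.env.AWS_HOST || 'localhost'}:4566`,
  region: 'us-east-1',
  accessKeyId: 'test', // LocalStack accepts any credentials
  secretAccessKey: 'test',
});

// Tell Dynogels to use this driver for all the models it defines.
dynogels.dynamoDriver(dynamodb);

module.exports = dynogels;
```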

Now let’s get to the important part…

The Code
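The complete workflow file lives in the example repository linked at the end of this post. The sketch below shows the general shape it takes; the job container, action versions, and npm commands are illustrative assumptions, and the walkthrough that follows explains the pieces that matter.

```yaml
# A minimal sketch of the workflow, not the exact file from the example repo.
name: integration-tests

on: push

env:
  AWS_HOST: localstack          # hostname the tests use to reach LocalStack
  SERVICES: dynamodb,s3,sqs     # AWS services LocalStack should start

jobs:
  integration:
    runs-on: ubuntu-latest
    container: node:12          # running the job in a container lets it reach the service as "localstack"
    services:
      localstack:
        image: localstack/localstack          # use localstack/localstack-full if you need more services
        ports:
          - 4566:4566                         # single edge port for every service
        env:
          SERVICES: ${{ env.SERVICES }}       # map workflow env vars into the container
          HOSTNAME_EXTERNAL: ${{ env.AWS_HOST }}
        options: >-
          --health-cmd "curl -sS localhost:4566 || exit 1"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 10
    steps:
      - uses: actions/checkout@v2
      - run: npm ci
      - run: npm test           # runs the Jest integration tests
```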

Walking Through The Code

  1. Workflow Environment variables
    AWS_HOST: Tells our application the hostname of the LocalStack service we created. In this case, that host is localstack.
    SERVICES: The list of services to stand up for integration testing. In our case we only test dynamodb, s3, and sqs, but there are plenty more you can test with. Take a look at the LocalStack docs for the specific services it supports.
  2. Service Image
    We use the light version of LocalStack, localstack/localstack. If you need services that the light version doesn't include, you can switch to the full version by replacing this image with localstack/localstack-full. Take a look at the LocalStack documentation for more information.
  3. Ports
    LocalStack has changed the way resources are accessed: it now exposes a single edge port, 4566, for all the resources it creates. This makes configuring the AWS SDK very easy.
  4. Service Environment variables
    We mapped our workflow environment variables SERVICES and AWS_HOST to the service environment variables SERVICES and HOSTNAME_EXTERNAL, respectively, so the LocalStack container picks them up.
    HOSTNAME_EXTERNAL: Sets the hostname that LocalStack uses in the URLs it generates when AWS resources are created through the SDK. If we don't map this to the hostname of the service we created, the generated URLs will use localhost instead.
    e.g. with the default hostname the generated URL is http://localhost:4566/000000000000/forest-local, and once we set HOSTNAME_EXTERNAL to our AWS_HOST value it becomes http://localstack:4566/000000000000/forest-local (see the snippet after this list).
  5. Service Options
    We set the usual health-check parameters required by service containers in GitHub Actions. The option worth a closer look is --health-cmd: we assign it a curl request to localhost:4566 and return an error code on failure. The idea is to make sure LocalStack has made this port available and ready to be consumed by our application before the tests start.
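To make the HOSTNAME_EXTERNAL point from step 4 concrete, here is a small illustrative snippet (not from the example repo) that creates the queue from the URLs above and reuses the QueueUrl that LocalStack returns:

```javascript
// sqs-url-example.js -- illustrative only
const AWS = require('aws-sdk');

const sqs = new AWS.SQS({
  endpoint: `http://${process.env.AWS_HOST || 'localhost'}:4566`,
  region: 'us-east-1',
  accessKeyId: 'test',
  secretAccessKey: 'test',
});

(async () => {
  const { QueueUrl } = await sqs.createQueue({ QueueName: 'forest-local' }).promise();
  // With HOSTNAME_EXTERNAL mapped to AWS_HOST, this logs
  // http://localstack:4566/000000000000/forest-local; without it, the host is localhost.
  console.log(QueueUrl);

  // The returned URL is then used directly by any code that talks to the queue.
  await sqs.sendMessage({ QueueUrl, MessageBody: 'hello from the integration test' }).promise();
})();
```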

Project Code

The example project is hosted on GitHub, so feel free to take a look. Getting your hands dirty with the code is the best way to learn this integration.

https://github.com/meroware/example-aws-services-github-worflows

Conclusion

The CI pipeline makes it faster and easier to test new code that integrates with AWS services, and we no longer have to worry about maintaining a shared local setup for integration testing of AWS services in our applications. Have some fun sharing this knowledge with your peers 😁.

Happy Coding!!!
