Using Serverless Framework & Localstack to test your AWS applications locally

Antonio Reyes
Nov 4 · 4 min read

Introduction

The first challenge I had to overcome when I arrived at ManoMano was to find a way to test our AWS applications locally. Until then, developers had to deploy their code to an integration environment to be able to test it. Obviously, this cost a lot of time, money and frustration before the code was ready to be deployed to production.

The company was starting to undergo a huge transformation from a monolithic application to a microservices architecture, so my first objective was to identify a good “candidate” to get my POC started. I soon realized that there was a team working on an AWS application which met all the requirements I needed, so it was time to start working! 🙌

Going deeper into the application itself, it is built with the Serverless framework and based on several lambda functions that execute all the tasks needed after a purchase (mainly the generation of invoices, but also some related extra tasks).

It was a really great challenge for me because it was the first time I had to deal with these kinds of technologies (thanks to my teammate Isaac D. for his support and knowledge during this adventure). So this article is based on that context, and it aims to be an easy guide for anyone who might be in a similar situation and wants a working local environment for their AWS application.

Initial Setup 🔧

First, you should have the following tools installed:

  1. Install Docker if you haven’t already.
  2. Install the Serverless framework.
  3. Install the AWS CLI. Although we aren’t going to work with “real” AWS, we’ll need it to talk to our local Docker containers.
  4. Once the AWS CLI is installed, run aws configure to create some credentials. You can use real credentials (as described here), or dummy ones. Localstack requires that these values are present, but it doesn’t actually validate them.
  5. (Optional) AwsLocal: a thin wrapper around the aws command line interface for use with LocalStack.
  6. The Serverless-localstack plugin: https://github.com/localstack/serverless-localstack.
  7. Create the following files inside your project’s folder:

touch docker-compose.yml && mkdir .localstack
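If you are missing any of the tools above, items 2, 5 and 6 can be installed from the command line (assuming you already have npm and pip available):

npm install -g serverless
npm install --save-dev serverless-localstack
pip install awscli-local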

Configuring Serverless Localstack Plugin ☁️

There are two ways to configure the plugin, via a JSON file or via serverless.yml. You can check the official documentation for a customized configuration. I have done it via serverless.yml file:

serverless.yml

service: myService

plugins:
  - serverless-localstack

custom:
  localstack:
    stages:
      # list of stages for which the plugin should be enabled
      - local
    host: http://localhost # optional - LocalStack host to connect to
    autostart: true # optional - start LocalStack in Docker on Serverless deploy
    endpoints:
      # This section is optional - can be used for customizing the target endpoints
      S3: http://localhost:4572
      DynamoDB: http://localhost:4570
      CloudFormation: http://localhost:4581
      Elasticsearch: http://localhost:4571
      ES: http://localhost:4578
      SNS: http://localhost:4575
      SQS: http://localhost:4576
      Lambda: http://localhost:4574
      Kinesis: http://localhost:4568
    lambda:
      # Enable this flag to improve performance
      mountCode: True
    docker:
      # Enable this flag to run "docker ..." commands as sudo
      sudo: False
  stages:
    local:
      ...
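For context, the lambda functions that we will deploy and invoke later are declared in the same serverless.yml. A minimal sketch, with a hypothetical function name and handler:

functions:
  generateInvoice: # hypothetical name - replace with your own function
    handler: handler.generateInvoice
    events:
      - s3:
          bucket: tutorial
          event: s3:ObjectCreated:*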

Configuring Localstack Docker 🐳

You can run Localstack directly from the command line (cloning the repo), but I like using Docker because you don’t need to worry about downloading Localstack on your system. Here’s the config:

docker-compose.yml

version: '2.1'

services:
  localstack:
    image: localstack/localstack
    ports:
      - "4567-4597:4567-4597"
      - "${PORT_WEB_UI-8080}:${PORT_WEB_UI-8080}"
    environment:
      - SERVICES=${SERVICES- }
      - DEBUG=${DEBUG- }
      - DATA_DIR=${DATA_DIR- }
      - PORT_WEB_UI=${PORT_WEB_UI- }
      - LAMBDA_EXECUTOR=${LAMBDA_EXECUTOR- }
      - KINESIS_ERROR_PROBABILITY=${KINESIS_ERROR_PROBABILITY- }
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - "${TMPDIR:-/tmp/localstack}:/tmp/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"

Running Localstack 🔥

Now that we have our docker-compose.yml in good shape, we can spin up the container: docker-compose up -d.

You can now access the different AWS services through different ports on localhost. For the purposes of this tutorial, we only care about S3, found at localhost:4572. When you enter this URL in your browser, you should see an XML response listing your (currently empty) buckets.
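If you prefer the terminal, you can perform the same check with curl, and follow LocalStack’s startup logs via docker-compose:

curl http://localhost:4572
docker-compose logs -f localstack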

Creating AWS resources 🏗

The next step is to create the AWS resources your application needs. We’ll create an S3 bucket as an example, using the following command:

aws --endpoint-url=http://localhost:4572 s3 mb s3://tutorial

When using the aws command, the --endpoint-url argument is required to specify that you want to create the bucket in your LocalStack instance. However, if you installed awslocal earlier, you can do the same without specifying the endpoint URL each time:

awslocal s3 mb s3://tutorial

After the command runs, a new bucket named “tutorial” is created and ready to use. You can check it by going to the URL: http://localhost:4572/tutorial
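To make sure the bucket really works, you can also upload a test file and list the bucket’s contents (using awslocal again; swap in aws --endpoint-url=http://localhost:4572 if you skipped it):

echo "hello localstack" > hello.txt
awslocal s3 cp hello.txt s3://tutorial/hello.txt
awslocal s3 ls s3://tutorial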

Deploying our serverless application 🚀

Once we have all our AWS resources created, we can deploy our serverless application locally with the following command:

sls deploy --stage local
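Once the deploy finishes, you can verify that the stack and its functions actually landed in LocalStack (again via awslocal):

awslocal cloudformation describe-stacks
awslocal lambda list-functions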

Invoking locally a lambda function 💫

Finally, once our stack is created, we can invoke any lambda function locally with:

sls invoke -f functionName --stage local
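sls invoke also accepts an input payload via --data, which is handy for simulating real events (the payload below is a hypothetical example):

sls invoke -f functionName --stage local --data '{"orderId": "1234"}'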

This is just the starting point… 🚩

But I hope this article helps you to test your AWS applications locally!

What do you think about testing cloud applications locally? Are you using any tool like LocalStack? How do you handle it? Please share your thoughts below in the comments section! 😊 🙏

Bonus track 🎁

You can get a working example in the following repository: https://github.com/antreyes/localstack-demo

Follow me! 👍

Manomano Tech

Behind the scenes: we share stories about our product, our data science & our engineering lives

Antonio Reyes

Written by

QA Lead @CapitoleConsulting | ManoMano https://www.linkedin.com/in/antonio-reyes-garcia | Tech enthusiast 💻 and always thinking on quality 🤔💡GitHub: antreyes
