Running AWS SAM in a Docker Container

Michael Andrews
Monsoon Engineering
Nov 12, 2018 · 4 min read

By Michael Andrews and Chris Ramón

At Monsoon, many of our Cloud Native services are deployed using AWS Lambda. While developing and deploying a serverless Node.js application on AWS Lambda is fairly straightforward, you might run into trouble building and testing that application locally.

Recently, Amazon released its “AWS Serverless Application Model,” aka AWS SAM, which lets developers run their AWS Lambda functions locally before deployment. Unfortunately, AWS SAM is packaged outside of the standard AWS Command Line Interface and must be installed using a Python package manager (pip), outside of our regular Node.js toolchain. This isn’t a huge issue for local development, but when it comes to also building and testing your application in a Continuous Integration/Continuous Deployment (CI/CD) service, like AWS CodeBuild, it can become a pain point.

Thankfully, there is a simple and well-known technique to encapsulate your build and test environment so that it behaves the same locally as it does in CI/CD: bundling your build and test toolchain in a Docker container! The only problem with this approach is that, under the hood, SAM itself also runs as a Docker container, and getting SAM to run inside a parent Docker container turns out to be non-trivial.

In this article, we will outline how to build and run your AWS Lambda function using Docker and SAM.

A Basic TypeScript Serverless Application

Let’s start by defining our Node.js function. Our example function converts the JSON input it receives into an XML document using xml2json. This is a fairly synthetic example, but one motivated by a real-world use case where we wanted to parse XML with a native library. In this example we will use TypeScript, a statically typed superset of JavaScript.
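
To make the shape of the function concrete, a handler along these lines would do the job. This is an illustrative sketch rather than the exact code from our repository, and it assumes type declarations for xml2json are available (otherwise you may need a small declaration file or a plain require):

// handler.ts (illustrative sketch): convert the incoming JSON payload to XML.
import * as xml2json from 'xml2json';

export const handler = async (event: any) => {
  // When invoked through API Gateway the JSON payload arrives as a string in
  // event.body; when invoked directly, the event itself is the payload.
  const payload = typeof event.body === 'string' ? event.body : JSON.stringify(event);
  const xml = xml2json.toXml(payload);
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/xml' },
    body: xml,
  };
};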

Typically, in order to compile TypeScript into JavaScript we first need to add TypeScript as a dependency:
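
For reference, the relevant package.json fragment might look something like this (the version numbers are illustrative; the “tsc” script is what npm run tsc invokes below):

{
  "scripts": {
    "tsc": "tsc"
  },
  "dependencies": {
    "xml2json": "^0.11.2"
  },
  "devDependencies": {
    "typescript": "^3.1.6"
  }
}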

… and then use NPM to install and run the TypeScript compiler:

npm install
npm run tsc

However, since our goal is to run the final AWS Lambda function in a Dockerized version of SAM, we will also need to install the dependencies in a similar container so that any native code can be executed properly.

Building your TypeScript Application in a Container

Compiling a TypeScript Application in an AWS Lambda-compatible Container

First, we will need to build our application in an environment that closely resembles AWS Lambda. To accomplish this, we can leverage “Docker Lambda” — a Docker image that includes the AWS Lambda build tools and dependencies. To use this image, we will need to mount our source code so that the container has access to it, and then run the TypeScript tool chain within the container:

docker run --rm -v "$PWD":/var/task lambci/lambda:build-nodejs8.10 sh -c 'npm install && npm run tsc'

Now all of our dependencies and native code can be accessed from our AWS Lambda runtime Container!

Running your TypeScript Application in a Container

Running a TypeScript Application in Dockerized SAM

Now that we have a package that can be executed by SAM, we can attempt to run SAM in a Docker container. At the moment, there aren’t any public Docker images that include the aws-sam-cli, though it’s pretty easy to define an image using the Alpine Linux Python image:
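
A minimal Dockerfile along the following lines works as a sketch; the base image tag is illustrative, and depending on the aws-sam-cli release you pin you may also need additional build dependencies:

# Dockerfile.sam (sketch): a small image with the AWS SAM CLI installed via pip.
FROM python:3.6-alpine

RUN pip install aws-sam-cli

# Wrap the somewhat involved `sam local` invocation in a small entrypoint script.
COPY sam-entrypoint.sh /usr/local/bin/sam-entrypoint.sh
ENTRYPOINT ["/usr/local/bin/sam-entrypoint.sh"]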

Since the ENTRYPOINT is a bit complex, we’ve encapsulated it in its own shell script:
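
Here is a sketch of what such an entrypoint script can look like. It assumes we want sam local start-api on port 3000, with the template, base directory, and Docker network passed in as positional arguments:

#!/bin/sh
# sam-entrypoint.sh (sketch): $1 = SAM template, $2 = host directory to expose
# to the Lambda runtime container, $3 = Docker network for the function to join.
set -e

exec sam local start-api \
  --template "$1" \
  --docker-volume-basedir "$2" \
  --docker-network "$3" \
  --skip-pull-image \
  --host 0.0.0.0 \
  --port 3000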

As you can see, the main arguments to our SAM entrypoint script are the AWS SAM template file and the Docker volume with our source code. Additionally, we can specify a network so that our AWS Lambda function can connect to other external resources running in our Docker environment, for example a database. NB: The argument to skip the image pull (--skip-pull-image) should make the entrypoint execute faster if the underlying Docker daemon already has a cached version of the AWS Lambda runtime image.

To bring this all together we can use Docker Compose to specify our local source code as a Docker volume mount point:
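
The following docker-compose.yml is an illustrative fragment; the service, file, and network names are placeholders:

# docker-compose.yml (sketch)
version: '3.5'
services:
  sam_app:
    build:
      context: .
      dockerfile: Dockerfile.sam
    # Arguments handed to sam-entrypoint.sh: template, host base dir, network.
    command: ["/var/task/template.yaml", "${PWD}", "sam_local"]
    ports:
      - "3000:3000"
    volumes:
      # Bind the host Docker socket so SAM can start sibling containers.
      - /var/run/docker.sock:/var/run/docker.sock
      # Mount the project so SAM can read the template and the built artifacts.
      - .:/var/task
    networks:
      - sam_local
networks:
  sam_local:
    name: sam_local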

NB: Notice that we are also passing the Docker socket along as a mounted file. This is the key to running SAM in a Dockerized environment, since SAM itself spawns Docker containers. With the host’s Docker socket bound into the container, the containers SAM spawns run alongside the parent container as siblings rather than children. This technique also lets us specify our source code in the local working directory ($PWD) and forward that directory along as the remote directory mounted by the SAM container (docker-volume-basedir). Finally, we can start our AWS Lambda function in a container, with all native dependencies compiled for AWS Lambda!

docker-compose up sam_app

Now all we need to do is test that our application is functioning properly, and we are good to deploy:
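
For example, assuming the entrypoint exposes a local API on port 3000 and the SAM template maps our function to a /convert path (both placeholders for illustration), a quick smoke test could look like this:

curl -s -H 'Content-Type: application/json' \
  -d '{"greeting": "hello"}' \
  http://localhost:3000/convert
# We expect an XML rendering of the payload, e.g. <greeting>hello</greeting>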

Conclusion

As you can see, although deploying a serverless Node.js application is pretty straightforward, building and testing the application across platforms can be tricky. We hope you found this article illuminating! All of the sample code referenced in the article can be found in this GitHub repository. Next up, we plan to detail configuring AWS CodeBuild to automatically build, test, and deploy our serverless Node.js application!
