How to Deploy a Local Serverless Application With AWS SAM

Get started with Serverless, Lambda, and Docker

Lorenz Vanthillo
Mar 20 · 8 min read

SAM is an open-source framework that you can use to build, test and deploy serverless applications on AWS.

The main goal of this article is to familiarize you with AWS SAM so you can test your lambdas on your local machine without the need to (re)deploy them every time on AWS. I’ll discuss different integrations and examples with SAM and lambda.

SAM uses Docker to build and run lambdas locally, so a basic knowledge of Docker is required.


Let’s say there’s a company with a lot of documents. These documents have a documentId and a versionId. There are also multiple versions of one document.

The employees of the company find it difficult to locate specific versions of documents. The company has developed a solution to make it easy to request the storage location of a document based on its documentId and versionId.

Let’s translate the use case to a technical architecture in AWS. We will use a DynamoDB table to store the documentId, versionId, and storage location of each document. We need to populate the DynamoDB table with our existing document information and then set up lambda proxy integration in API Gateway. An end user will be able to retrieve the correct storage location of a document by performing a GET request with the correct parameters (documentId and versionId) to the API Gateway endpoint.

The end user will use a static website to perform a GET request that passes the documentId and versionId as query parameters.

We can use SAM to deploy this stack in AWS, but one of the strengths of SAM is that it offers a straightforward way to test your integrations locally.

We will use SAM local to:

  • Invoke a lambda once using SAM CLI to populate DynamoDB.
  • Host our local API Gateway endpoint with lambda proxy integration.
  • Generate a sample event that API Gateway sends to our lambda function when someone searches for a document.

When everything works locally we will deploy the full stack in AWS.

Initial Setup

To follow this demo you’ll need Docker, the AWS CLI, and the SAM CLI installed. An AWS account is only needed for the optional deployment step at the end.

We will use Docker to run our local DynamoDB instance — but not just that.

SAM is also highly dependent on Docker. When a local lambda function is invoked by SAM CLI, SAM will start a Docker container, execute the lambda code and destroy the container.

Let’s first create a Docker bridge network. We’ll use this type of Docker network so that our Docker containers can communicate with each other by resolving their container names. That way, we can talk to our local DynamoDB Docker container from inside the lambda containers managed by SAM. The DynamoDB container is called “dynamodb”.
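The commands for this step look roughly as follows (the network name lambda-local is my own choice; the container name must be “dynamodb” as described above):

```shell
# Create a user-defined bridge network (the name "lambda-local" is illustrative)
docker network create lambda-local

# Run DynamoDB Local on that network; the container name must be "dynamodb"
docker run -d --name dynamodb --network lambda-local -p 8000:8000 amazon/dynamodb-local
```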

When the container is up and running we can create our DynamoDB table. The primary key of our DynamoDB table will consist of a partition key (documentId) and a sort key (versionId).
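A sketch of the create-table call, assuming the table is called documents and DynamoDB Local listens on its default port 8000:

```shell
aws dynamodb create-table \
  --table-name documents \
  --attribute-definitions \
      AttributeName=documentId,AttributeType=S \
      AttributeName=versionId,AttributeType=S \
  --key-schema \
      AttributeName=documentId,KeyType=HASH \
      AttributeName=versionId,KeyType=RANGE \
  --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 \
  --endpoint-url http://localhost:8000
```

Note the --endpoint-url flag: it points the AWS CLI at the local container instead of the real AWS DynamoDB service.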

Use SAM to Build and Invoke a Local Function

Now clone the demo project from GitHub.

This project contains a template.yaml which fully describes what the stack should look like. It deploys one standalone lambda function called LoadDataFunction, which we trigger manually to load data into the DynamoDB table. The second function is called GetDocumentFunction and will be triggered by an API Gateway event.

We can check that the template is valid:
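With the SAM CLI this is a one-liner:

```shell
sam validate --template template.yaml
```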

The LoadDataFunction will be used to fill our DynamoDB table. Build the functions using the SAM CLI. If you don’t have Python 3.7 installed on your local environment, you can use the --use-container parameter to build the functions inside a Docker container.
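The build command then looks like this; --use-container runs the build inside a Lambda-like Docker image, so a local Python 3.7 isn’t required:

```shell
sam build --use-container
```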

This builds our lambda functions as described in our template.yaml. SAM templates are an extension of CloudFormation templates. The build downloads the necessary dependencies, described in our requirements.txt, and creates deployment artifacts stored in .aws-sam/build.

The LoadDataFunction only needs to be executed once to put the existing documents and their locations in our DynamoDB table. Let’s take a closer look at the function itself:

Based on the value of the Environment env var, we connect to either our local or the remote DynamoDB

We can use SAM to invoke the function. We pass a parameter to point to our environment; valid values are local and aws. The lambda function connects to the correct DynamoDB endpoint based on that parameter.

The local lambda function will run inside a Docker container. We tell SAM to spin up the container inside the same Docker bridge network as our DynamoDB container. Now our lambda can communicate with the DynamoDB container using its container name.
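Putting that together, the invocation looks roughly like this (assuming the bridge network is called lambda-local and the template exposes an Environment parameter):

```shell
sam local invoke LoadDataFunction \
  --parameter-overrides Environment=local \
  --docker-network lambda-local
```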

When the lambda function ends successfully we can see whether the correct data is in our DynamoDB table. Scan the content of the table:
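Assuming the table is called documents, the scan looks like:

```shell
aws dynamodb scan --table-name documents --endpoint-url http://localhost:8000
```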

We see a document with documentId 1044 and version v_1 is available at s3://bucket-a/8853806831.doc

Use SAM to Generate a Sample Payload

The next function is GetDocumentFunction, which is triggered by an API Gateway event.

The API Gateway setup is defined in the template.yaml. We deploy an API Gateway lambda proxy integration where the GetDocumentFunction will be triggered when a GET request is made to the endpoint.

Before we deploy this API Gateway endpoint, I want us to test the GetDocumentFunction. For that, I have to create a valid event which can trigger this function. The following command will create a valid JSON file which we can use as a fake API Gateway event to trigger the lambda.
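With the SAM CLI, generating an API Gateway proxy event looks like this (the file name api-event.json is my own choice):

```shell
sam local generate-event apigateway aws-proxy > api-event.json
```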

Unfortunately, SAM CLI has no support (yet?) for query string parameters during event generation, so we have to add them manually to the generated event file. Remember, we saw a document with documentId 1044 and versionId v_1, so we can use these as valid parameter values:
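In the generated event file, the relevant fragment would then look something like this (field names follow the API Gateway proxy event format):

```json
"queryStringParameters": {
  "documentId": "1044",
  "versionId": "v_1"
}
```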

Now test the GetDocumentFunction by invoking it using our event:
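Assuming the event was saved as api-event.json and the bridge network is called lambda-local, the invocation looks roughly like:

```shell
sam local invoke GetDocumentFunction \
  --event api-event.json \
  --parameter-overrides Environment=local \
  --docker-network lambda-local
```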

The correct storage location for the document with documentId 1044 and versionId v_1 is shown

Set Up Local API Gateway Endpoint With Lambda Proxy Integration

Employees of the company will use a static website to perform a valid GET request to the API Gateway. Employees fill in the documentId and versionId of the document they need.

The API Gateway will forward the request to our GetDocumentFunction. The function will use the documentId and versionId to query the DynamoDB table for the correct location.

Start our local API Gateway endpoint:
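Assuming the same lambda-local bridge network as before:

```shell
sam local start-api \
  --parameter-overrides Environment=local \
  --docker-network lambda-local
```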

It runs on http://127.0.0.1:3000 by default.

The repository contains a basic static webpage as visualization. The employees will use this to talk to the API Gateway. The API Gateway talks to the back end, which is our GetDocumentFunction.

This will actually perform the following GET call:
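Assuming the resource path defined in the template is /documents, the request would look roughly like:

```shell
curl "http://127.0.0.1:3000/documents?documentId=1044&versionId=v_1"
```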

Now click submit query. This will trigger our lambda.

The lambda searches for the corresponding location of the document in the local DynamoDB
The location of the document is displayed

So far I’ve shown how to build and test our stack locally. After making changes to your template or lambdas, run the sam build command again and reinvoke or redeploy your resources.

That’s it! If you have an AWS account available you can go to the next step.

Deploy the SAM Stack to AWS (Optional)

Now I’ll show you how easy it is to deploy this stack to AWS.

We already ran the sam build command. This created our deployment artifacts in the .aws-sam/build directory.

Next we package and deploy our stack. We have to specify an S3 bucket to which we upload our deployment artifacts. Note that we’re now passing aws as the environment instead of local, so our lambdas know that they have to connect to the AWS DynamoDB endpoint instead of our local endpoint.
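A sketch of the two commands; the bucket and stack names are placeholders:

```shell
# Package: upload build artifacts to S3 and write a rewritten template
sam package \
  --s3-bucket your-deployment-artifact-bucket \
  --output-template-file packaged.yaml

# Deploy the packaged template as a CloudFormation stack
sam deploy \
  --template-file packaged.yaml \
  --stack-name sam-documents-demo \
  --capabilities CAPABILITY_IAM \
  --parameter-overrides Environment=aws
```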

After a successful deployment we should first run the LoadDataFunction to fill our DynamoDB table. Just create an empty test event in the lambda console and execute the function:

The correct data is available in DynamoDB:

Now we can test our API Gateway lambda proxy integration again:

Everything works! At last, we can redeploy our static website in an S3 bucket which is created by our stack. In this demo I use public-static-site-bucket as the bucket. The template will generate a random bucket name, configured as a static website.

We just have to update our static application to point to the correct API Gateway URL. To find this URL you can go to CloudFormation. Search for your stack and check the outputs. The stack is configured to output our API Gateway endpoint.

To test quickly, we can use curl and the correct (or incorrect) parameters.
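For example (the API ID, region, stage, and path are placeholders to fill in from the stack output):

```shell
curl "https://<api-id>.execute-api.<region>.amazonaws.com/Prod/documents?documentId=1044&versionId=v_1"
```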

Now we can update our static application with this URL:

Now that we’ve updated the URL we can upload the files to our S3 bucket. Be sure to make the objects public during or after the upload.
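A sketch of the upload, assuming the site files live in a local ./website directory and the bucket name from the stack:

```shell
# Upload the static site files and make the objects publicly readable
aws s3 cp ./website s3://public-static-site-bucket/ --recursive --acl public-read
```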

Visit your S3 bucket as a static website!

The correct URL is also output in the CloudFormation console.


We discovered the magic of the AWS Serverless Application Model.

We went through most of the important things you can do with it. We used a SAM template to deploy and test lambda functions on our local machine. We even deployed a local API Gateway! After we verified that everything worked we deployed the exact same stack on AWS.

I hope you enjoyed it. Thank you for reading and feel free to ask any questions!

Better Programming

Advice for programmers.

Thanks to Zack Shapiro

