How to Deploy a Local Serverless Application With AWS SAM
Get started with Serverless, Lambda, and Docker
The main goal of this article is to familiarize you with AWS SAM so you can test your lambdas on your local machine without the need to (re)deploy them every time on AWS. I’ll discuss different integrations and examples with SAM and lambda.
SAM uses Docker to build and run lambdas locally, so a basic knowledge of Docker is required.
Let’s say there’s a company with a lot of documents. Each document has a documentId and a versionId, and one document can have multiple versions. The employees of the company find it difficult to locate specific versions of documents. The company has developed a solution to make it easy to request the storage location of a document based on its documentId and versionId.
Let’s translate the use case to a technical architecture in AWS. We will use a DynamoDB table to store the documentId, versionId, and storage location of each document. We need to populate the DynamoDB table with our existing document information and then set up a lambda proxy integration in API Gateway. An end user will be able to retrieve the correct storage location of a document by performing a GET request with the correct parameters (documentId and versionId) to the API Gateway endpoint.
We can use SAM to deploy this stack in AWS, but one of the strengths of SAM is that it offers a straightforward way to test your integrations locally.
We will use SAM local to:
- Invoke a lambda once using SAM CLI to populate DynamoDB.
- Host our local API Gateway endpoint with lambda proxy integration.
- Generate a sample event that API Gateway sends to our lambda function when someone searches for a document.
When everything works locally we will deploy the full stack in AWS.
To make it possible to follow this demo you’ll need the following prerequisites:
- Docker
- The AWS CLI
- The AWS SAM CLI
- Optionally, an AWS account for the final deployment step
We will use Docker to run our local DynamoDB instance, but not only that: SAM itself is highly dependent on Docker. When a local lambda function is invoked with SAM CLI, SAM starts a Docker container, executes the lambda code, and destroys the container.
Let’s first create a Docker bridge network. With this type of network, containers can reach each other by resolving their container names, so our lambda containers managed by SAM can talk to the local DynamoDB container. We’ll name the DynamoDB container “dynamodb”.
$ docker network create sam-demo
$ docker run --network sam-demo --name dynamodb -d -p 8000:8000 amazon/dynamodb-local
When the container is up and running we can create our DynamoDB table. The primary key of our DynamoDB table will consist of a partition key (documentId) and a sort key (versionId).
$ aws dynamodb create-table --table-name documentTable --attribute-definitions AttributeName=documentId,AttributeType=N AttributeName=versionId,AttributeType=S --key-schema AttributeName=documentId,KeyType=HASH AttributeName=versionId,KeyType=RANGE --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5 --endpoint-url http://localhost:8000
Use SAM to Build and Invoke a Local Function
Now clone the demo project from GitHub.
This project contains a template.yaml which fully describes what the stack should look like. It defines one standalone lambda function called LoadDataFunction, which we trigger manually to load data into the DynamoDB table. The second function is called GetDocumentFunction and will be triggered by an API Gateway event.
We can check that the template is valid:
$ cd aws-lambda-sam-demo
$ sam validate -t template.yaml
xxx/template.yaml is a valid SAM Template
The LoadDataFunction will be used to fill our DynamoDB table. Build the functions using SAM CLI. If you don’t have Python 3.7 installed on your local machine, you can use the --use-container parameter to build the functions inside a Docker container.
$ sam build --use-container
This builds our lambda functions as described in our template.yaml. SAM templates are an extension of CloudFormation templates. The build downloads the necessary dependencies described in our requirements.txt and creates deployment artifacts stored in the .aws-sam/build directory.
LoadDataFunction only needs to be executed once to put the existing documents and their locations in our DynamoDB table. Let’s take a closer look at the function itself:
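The function itself isn’t reproduced here, but a minimal sketch of what it might look like follows. The seeded items, the “location” attribute name, and the environment variable names are illustrative assumptions; the real function lives in the cloned repository:

```python
import os

def dynamodb_endpoint(environment):
    # Local lambdas reach the "dynamodb" container over the bridge network;
    # returning None lets boto3 fall back to the regional AWS endpoint.
    return "http://dynamodb:8000" if environment == "local" else None

def handler(event, context):
    import boto3  # imported lazily; only needed inside the lambda runtime
    endpoint = dynamodb_endpoint(os.environ.get("ENVIRONMENT", "aws"))
    table = boto3.resource("dynamodb", endpoint_url=endpoint).Table(os.environ["DDB_TABLE_NAME"])
    # Seed the table with the existing documents and their storage
    # locations (illustrative data, not the repository's actual items).
    documents = [
        {"documentId": 1044, "versionId": "v_1", "location": "s3://document-store/1044/v_1.pdf"},
    ]
    with table.batch_writer() as batch:
        for doc in documents:
            batch.put_item(Item=doc)
    return {"loaded": len(documents)}
```

Note that documentId is written as a number and versionId as a string, matching the attribute types we used when creating the table.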
We can use SAM to invoke the function. We use a parameter to point to our environment; valid options are local and aws. The lambda function connects to the correct DynamoDB endpoint based on that parameter.
The local lambda function will run inside a Docker container. We tell SAM to spin up the container inside the same Docker Bridge network as our DynamoDB container. Now our lambda can communicate with the DynamoDB container using its container name.
$ sam local invoke LoadDataFunction --parameter-overrides ParameterKey=Environment,ParameterValue=local ParameterKey=DDBTableName,ParameterValue=documentTable --docker-network sam-demo
When the lambda function ends successfully, we can check whether the correct data is in our DynamoDB table. Scan the contents of the table:
$ aws dynamodb scan --table-name documentTable --endpoint-url http://localhost:8000
Use SAM to Generate a Sample Payload
The next function is GetDocumentFunction, which is triggered by an API Gateway event.
The API Gateway setup is defined in the template.yaml. We deploy an API Gateway lambda proxy integration in which GetDocumentFunction is triggered when a GET request is made to the /document path.
Before we deploy this API Gateway endpoint I want to test the GetDocumentFunction. For that I have to create a valid event which can trigger this function. The following command creates a valid JSON file which we can use as a fake API Gateway event to trigger the lambda:
$ sam local generate-event apigateway aws-proxy --method GET --path document --body "" > local-event.json
Unfortunately, SAM CLI has no support (yet?) for queryStringParameters during event generation, so we have to add them manually in local-event.json. Remember, we saw a document with documentId 1044 and versionId v_1 in DynamoDB, so we can use these as valid parameter values:
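A hypothetical edit to the queryStringParameters block in local-event.json could look like this (note that API Gateway delivers all query string values as strings):

```json
"queryStringParameters": {
  "documentId": "1044",
  "versionId": "v_1"
}
```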
Now test the GetDocumentFunction by invoking it with our event:
$ sam local invoke GetDocumentFunction --event local-event.json --parameter-overrides ParameterKey=Environment,ParameterValue=local ParameterKey=DDBTableName,ParameterValue=documentTable --docker-network sam-demo
Set Up Local API Gateway Endpoint With Lambda Proxy Integration
Employees of the company will use a static website to perform a valid GET request to the API Gateway. Employees fill in the documentId and versionId of the document they need. The API Gateway forwards the request to our GetDocumentFunction, which uses the queryStringParameters to query the DynamoDB table for the correct location.
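As with the loader, the real function is in the repository; a minimal sketch of the lookup might look like this (the “location” attribute and the environment variable names are assumptions):

```python
import json
import os

def parse_key(event):
    # Extract the composite primary key from an API Gateway proxy event.
    # documentId is numeric (type N) and versionId is a string (type S).
    params = event.get("queryStringParameters") or {}
    return {"documentId": int(params["documentId"]), "versionId": params["versionId"]}

def handler(event, context):
    import boto3  # imported lazily; only needed inside the lambda runtime
    endpoint = "http://dynamodb:8000" if os.environ.get("ENVIRONMENT") == "local" else None
    table = boto3.resource("dynamodb", endpoint_url=endpoint).Table(os.environ["DDB_TABLE_NAME"])
    item = table.get_item(Key=parse_key(event)).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "document not found"})}
    # "location" is an assumed attribute name for the storage location
    return {"statusCode": 200, "body": json.dumps({"location": item["location"]})}
```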
Start our local API Gateway endpoint:
$ sam local start-api --parameter-overrides ParameterKey=Environment,ParameterValue=local ParameterKey=DDBTableName,ParameterValue=documentTable --docker-network sam-demo
The repository contains a basic static webpage as a visualization. The employees use this page to talk to the API Gateway, and the API Gateway talks to the back end, which is our GetDocumentFunction. Submitting the form performs a GET call to the local /document endpoint with documentId and versionId as query string parameters.
Now click submit query. This will trigger our lambda.
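The GET call behind the submit button can also be reproduced from a script. This sketch assumes SAM’s default port 3000 for sam local start-api (adjust the base URL if you passed --port):

```python
import urllib.parse

def document_url(document_id, version_id, base="http://127.0.0.1:3000"):
    # Build the GET URL for the local /document endpoint
    # (port 3000 is the default for `sam local start-api`).
    query = urllib.parse.urlencode({"documentId": document_id, "versionId": version_id})
    return "%s/document?%s" % (base, query)

# Fetch the document location (requires the local API to be running):
# import json, urllib.request
# print(json.loads(urllib.request.urlopen(document_url(1044, "v_1")).read()))
```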
So far I’ve shown how to build and test our stack locally. After making changes to your template or lambdas, run the sam build command again and re-invoke or redeploy your resources.
That’s it! If you have an AWS account available you can go to the next step.
Deploy SAM stack to AWS (Optional)
Now I’ll show you how easy it is to deploy this stack to AWS.
We already ran the sam build command, which created our deployment artifacts in the .aws-sam/build directory. sam deploy packages and deploys our stack. We have to specify an S3 bucket to which we upload our deployment artifacts. Note that we’re now pointing to aws as the environment instead of local, so our lambdas know that they have to connect to AWS’s DynamoDB endpoint instead of our local endpoint.
$ sam deploy --template-file .aws-sam/build/template.yaml --s3-bucket lvthillo-sam-upload-bucket --parameter-overrides ParameterKey=Environment,ParameterValue=aws ParameterKey=DDBTableName,ParameterValue=documentTable --stack-name aws-lambda-sam-demo --capabilities CAPABILITY_NAMED_IAM
After a successful deployment we should first run LoadDataFunction to fill our DynamoDB table. Just create an empty test event in the Lambda console and execute the function:
The correct data is available in DynamoDB:
Now we can test our API Gateway lambda proxy integration again:
Everything works! At last, we can deploy our static website to an S3 bucket created by our stack. The template generates a bucket with a random name, configured as a static website; in this demo the bucket is public-static-site-bucket.
We just have to update our static application to point to the correct API Gateway URL. To find this URL you can go to CloudFormation. Search for your stack and check the outputs. The stack is configured to output our API Gateway endpoint.
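If you prefer scripting this lookup, a small boto3 sketch can read the stack output. The output key name ApiEndpoint below is a guess; check the Outputs section of your template for the actual key:

```python
def output_value(outputs, key):
    # Pick one value out of a CloudFormation stack's Outputs list.
    return next(o["OutputValue"] for o in outputs if o["OutputKey"] == key)

def api_endpoint(stack_name="aws-lambda-sam-demo", output_key="ApiEndpoint"):
    # output_key "ApiEndpoint" is an assumed name; verify it in template.yaml.
    import boto3  # only needed when actually querying AWS
    stack = boto3.client("cloudformation").describe_stacks(StackName=stack_name)["Stacks"][0]
    return output_value(stack.get("Outputs", []), output_key)
```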
To test quickly we can use curl with the correct (or incorrect) parameters:
$ curl "https://c8517wsgol.execute-api.eu-west-1.amazonaws.com/v1/document?documentId=1044&versionId=v_1"
Now we can update our static application with this URL:
Now that we’ve updated the URL we can upload the files to our S3 bucket. Be sure to make the objects public during or after the upload.
Visit your S3 bucket as a static website!
The correct URL is also output in the CloudFormation console.
We discovered the magic of the AWS Serverless Application Model. We went through most of the important things you can do with it: we used a SAM template to build and test lambda functions on our local machine, and we even ran a local API Gateway. After we verified that everything worked, we deployed the exact same stack on AWS.
I hope you enjoyed it. Thank you for reading and feel free to ask any questions!