Building Serverless REST API Using Serverless Framework

Paul Zhao
Published in Paul Zhao Projects
Jul 21, 2020 · 15 min read

Step-by-step guide to build a serverless REST API in AWS using Serverless Framework

Before we start the project, the following tools and services must be in place.

1. An AWS account (logging in as an admin-level user rather than as the root user)

2. A code editor; Visual Studio Code is recommended

3. POSTMAN — an API development tool

4. A domain name, either purchased in AWS Route 53 or an existing one you can use for this project

Please follow the instructions below to log in to AWS as an admin-level user.

Creating a non-root user

Per AWS best practice, the root user should not be used for everyday tasks, even administrative ones. Instead, the root user is used to create your first IAM users, groups, and roles. You then securely lock away the root user credentials and use them only for the few account and service management tasks that require them.

Notes: If you would like to learn more about why the root user should not be used for day-to-day operations, and about AWS accounts in general, you can find more here.

Log in as the root user
Create a user under the IAM service
Choose programmatic access
Attach the required policies
Create the user without tags
Keep the credentials (Access key ID and Secret access key)
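
If you prefer the command line, a roughly equivalent sketch using the AWS CLI is shown below (the user name sls-admin and the AdministratorAccess policy are illustrative; attach whatever policies your project requires):

$ aws iam create-user --user-name sls-admin
$ aws iam attach-user-policy --user-name sls-admin \
    --policy-arn arn:aws:iam::aws:policy/AdministratorAccess
$ aws iam create-access-key --user-name sls-admin

The create-access-key output contains the Access key ID and Secret access key; store them safely, as the secret cannot be retrieved again later.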

As we have all prerequisites met, we move on to our project!

Creating the Backend DynamoDB Table Using the Serverless Template

First of all, let us dive into our serverless.yml file.

serverless.yml
provider:
  name: aws
  runtime: nodejs8.10
  region: us-west-2
  stage: prod
  memorySize: 128
  timeout: 5
  endpointType: regional
  environment:
    NOTES_TABLE: ${self:service}-${opt:stage, self:provider.stage}
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:PutItem
        - dynamodb:DeleteItem
      Resource: "arn:aws:dynamodb:${opt:region, self:provider.region}:*:table/${self:provider.environment.NOTES_TABLE}"

Here, the environment section defines our DynamoDB table name as the variable NOTES_TABLE, set to ${self:service}-${opt:stage, self:provider.stage}: the service name joined with the stage (the --stage CLI option if given, otherwise the provider default). With service sls-notes-backend and stage prod, for example, it resolves to sls-notes-backend-prod.

resources:
  Resources:
    NotesTable:
      Type: AWS::DynamoDB::Table
      DeletionPolicy: Retain
      Properties:
        TableName: ${self:provider.environment.NOTES_TABLE}
        AttributeDefinitions:
          - AttributeName: user_id
            AttributeType: S
          - AttributeName: timestamp
            AttributeType: N
          - AttributeName: note_id
            AttributeType: S
        KeySchema:
          - AttributeName: user_id
            KeyType: HASH
          - AttributeName: timestamp
            KeyType: RANGE
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        GlobalSecondaryIndexes:
          - IndexName: note_id-index
            KeySchema:
              - AttributeName: note_id
                KeyType: HASH
            Projection:
              ProjectionType: ALL
            ProvisionedThroughput:
              ReadCapacityUnits: 1
              WriteCapacityUnits: 1

Then, we define our DynamoDB table under resources, including a DeletionPolicy and the table Properties. AttributeDefinitions declares the attributes used as keys; KeySchema makes user_id the partition (HASH) key and timestamp the sort (RANGE) key; ProvisionedThroughput sets the read and write capacity; and GlobalSecondaryIndexes adds a note_id-index so a note can also be looked up by its note_id.
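
As a quick illustration of why the note_id-index GSI matters: the table's primary key is (user_id, timestamp), so a lookup by note_id alone has to go through the index. A hypothetical AWS CLI query (the table name and id are placeholders) might look like:

$ aws dynamodb query \
    --table-name sls-notes-backend-prod \
    --index-name note_id-index \
    --key-condition-expression "note_id = :id" \
    --expression-attribute-values '{":id": {"S": "test_user:<uuid>"}}'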

After this, we are all set.

Notes: Prior to creating our DynamoDB table, we need to provide AWS credentials to the Serverless Framework, as shown below

$ export AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXX5AE4
$ export AWS_SECRET_ACCESS_KEY=ws5ZXXXXXXXXXXXXXXXXXaOxX
$ export AWS_DEFAULT_REGION=us-east-1 ### (choose the region in which you would like to deploy your resources)

How do you generate these credentials to get access to your AWS account?

Please follow the same steps described in "Creating a non-root user" above; the Access key ID and Secret access key created there are the values exported here.

Notes: To maintain AWS best practice, the credentials will not be shown again after creation; you may download the .csv file to keep a copy.

Now, we are ready to deploy our DynamoDB table.

To install the Serverless Framework

$ npm install -g serverless

To verify Serverless installation

$ serverless --version
Framework Core: 1.75.1
Plugin: 3.6.16
SDK: 2.3.1
Components: 2.32.0

Notes: In case you encounter problems while installing the Serverless Framework, you may try the following commands. In my case, they resolved the issue.

$ npm config set strict-ssl false
$ npm config set NODE_TLS_REJECT_UNAUTHORIZED 0

Finally, we deploy our DynamoDB table

$ sls deploy
Serverless: Packaging service...
Serverless: Creating Stack...
Serverless: Checking Stack create progress...
........
Serverless: Stack create finished...
Serverless: Uploading CloudFormation file to S3...
Serverless: Uploading artifacts...
Serverless: Validating template...
Serverless: Updating Stack...
Serverless: Checking Stack update progress...
......
Serverless: Stack update finished...
Service Information
service: sls-notes-backend
stage: prod
region: us-west-2
stack: sls-notes-backend-prod
resources: 3
api keys:
None
endpoints:
None
functions:
None
layers:
None

We may also log in to the AWS console to verify the creation.

DynamoDB table

Notes: Keep in mind that DynamoDB is region specific, so you will find our DynamoDB table under us-west-2, the region set in our serverless.yml.
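
The same check can also be made from the command line, for example:

$ aws dynamodb list-tables --region us-west-2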

Setting up the GIT Repository for Source Control with CodeCommit

Here, we use Git for source control so that changes we push can automatically trigger the pipeline we build later.

In our project directory (the repo), we run git init, create the two branches master and dev, stage our files with git add, and commit with git commit -am "initial commit", as sketched below.
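
A minimal sequence of commands (assuming a fresh directory) might look like this:

$ git init
$ git checkout -b dev          # work on the dev branch first
$ git add .
$ git commit -am "initial commit"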

Now move to AWS console to create our CodeCommit

CodeCommit

Create the repo

Repo Creation

Successfully created repo

Repo created

To set our CodeCommit repo as the origin, we need to copy its clone URL

Repo

Jumping back to our command line, we set CodeCommit as the origin

$ git remote add origin https://git-codecommit.us-west-2.amazonaws.com/v1/repos/sls-backend-notes-repo
$ git remote -v
origin https://git-codecommit.us-west-2.amazonaws.com/v1/repos/sls-backend-notes-repo (fetch)
origin https://git-codecommit.us-west-2.amazonaws.com/v1/repos/sls-backend-notes-repo (push)

Then we push. Since this is the very first push to origin dev, we need to set the upstream branch, as shown below

$ git push --set-upstream origin dev
Enumerating objects: 15, done.
Counting objects: 100% (15/15), done.
Delta compression using up to 8 threads
Compressing objects: 100% (14/14), done.
Writing objects: 100% (15/15), 13.50 KiB | 1.69 MiB/s, done.
Total 15 (delta 4), reused 0 (delta 0), pack-reused 0
To https://git-codecommit.us-west-2.amazonaws.com/v1/repos/sls-backend-notes-repo
* [new branch] dev -> dev
Branch 'dev' set up to track remote branch 'dev' from 'origin'.

Then, we would do the same for our master branch

$  git checkout -b master
Switched to a new branch 'master'
$ git push --set-upstream origin master
Total 0 (delta 0), reused 0 (delta 0), pack-reused 0
To https://git-codecommit.us-west-2.amazonaws.com/v1/repos/sls-backend-notes-repo
* [new branch] master -> master
Branch 'master' set up to track remote branch 'master' from 'origin'.

We can also verify our push in AWS console as shown below

Dev and master branches

Notes: If you are using a Mac, you may experience connection issues to AWS CodeCommit. For detailed instructions, please visit the following link. You may need to adjust Keychain Access.

Notes: For untracked files you do not want in Git, simply apply rm -rf <file name> to remove them (or git clean -fd to remove all untracked files and directories at once).

Now we will move on to the next step.

Setting up the CI/ CD Pipeline for Deployment Automation

First, we navigate to the AWS console to build our CodePipeline

CodePipeline

We start to create the pipeline

Pipeline creation

Pipeline settings

Pipeline settings

Add the source and select the CodeCommit repo we created previously

CodeCommit as source

Then we move to the build stage; press Create project to configure CodeBuild

Notes: CodeBuild requires elevated permissions to execute; it is recommended to give its service role admin-level permissions to ensure you are good to go

Build stage

CodeBuild page

CodeBuild page

While creating the CodeBuild project, don’t forget to add an environment variable as

name: ENV_NAME

value: prod

CodeBuild created

Next, we will leave the deploy stage empty for now, since the sls deploy step in our buildspec handles deployment

Deploy stage

Notes: The buildspec.yml file needs to be provided prior to creating the CodePipeline. Otherwise, the first build will fail.

CodePipeline failed

buildspec.yml

version: 0.2

phases:
  install:
    commands:
      - echo Installing Serverless...
      - npm install -g serverless
  pre_build:
    commands:
      - echo Install source NPM dependencies...
      - npm install
  build:
    commands:
      - echo Deployment started on `date`
      - echo Deploying with the Serverless Framework
      - sls deploy -v -s $ENV_NAME
  post_build:
    commands:
      - echo Deployment completed on `date`

There are 4 phases in our buildspec.yml file.

First, in install, we install the Serverless Framework.

Second, in pre_build, we install the project's NPM dependencies.

Third, in build, we echo when the deployment started and then run sls deploy, deploying to the stage named by the ENV_NAME variable we set up in CodeBuild.

Lastly, in the post_build phase, we echo when the deployment completed.
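
For example, with the ENV_NAME variable set to prod as above, the build step effectively runs:

$ sls deploy -v -s prod

where -v prints verbose CloudFormation progress and -s selects the deployment stage.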

If everything goes right, you should see the CodePipeline built successfully

CodePipeline built

Creating the boilerplate Lambda Functions

We will build a folder named api in our project, as shown below, with a few files under it

Api folder

add-note.js to add a note to the DynamoDB table

update-note.js to change a note in the DynamoDB table

get-note.js to read a single note from the DynamoDB table

get-notes.js to read multiple notes from the DynamoDB table

delete-note.js to delete a note from the DynamoDB table

util.js to provide shared utility functions used by the handlers above
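
To give a flavor of what these handlers look like, here is a minimal sketch of add-note.js. It is not the exact code from the repo: the user_id/user_name headers, the epoch-seconds timestamp, and the user_id:uuid form of note_id match what we see in the POSTMAN tests later, but the shape of the note body is an assumption.

// api/add-note.js: a minimal sketch, not the exact repo code
const AWS = require('aws-sdk');
const uuid = require('uuid');

const dynamodb = new AWS.DynamoDB.DocumentClient();
const tableName = process.env.NOTES_TABLE; // injected via serverless.yml

exports.handler = async (event) => {
  try {
    const item = JSON.parse(event.body).Item; // body shape is an assumption
    item.user_id = event.headers.user_id;
    item.user_name = event.headers.user_name;
    item.note_id = item.user_id + ':' + uuid.v4(); // e.g. test_user:<uuid>
    item.timestamp = Math.floor(Date.now() / 1000); // epoch seconds (sort key)

    await dynamodb.put({ TableName: tableName, Item: item }).promise();

    return { statusCode: 200, body: JSON.stringify(item) };
  } catch (err) {
    return { statusCode: 500, body: JSON.stringify({ error: err.message }) };
  }
};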

We then update the plugins, iamRoleStatements, and functions sections of our serverless.yml file, as sketched below.
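
The sketch below shows what these additions might look like. The handler names are assumptions, but the HTTP events match the routes that sls offline prints in the next step:

plugins:
  - serverless-offline

functions:
  add-note:
    handler: api/add-note.handler
    events:
      - http:
          path: note
          method: post
  update-note:
    handler: api/update-note.handler
    events:
      - http:
          path: note
          method: patch
  get-notes:
    handler: api/get-notes.handler
    events:
      - http:
          path: notes
          method: get
  get-note:
    handler: api/get-note.handler
    events:
      - http:
          path: note/n/{note_id}
          method: get
  delete-note:
    handler: api/delete-note.handler
    events:
      - http:
          path: note/t/{timestamp}
          method: delete

If the serverless-offline plugin is not yet installed, npm install --save-dev serverless-offline adds it to the project.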

Testing the REST API Locally using the Serverless Offline Plugin

$ sls offline
Serverless: Starting Offline: prod/us-west-2.
Serverless: Routes for add-note:
Serverless: POST /note
Serverless: Routes for update-note:
Serverless: PATCH /note
Serverless: Routes for get-notes:
Serverless: GET /notes
Serverless: Routes for get-note:
Serverless: GET /note/n/{note_id}
Serverless: Routes for delete-note:
Serverless: DELETE /note/t/{timestamp}
Serverless: Offline listening on http://localhost:3000

To test in our local environment, you will need POSTMAN.

Let's first create a note.

Choose the POST method and paste in our local URL http://localhost:3000/note

Post function

Save this POST request as a new collection named SLS Notes Backend API

Save function as a new collection

Upon sending, we receive an error, as shown below, because the user_id and user_name headers are missing; provide them with the values test_user and Test User

Error after sending

Notes: Choose raw and JSON under Body

User_id and user_name provided as headers

After that, we are able to send it successfully

Successful post function
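
For reference, an equivalent request can be sent without POSTMAN using curl (the note body fields here are purely illustrative):

$ curl -X POST http://localhost:3000/note \
    -H "Content-Type: application/json" \
    -H "user_id: test_user" \
    -H "user_name: Test User" \
    -d '{"Item": {"cat": "general", "title": "my note", "content": "hello"}}'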

We may also verify it in the DynamoDB table in the AWS console

DynamoDB table item

Then we create a PATCH request, as shown below, by adding the following fields to the body:

"timestamp": 1595195327,

"note_id": "test_user:d60051ff-858a-4bd2-8139-a1d5ca7b36b8",

Patch function

Also, we may verify the update in the DynamoDB table item

Updated DynamoDB table

Then, let's duplicate the request to create seven more notes (notes 2 through 8) and send them out.

Now we will get our get function ready

Create a get function

Get function creation

Back to our terminal

$ sls offline

A 404 error is returned, since we have the wrong path for our get-notes function

404

Path corrected and no error returned

Path corrected

The delete function is the next one to be tested.

First, we will create a delete request and save it

Delete function creation

Then we copy the timestamp of note 6, shown below

Note 6 to delete

In the URL section, we type the following (note the singular /note in the DELETE /note/t/{timestamp} route):

http://localhost:3000/note/t/1595195978

Then we send it

Url edit and delete

From get notes, we find note 6 is no longer available

Note 6 gone

To validate our deletion, we jump back to the DynamoDB table items in the AWS console and confirm note 6 is gone

DynamoDB note 6 gone

Next, we will test reading a single note by providing its note_id. We copy and paste the note_id of note 7 into our URL (the GET /note/n/{note_id} route) and get note 7 back as the response

Copy note 7 note_id
Response note 7
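
From the command line, the same read might look like this, with <note_id> standing in for the value copied from note 7 (include the user_id/user_name headers if your handler expects them, as the earlier requests did):

$ curl -H "user_id: test_user" -H "user_name: Test User" \
    "http://localhost:3000/note/n/<note_id>"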

Deploying the REST API to the AWS Cloud

We run git add, git commit, and git push for both the dev and master branches, as sketched below.
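
A typical sequence (the commit message is illustrative) might be:

$ git checkout dev
$ git add .
$ git commit -am "add lambda functions and API endpoints"
$ git push origin dev
$ git checkout master
$ git merge dev
$ git push origin master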

As soon as we push our master branch to AWS CodeCommit, CodePipeline automatically starts a build and deploys our stack.

Successful codepipeline creation

Then we verify the resources that were built

Lambda Functions

Lambda functions

And API Gateway

API gateway

Now let us test the get-notes function: copy the URL of the prod GET function

Url for get notes function

Paste the URL in and create a new request in POSTMAN named PROD GET /note

New function

Test it

Response received

Adding a Custom Domain to the API using the Serverless Framework

We need to update the plugins section of serverless.yml by adding

- serverless-domain-manager
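
The plugins section would then look something like this (assuming serverless-offline from the local-testing step is also listed):

plugins:
  - serverless-offline
  - serverless-domain-manager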

Also, a customDomain section is added under custom (shown in full further below)

In the meantime, you need to install serverless-domain-manager by running the command shown below

$ npm install --save-dev serverless-domain-manager

Setting up a Custom Domain using Route53

Here you may purchase a domain name or use any domain name you already own.

Domain name records

You may test your domain as well

Test domain name

Adding SSL Certificate to the Custom Domain using ACM

We now move back to AWS console and navigate to ACM certificate manager

Notes: Here we need to make sure we are in the us-east-1 region, since that is where certificates for edge locations are issued

We create a certificate by pressing Get started under Provision certificates.

Request a certificate

Request a certificate

Provide your domain name, plus a version with a wildcard at the front (*.devopspaulzhao.com) to include sub-domains

Domain names provided

DNS validation chosen

DNS validation selected

Add a tag matching the name used in our serverless file (certificateName: devopspaulzhao.com)

Tag added

Review page

Review page

Request a certificate pending

Pending stage for certificate

Notes: Keep in mind, you need to create a CNAME record for each domain name on the certificate

Certificate Manager issued
CNAME created successfully
CNAME created

Creating the API Gateway Custom Domain using the Serverless Framework

We move back to our command line and run sls create_domain to create the custom domain name in API Gateway

$ sls create_domain

Error --------------------------------------------------

Error: Error: 'devopspaulzhao.com' was not created in API Gateway.
ConfigError: Missing region in config
at /Users/paulzhao/Desktop/sls-notes-backend/sls-notes-backend/node_modules/serverless-domain-manager/index.js:108:15
at processTicksAndRejections (internal/process/task_queues.js:97:5)

For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.

Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com

Your Environment Information ---------------------------
Operating System: darwin
Node Version: 12.18.0
Framework Version: 1.75.1
Plugin Version: 3.6.16
SDK Version: 2.3.1
Components Version: 2.32.0

Notes: After running export AWS_REGION=us-west-2, the error was resolved.

Tips: Be careful about the YAML indentation in the file, as shown below. Otherwise, you may encounter errors.

custom:
  allowedHeaders:
    - Accept
    - Content-Type
    - Content-Length
    - Authorization
    - X-Amz-Date
    - X-Api-Key
    - X-Amz-Security-Token
    - X-Amz-User-Agent
    - app_user_id
    - app_user_name
customDomain: ### customDomain should be nested under custom, not parallel with it
  domainName: devopspaulzhao.com
  basePath: 'v1'
  stage: ${self:provider.stage}
  certificateName: devopspaulzhao.com
  createRoute53Record: true
It should rather be

custom:
  allowedHeaders:
    - Accept
    - Content-Type
    - Content-Length
    - Authorization
    - X-Amz-Date
    - X-Api-Key
    - X-Amz-Security-Token
    - X-Amz-User-Agent
    - app_user_id
    - app_user_name
  customDomain:
    domainName: devopspaulzhao.com
    basePath: 'v1'
    stage: ${self:provider.stage}
    certificateName: devopspaulzhao.com
    createRoute53Record: true

Now we create custom domain names

$ sls create_domain
Serverless: 'devopspaulzhao.com' was created/updated. New domains may take up to 40 minutes to be initialized.

Verify custom domain names in AWS console

Custom domain name

CodePipeline runs successfully, and an A record is created for our domain name

CodePipeline success
A record created

Live API in Action

Create an environment named DEV in POSTMAN's Environments, as shown below

DEV

Create an environment named PROD in POSTMAN's Environments, as shown below

PROD

Test {{ENV_NAME}} under PROD

PROD test

Test {{ENV_NAME}} under DEV

DEV test

Verify DynamoDB table item

DynamoDB table item

Conclusion:

As we conclude our project, let us recap what we built along the way.

We kicked off our project by building our backend DynamoDB table using the Serverless template. Then we set up CodeCommit as our remote repo, and created a CI/CD CodePipeline so that any change pushed from our local repo to CodeCommit with git triggers the pipeline to build.

After that, we set up our Lambda function boilerplate and built a number of functions, adding IAM role statements, Lambda functions, and API endpoints to the serverless.yml file.

Then we moved on to the testing stage: we tested the API in a local environment using POSTMAN prior to deploying it to AWS.

Moreover, we added custom domain resources to our serverless.yml file and created the API Gateway custom domain with the Serverless Framework command line, which is preferred over the AWS console for its simplicity and speed. We also added an SSL certificate to our custom domain using ACM for better security.

Finally, we tested API in action.

Throughout our project, we witnessed the power of deploying with the Serverless Framework. From the command line, we avoided jumping from one AWS service to another and sped up the whole build process. We also saw the importance of testing, since typos and other human errors creep in during the build process; with testing in place, an error-free deployment to production is achieved with ease.

Paul Zhao
Paul Zhao Projects

Amazon Web Services Certified Solutions Architect Professional & DevOps Engineer