Continuously deploying Serverless Applications

Sabarish Sekar
Published in Devjam
9 min read · Apr 28, 2020

Introduction:

Recently I started working on AWS serverless solutions (Lambda, API Gateway and DynamoDB) for one of my personal projects and wanted a CI/CD setup to streamline testing and deployment of the lambdas. Thanks to the projects I have been involved in, I have learned and personally experienced how CI/CD pipelines enable a faster delivery cycle. I have used Jenkins and GitLab in my professional projects, and for this project I looked around for AWS services that could help me achieve the same behaviour. In this blog I am going to take you through the AWS services (e.g. AWS CodePipeline) needed to set up a CI/CD pipeline with build, test and release phases. I will also add an approval step between the test and deploy phases so I can choose when to deploy updates to the lambda.

CI/CD pipeline using AWS services to deploy Lambda and API Gateway

Benefits of CI/CD:

With a proper Continuous Integration (CI) / Continuous Delivery (CD) pipeline in place:

A typical CI/CD workflow
  • Every commit made to the version-controlled code base is built and tested
  • Regression bugs can be caught early and fixed faster
  • Features are integrated faster
  • Deployable artifacts are created at the end of each successful run
  • Artifacts can then be deployed when deemed fit for release

It’s vital to have a CI/CD pipeline for any project to reap these benefits.

Example Application:

To set up a CI/CD pipeline we need an example application to start with :) The example application will have the following:

  • An AWS lambda that calculates the area of a circle given its radius (yes, I am keeping it really simple)
  • An API Gateway endpoint attached to the lambda, so it can be triggered with a simple REST call with the radius as a query parameter

I will also use the AWS Serverless Application Model (SAM) to create the lambda and the API Gateway resources.

CodePipeline:

AWS CodePipeline

To build the CI/CD pipeline I will use the AWS CodePipeline service. CodePipeline is a continuous delivery service for building, testing and releasing software. It enables you to model and visualise the entire release process using services such as AWS CodeCommit, AWS CodeBuild and AWS CodeDeploy. I will briefly discuss these services a little later.

CodePipeline is composed of Stages and Actions.

  • A stage is comprised of one or more actions
  • An action is a task performed on a revision in that stage
  • A group of stages combines to form the pipeline
  • Stages in the pipeline are connected via transitions

To put these into perspective let’s take our project requirement for CI/CD and break them into possible CodePipeline stages and actions.

For this pipeline, I have decided to have three stages.

Stage 1

  • Source Stage
  • Action — Actively poll the repository for new commits; if there is a new commit, check out the source code and trigger the pipeline

Stage 2

  • Build Stage
  • Action — Build and run unit tests

Stage 3

  • Approve Deploy Stage
  • Action — Manual approval action
  • Action — Deploy the lambda once the approval is given

The necessary stages have been modelled; let's start creating them.

CodePipeline can be created directly from the AWS web console or using the AWS CloudFormation API. I have used the AWS Cloud Development Kit (CDK) to define and deploy the example pipeline. CDK enables you to write infrastructure as code in your language of choice; it then compiles and generates the equivalent CloudFormation template for deployment. This itself could be a nice blog for another time :)

Here is the GitHub link to the pipeline cdk project that can be deployed to your account.

I have attached a snippet of the pipeline defined using CDK below

Pipeline defined with CDK
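As a rough idea of what such a definition looks like, here is a condensed sketch of the three stages in CDK (JavaScript flavour, CDK v1 module names). This is a fragment, not a runnable stack: the construct and action names are illustrative, and the surrounding stack class plus the `repository`, `buildProject` and `deployProject` definitions are omitted.

```javascript
// Sketch only: assumes CDK v1 packages and that `repository`,
// `buildProject` and `deployProject` are defined elsewhere in the stack.
const codepipeline = require('@aws-cdk/aws-codepipeline');
const actions = require('@aws-cdk/aws-codepipeline-actions');

const sourceOutput = new codepipeline.Artifact();
const buildOutput = new codepipeline.Artifact();

new codepipeline.Pipeline(this, 'CircleAreaPipeline', {
    stages: [
        {
            stageName: 'Source',
            actions: [new actions.CodeCommitSourceAction({
                actionName: 'CheckoutSource',
                repository,            // the circle-area-calculator CodeCommit repo
                output: sourceOutput,
            })],
        },
        {
            stageName: 'Build',
            actions: [new actions.CodeBuildAction({
                actionName: 'BuildAndTest',
                project: buildProject, // CodeBuild project using buildspec.yml
                input: sourceOutput,
                outputs: [buildOutput],
            })],
        },
        {
            stageName: 'ApproveDeploy',
            actions: [
                new actions.ManualApprovalAction({ actionName: 'Approve', runOrder: 1 }),
                new actions.CodeBuildAction({
                    actionName: 'Deploy',
                    project: deployProject, // uses buildspec_deploy.yml
                    input: buildOutput,
                    runOrder: 2,
                }),
            ],
        },
    ],
});
```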

The created pipeline should look like this

Created CodePipeline in AWS console

Source Stage:

The first step will be to version control our code so that we can keep track of our changes and incrementally push updates to it. I am going to use AWS CodeCommit for this.

AWS CodeCommit

AWS CodeCommit is a fully managed source control service from AWS for hosting git repositories (think GitHub). It provides free private repositories that are accessible only from your AWS account.

Let’s start by creating a new repository in CodeCommit named circle-area-calculator and upload the code from (Github link) to this repository.

CodePipeline supports AWS CodeCommit, GitHub, Bitbucket, S3 and a few other AWS services as source providers. Once configured, CodePipeline monitors the repository and a particular branch (I have set it to the master branch) for new commits and triggers the pipeline.

Build Stage:

The next stage in the pipeline is to build and test the code on every commit to the repo. This is where the AWS CodeBuild service comes into the picture.

AWS CodeBuild

AWS CodeBuild is a fully managed and highly scalable build service offering from AWS. Think of it as a GitLab runner / Jenkins worker node that builds your software, runs tests against it and generates the deployable artifact.

Note: It is also possible now to use CodeBuild as a worker node for your Jenkins server.

Build Definition:

Similar to .gitlab-ci.yml in GitLab or the Jenkinsfile in Jenkins, which define the pipeline and the necessary build steps for your project, CodeBuild uses a buildspec.yml file. I will create one and add it to the source repository; CodeBuild will then use it to run the build.

The name of the file is not restricted to buildspec.yml. The CodeBuild project can easily be configured to point to a build definition file with a different name. For instance, when deploying to multiple environments, it is common to have a buildspec file suffixed with the name of the respective deploy environment, like buildspec_staging.yml / buildspec_prod.yml. CodeBuild looks for a file named buildspec.yml if there is no additional configuration.

Let’s take a look at the buildspec.yml file for building and testing our application.

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install -g mocha
  build:
    commands:
      - sam build --build-dir build
      - npm install --prefix circle-calculator
      - npm run test --prefix circle-calculator

artifacts:
  files:
    - build/**/*
    - buildspec_deploy.yml

phases lists the different phases in the build process. There are a few predefined phases such as install, pre_build, build and post_build.

Each phase has a commands section that runs a set of commands. Each phase can also have a finally block with a set of commands that always runs after the commands block completes.

Phases provide a nice way of organising the set of actions in our build process. Note that the phases are optional, so there is flexibility to choose the relevant ones for your needs and organise your build steps.

In the above buildspec.yml, I am using the install and build phases.

The install phase is used to install the necessary dependencies required to build and run our tests. It’s also possible to choose from a list of programming language runtimes based on the needs of the application; AWS offers several runtimes in different versions, such as Python, Java, Go and Node.

In the build phase I will run the unit tests and use SAM build to generate build artifacts.

Finally, in the artifacts section I store the output of the SAM build command. Files listed in artifacts are shared with the next stages in CodePipeline; CodeBuild automatically stores them in an AWS S3 bucket.

Note: S3 (Simple Storage Service) is an object storage service offering from AWS. It’s highly scalable and highly durable: S3 offers 99.999999999% durability, that’s 11 nines (I hope I have 11 nines here, phew..). To give an example of S3’s scalability, until a few years ago Dropbox was leveraging S3 for its storage. That’s a lot of documents and photos stored in the cloud!

Deploy Stage:

The deploy stage will have two actions that are configured to run sequentially. The first action will be a Manual Approval which will block the next action from running until approval is given.

The next action deploys the lambda and the API Gateway. We will again use the CodeBuild service here, with the build definition in a file called buildspec_deploy.yml.

version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 10

  build:
    commands:
      - ls -al
      - find .
      - sam deploy --template-file build/template.yaml --stack-name circular-area-calculator --s3-bucket circle-area-calculator-artifacts --capabilities CAPABILITY_IAM --region eu-central-1

In this stage I am going to deploy the changes that got approved. When you look at the logs, you can see SAM creating a CloudFormation change set and then executing the changes. Once finished, it will display the generated API Gateway URL. The logs should look similar to the output below.

CloudFormation stack output after a successful run

Trying the URL in a browser, the area of the circle should be returned as the response.

Let’s make a change to our circle calculator.

I will add new functionality to our lambda to also calculate the diameter of the circle given its radius. For this, I have added an additional query parameter called type to distinguish between area and diameter.

The new change should look like this:

let response;

exports.lambdaHandler = async (event, context) => {
    const radius = event.queryStringParameters.radius;
    const type = event.queryStringParameters.type;
    let value;
    if (type === 'area') {
        value = Math.PI * Math.pow(radius, 2);
    } else if (type === 'diameter') {
        value = 2 * radius;
    }
    try {
        response = {
            'statusCode': 200,
            'body': JSON.stringify({
                type: type,
                result: value,
            })
        };
    } catch (err) {
        console.log(err);
        return err;
    }

    return response;
};

I have also pushed these changes to a new branch (New-feature-calculate-diameter) of the repo. If you would like to try it, merge this branch into master and push it to the CodeCommit repo. The pipeline will be triggered; once it reaches the approval stage, give the approval and the new changes will be deployed.

Let’s test the update by making a new request to calculate the circle’s diameter. The result should look like this.
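As a quick sanity check of the new behaviour, the two query types map to these calculations. This is a local sketch (the deployed URL is generated at deploy time, and the function name here is illustrative); `type` and `radius` would normally come from the query string.

```javascript
// Local sketch of the two supported query types.
function calculate(type, radius) {
    if (type === 'area') return Math.PI * Math.pow(radius, 2);
    if (type === 'diameter') return 2 * radius;
    return undefined; // unknown type, mirrors the handler's behaviour
}

console.log(calculate('diameter', 5)); // 10
console.log(calculate('area', 5).toFixed(2)); // 78.54
```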

Conclusion:

I have now finished setting up a CI/CD pipeline using the AWS CodePipeline and CodeBuild services to deploy the lambdas, and can reap the benefits of having CI/CD. Every commit to the repository is now built, tested and can be deployed with the push of a button. I have offered a brief overview of the AWS services used in this blog; I highly encourage you to take a look at them in detail and follow the suggested practices for best results.

Caution:

By creating the pipeline and deploying the lambda in your AWS account, you might incur charges based on the usage in your account. If you would like to deploy the solutions in this blog, please keep this in mind.

For the purpose of this blog, the pipeline uses IAM roles that are broadly scoped, with full access to S3, CloudFormation, IAM, Lambda and API Gateway. As a best practice, once you have finished experimenting with the solution, make sure to run the following commands to delete your pipeline and the CloudFormation stack created by AWS SAM for the lambda deployment.

To delete the pipeline, run the following command in the directory of the downloaded CodePipeline source code:

cdk destroy --profile code --region eu-central-1

To delete the CloudFormation stack, run the following command:

aws cloudformation delete-stack --stack-name circular-area-calculator --profile code --region eu-central-1

Thank you for reading this article! Please share your questions and valuable feedback in the comments section. If you enjoyed reading this story, clap for me by clicking the button below so other people can see it here on Medium.

I work at Sytac.io. We are a consulting company in the Netherlands; we employ around 100 developers across the country at A-grade companies like KLM, ING, ABN AMRO, TMG, Ahold Delhaize and KPMG. Together with the community we run DevJam; check it out and subscribe if you want to read more stories like this one. Alternatively, take a look at our job offers if you are seeking a great job!

Have fun exploring the cloud services. Happy Coding and Stay safe!

Developer, making pipeline green, public cloud and serverless platforms enthusiast, secretly passionate about data engineering