ToDo app API with Serverless Framework, Terraform, AWS CDK

Andrew Zaikin
firstlineoutsourcing
6 min read · Jul 26, 2021

In the Serverless world, there are several ways to create a backend and organize the code. So I decided to show how the same task can be solved with each of the tools in the title, to make it easy to draw parallels between them.

The task

It’s a simple backend for AWS:

  • create a todo
  • get all todos from the database
  • get one todo by ID
  • update a todo’s status
  • delete a todo

We will have 5 Lambda functions, 5 httpApi endpoints in API Gateway, and one DynamoDB table. Easy stuff 🤞.

I’ll use TypeScript with Webpack because of its strict typing, error detection at the transpile stage, interfaces, and flexible build configuration, and because I use it in my regular work.

Serverless framework

SLS has a large community and was made especially for tasks like this and for other serverless architecture solutions.

We’ll use the TypeScript template and the SLS CLI to generate a new project:

sls create --template aws-nodejs-typescript

You can find the repo with all the code here 👈

SLS used to accept only a YAML configuration file, but since last year it has also supported TS files.

The main configuration is in the serverless.ts file, where we need to describe the service, policy, plugins, and general settings.

Lifehack: TS allows importing our function definitions and putting them into the main configuration.
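
A trimmed sketch of what that can look like. The real project imports the `AWS` config type from `@serverless/typescript`; a minimal local shape and hypothetical handler paths are used here so the snippet stays self-contained:

```typescript
// Minimal local stand-in for the config shape (real projects use the
// `AWS` type from '@serverless/typescript' instead).
interface FunctionDef {
  handler: string;
  events: { httpApi: { method: string; path: string } }[];
}

// Each function definition lives in its own module and is imported
// into the main configuration.
const createTodo: FunctionDef = {
  handler: 'src/functions/todo/handler.createTodo',
  events: [{ httpApi: { method: 'post', path: '/todo' } }],
};

const serverlessConfiguration = {
  service: 'todo-sls',
  frameworkVersion: '2',
  provider: { name: 'aws', runtime: 'nodejs14.x', region: 'us-east-1' },
  functions: { createTodo /* getTodos, getTodoById, updateTodo, deleteTodo */ },
};

export default serverlessConfiguration;
```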

For this project, I used the skeleton that we at First Line Outsourcing use daily.

We need to declare a DynamoDB table and all functions for our ToDo API.

The handler points to a function exported from the handler.ts file in the src/functions/todo folder.
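
As a sketch of the shape such a handler takes (types are simplified stand-ins to keep the snippet self-contained; real code would use the event and result types from `aws-lambda` and query DynamoDB):

```typescript
// Simplified stand-ins for the aws-lambda types.
interface ApiEvent { pathParameters?: { id?: string } }
interface ApiResult { statusCode: number; body: string }

// Exported from src/functions/todo/handler.ts and referenced by the
// `handler` property in the function configuration.
export const getTodoById = async (event: ApiEvent): Promise<ApiResult> => {
  const id = event.pathParameters?.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ message: 'id is required' }) };
  }
  // A real handler would fetch the item from DynamoDB here.
  return { statusCode: 200, body: JSON.stringify({ id, status: 'open' }) };
};
```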

The SLS template uses Middy as middleware for all functions. Middy is a middleware engine that lets you focus on the pure business logic of your Lambda and attach common concerns like authentication, authorization, validation, and serialization in a modular, reusable way by decorating the main business logic.
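
With the real library, this composition reads roughly as `middy(createTodo).use(jsonBodyParser())`. The snippet below is not the `@middy/core` API but a tiny self-contained stand-in that shows the decoration idea:

```typescript
// Not the real @middy/core API: a minimal stand-in showing how a pure
// business-logic handler gets decorated with reusable middleware.
type Handler = (event: any) => Promise<any>;
type Middleware = (next: Handler) => Handler;

// Parse a JSON request body before the handler sees the event.
const jsonBodyParser: Middleware = (next) => async (event) =>
  next({ ...event, body: typeof event.body === 'string' ? JSON.parse(event.body) : event.body });

// Pure business logic, unaware of serialization concerns.
const createTodo: Handler = async (event) => ({
  statusCode: 201,
  body: JSON.stringify({ created: event.body.title }),
});

// The exported handler is the decorated composition.
export const handler = jsonBodyParser(createTodo);
```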

I use the DocumentClient from the AWS SDK to work with DynamoDB instead of Dynamoose.js because it allows much more flexible work with a table.
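
A sketch of what a data-access helper can look like. The client is typed structurally here so the snippet stays self-contained; in the real code it would be a `new DynamoDB.DocumentClient()` from `aws-sdk`, and the table name would come from an environment variable:

```typescript
// Structural stand-in for the parts of DocumentClient this sketch uses.
interface DocClient {
  get(params: { TableName: string; Key: Record<string, unknown> }): {
    promise(): Promise<{ Item?: Record<string, unknown> }>;
  };
}

// Fetch a single todo by its id, or null when it does not exist.
export const getTodoById = async (db: DocClient, tableName: string, id: string) => {
  const result = await db.get({ TableName: tableName, Key: { id } }).promise();
  return result.Item ?? null;
};
```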

Lifehack: the Serverless framework has a lot of plugins. In this project we use serverless-webpack to build the code, serverless-offline to run the service locally, and serverless-prune-plugin to remove old versions of the functions.

To deploy our service, we need one line in the terminal:

sls deploy --stage dev --config serverless.ts

As a result, we will have a list of resources and URLs for all endpoints.

Service Information
service: todo-sls
stage: dev
region: us-east-1
stack: todo-sls-dev
resources: 37
api keys:
  None
endpoints:
  POST - https://{service_id}.execute-api.us-east-1.amazonaws.com/todo
  GET - https://{service_id}.execute-api.us-east-1.amazonaws.com/todo
  GET - https://{service_id}.execute-api.us-east-1.amazonaws.com/todo/{id}
  PUT - https://{service_id}.execute-api.us-east-1.amazonaws.com/todo/{id}
  DELETE - https://{service_id}.execute-api.us-east-1.amazonaws.com/todo/{id}
functions:
  createTodo: todo-sls-dev-createTodo
  getTodos: todo-sls-dev-getTodos
  getTodoById: todo-sls-dev-getTodoById
  updateTodo: todo-sls-dev-updateTodo
  deleteTodo: todo-sls-dev-deleteTodo
layers:
  None

Lifehack: it’s possible to deploy one function separately with

sls deploy function -f funcName -s dev

That’s it! Use Postman with URLs from the output, and it should work.

Thoughts: I find the Serverless framework useful for various projects; it can handle complex serverless architectures with many resources and triggers, and the plugin system and community help to overcome almost any blocker. At the same time, it can’t manage complex general-purpose infrastructure, but that was never its goal.

Terraform

Terraform is an infrastructure-as-code tool that helps to manage all our resources and set them up in AWS. From my point of view, its key features in a serverless context are flexible control over all resources and the ability to update only the affected parts of the infrastructure.

You can find the repo with all the code here 👈

We need to install the Terraform CLI from here and create a main.tf file with the Lambda functions, log groups, a role, and a role policy. Here is a trimmed example.
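
The original embedded example isn’t reproduced here; a hypothetical trimmed version of one function with its role and log group could look like this (resource and bucket names are illustrative):

```hcl
resource "aws_iam_role" "lambda_exec" {
  name = "todo-terraform-lambda"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

resource "aws_lambda_function" "get_todos" {
  function_name = "todo-terraform-getTodos"
  handler       = "handler.getTodos"
  runtime       = "nodejs14.x"
  role          = aws_iam_role.lambda_exec.arn
  s3_bucket     = "todo-terraform-artifacts"
  s3_key        = "bundle.zip"

  environment {
    variables = { TABLE_NAME = var.table_name }
  }
}

resource "aws_cloudwatch_log_group" "get_todos" {
  name              = "/aws/lambda/${aws_lambda_function.get_todos.function_name}"
  retention_in_days = 14
}
```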

You can notice variables in this example like

var.table_name

We can use a variables.tf file to describe variables with default values; Terraform will then use either the defaults or the values passed as CLI parameters.
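
For example, a hypothetical variables.tf might declare the table name like this, overridable with `terraform apply -var="table_name=my-todos"`:

```hcl
variable "table_name" {
  description = "Name of the DynamoDB table for todos"
  type        = string
  default     = "todo-terraform"
}

variable "region" {
  description = "AWS region to deploy into"
  type        = string
  default     = "us-east-1"
}
```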

We need a file for the DynamoDB table. Let’s call it dynamo-db.tf.
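
Again as a hypothetical sketch, the table definition might be:

```hcl
resource "aws_dynamodb_table" "todo" {
  name         = var.table_name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}
```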

The main part is in the api_gateway.tf file, where we need to describe all the resources for the API endpoints.

What we need:

  • aws_apigatewayv2_api to create our API
  • aws_cloudwatch_log_group to see what happens during requests
  • aws_apigatewayv2_stage to deploy our API and configure its settings
  • aws_lambda_permission + aws_apigatewayv2_integration + aws_apigatewayv2_route for each route, where we attach a Lambda as the integration

Important: make sure you use a wildcard in source_arn to allow the Lambda to be invoked from any stage, or you will get a 500 status in the response.

source_arn = "${aws_apigatewayv2_api.todo_terraform.execution_arn}/*/*/*"
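
Putting those pieces together for one route (resource names are illustrative; the wildcard ARN from above is included):

```hcl
resource "aws_apigatewayv2_integration" "get_todo_by_id" {
  api_id                 = aws_apigatewayv2_api.todo_terraform.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.get_todo_by_id.invoke_arn
  payload_format_version = "2.0"
}

resource "aws_apigatewayv2_route" "get_todo_by_id" {
  api_id    = aws_apigatewayv2_api.todo_terraform.id
  route_key = "GET /todo/{id}"
  target    = "integrations/${aws_apigatewayv2_integration.get_todo_by_id.id}"
}

resource "aws_lambda_permission" "get_todo_by_id" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.get_todo_by_id.function_name
  principal     = "apigateway.amazonaws.com"
  # Wildcard so any stage/method/path of this API may invoke the function.
  source_arn    = "${aws_apigatewayv2_api.todo_terraform.execution_arn}/*/*/*"
}
```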

Lifehack: In the stage resource, you can flexibly set up a format of logs.

Alright, let’s use Terraform CLI to init our backend and apply all described resources to it.

terraform init
terraform apply

CLI will inform you what it’s going to do and ask for your confirmation.

There is one issue here: we forgot about our TS files. I’m going to reuse the files from the SLS project and build them with webpack, which requires some adjustments in the webpack config file. The SLS webpack plugin builds a JS file, copies the required dependencies, and zips everything into a bundle. We then need to upload this zip file to our S3 bucket for all the Lambda functions.

Important: when you want to update your code, you need to re-attach the new file to the Lambda functions so they pick up the change. I use a random hash as the name of the sub-folder.
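
In Terraform terms that can look like the following (the bucket name and variable are hypothetical); because the s3_key changes on every build, Terraform sees the function as modified and updates its code:

```hcl
variable "bundle_hash" {
  description = "Random hash generated per build; used as the S3 sub-folder"
  type        = string
}

resource "aws_lambda_function" "get_todos" {
  function_name = "todo-terraform-getTodos"
  handler       = "handler.getTodos"
  runtime       = "nodejs14.x"
  role          = aws_iam_role.lambda_exec.arn
  s3_bucket     = "todo-terraform-artifacts"
  # A new hash per build forces Terraform to re-deploy the function code.
  s3_key        = "${var.bundle_hash}/bundle.zip"
}
```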

I suggest using a bash file for all the commands.

When it’s done, we can use Postman to check it. You can find the stage URL in API Gateway or print it in the terminal with a Terraform output.

Thoughts: it’s a great tool for managing large infrastructures, but it takes much more time to build something not especially complex.

AWS CDK

The AWS Cloud Development Kit is a framework for defining cloud application resources using familiar programming languages. Again, the authors did a great job! It’s a straightforward and useful tool for building AWS architectures with, for example, TypeScript. In addition, there are related projects that let developers coming from Terraform or Kubernetes work in the same way.

You can find the repo with all the code here 👈

We need to install CDK with npm to start working on the project

npm install -g aws-cdk

Then we can use CLI to scaffold a project structure

cdk init --language typescript

In the docs, you can find examples in many languages and explanations of how everything should work. Unfortunately, at the time of writing, I couldn’t find an example of working with the httpApi. I did find Borislav Hadzhiev’s blog, which has many useful, practical articles.

Important: All AWS Construct Library modules used in your project must be the same version.

CDK project folder structure

We need todo.cdk.ts just to init the TodoCdkStack; everything else lives in todo.cdk-stack.ts.

It’s super easy:

  • define a DynamoDB table
  • define lambdas
  • provide the lambdas read/write access to DB items
  • define httpApi
  • define routes with integrations
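
A trimmed sketch of one route in todo.cdk-stack.ts. Module and construct names follow CDK v1, when httpApi lived in the experimental @aws-cdk/aws-apigatewayv2 packages; identifiers are illustrative, and all modules must share the same version:

```typescript
import * as cdk from '@aws-cdk/core';
import * as dynamodb from '@aws-cdk/aws-dynamodb';
import * as lambda from '@aws-cdk/aws-lambda-nodejs';
import { HttpApi, HttpMethod } from '@aws-cdk/aws-apigatewayv2';
import { LambdaProxyIntegration } from '@aws-cdk/aws-apigatewayv2-integrations';

export class TodoCdkStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // 1. Define a DynamoDB table.
    const table = new dynamodb.Table(this, 'TodoTable', {
      partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
    });

    // 2. Define a Lambda (bundled via Docker, hence the requirement below).
    const getTodoById = new lambda.NodejsFunction(this, 'GetTodoById', {
      entry: 'src/functions/todo/handler.ts',
      handler: 'getTodoById',
      environment: { TABLE_NAME: table.tableName },
    });

    // 3. Grant the Lambda read/write access to the table.
    table.grantReadWriteData(getTodoById);

    // 4. Define the httpApi and 5. a route with an integration.
    const httpApi = new HttpApi(this, 'TodoHttpApi');
    httpApi.addRoutes({
      path: '/todo/{id}',
      methods: [HttpMethod.GET],
      integration: new LambdaProxyIntegration({ handler: getTodoById }),
    });

    // CfnOutput prints the API URL in the terminal after deploy.
    new cdk.CfnOutput(this, 'ApiUrl', { value: httpApi.url ?? '' });
  }
}
```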

Important: Docker is required (the CDK uses it to bundle assets such as the Lambda code).

For the first run, you need to bootstrap your account to work with CDK

cdk bootstrap aws://{accountId}/{region}

and then deploy

cdk deploy --profile flo

Lifehack: CfnOutput helps to see selected results in the terminal, like API URL.

Thoughts: I think the CDK is awesome and easy to understand. The cons: it is under active development right now, so there is a risk of hitting issues without known solutions. For example, httpApi is currently in experimental status, and the community is not yet wide enough to find answers quickly.

Conclusion

For each task, you should select a proper tool that helps you solve it quickly enough. Here we have three tools for different tasks in the same context, so let’s make sure we do a good business analysis first to determine which one to choose 😉.

If I missed something, let me know in the comments 🙂. Let’s share our experience and learn from each other 🤝.

Andrey Zaikin

Founder at
First Line Outsourcing
