AWS Lambda + FastAPI (Serverless Deployment): Complete CI/CD Pipeline Using GitHub Actions

Azzan Amin
TheLorry Data, Tech & Product
16 min read · Jan 1, 2021

Let’s build a complete CI/CD workflow using GitHub Actions, FastAPI, AWS Lambda (Serverless Deployment) and AWS S3.

Photo by Manasvita S on Unsplash

Are you burdened with mundane, boring, and repetitive tasks that delay bringing your magical software product to production? As software/DevOps engineers focused on bringing our code to life, we all face the same problem: we need to perform tasks that are boring, repetitive, and sometimes a total waste of time. For instance, one of them is the process of running unit tests in various environments and then deploying our software to various delivery platforms. These processes take longer than they should and surely make our day at work inefficient and unproductive. Have you ever wondered:

  • How can I automate all these boring repetitive testing and deployment processes?
  • Do we have any kind of method that could solve these problems?

Don't worry, you're not alone.

I thought about it all the time, until I realized that there is a simple solution to these problems. It is an amazing practice in the software development world called Continuous Integration and Continuous Deployment (CI/CD).

CI/CD is one of the best practices in software development, as it cuts out the repetitive work of unit testing and deploying software.

This practice helps developers efficiently automate all the steps required to run tests on the server. Plus, it can continuously deploy the application to the delivery platform once it has passed the automated tests. And yes, it is fully automated.

The good news is, all of these processes can be achieved by using a feature on GitHub called GitHub Actions!

In this article, we'll walk you through building a complete CI/CD workflow for FastAPI using GitHub Actions, and we'll deploy the API to AWS Lambda.

Let’s Dive in!

Table of Contents:

  1. Create a GitHub Repository
  2. Clone your repository to local machine
  3. Setup Virtual Environment
  4. Install the Required Dependencies
  5. Run the API in Local Machine
  6. Run Unit Test in Local Machine
  7. Update requirements.txt
  8. Create GitHub Actions Workflow Directory
  9. The Components of GitHub Actions
  10. Continuous Integration (CI): Build Automated Test
  11. Configuring GitHub Secrets, Amazon S3 and AWS Lambda
  12. Continuous Deployment (CD): Deploy Lambda
  13. Running the GitHub workflow

1. Create a GitHub Repository

Create a new repository in GitHub for the project.

Setup repository in GitHub

In this part, assign a relevant repository name. You'll also need to add a .gitignore file; since we are using Python (FastAPI), go ahead and select the Python .gitignore template.

2. Clone your repository to your local machine

In order to clone your repository, you'll need the link to your GitHub repository.

Clone GitHub Repository

From your repository page on GitHub, click the green button labeled Code, and in the “Clone with HTTPs” section, copy the URL for your repository. Once you have copied the link, you can now clone the project to your local machine.

Open your bash shell and change your current working directory to the location where you want to clone your repository. Cloning is simple:

git clone https://github.com/URL-TO-REPO-HERE

Nice! You have successfully cloned your GitHub repository.

3. Setup Virtual Environment

To follow the steps below, you need Python installed on your machine. If you do not have it installed yet, please check this out: Python 3 Installation & Setup Guide — Real Python

Once you have Python 3 installed, create a virtual environment inside your local project directory. Open a terminal on your local machine and run the following commands:

  1. Install the virtualenv package

You can install the package by using pip.

pip install virtualenv

2. Create the virtual environment

To create a virtual environment, you must specify a path for it. For example, to create one in the local directory called venv, type the following:

virtualenv venv

3. Activate the virtual environment

You can activate the python virtualenv by running the following command:

  • Mac OS / Linux
source venv/bin/activate
  • Windows
venv\Scripts\activate
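If you're not sure the activation worked, here is a quick sanity check you can run from the activated interpreter (a stdlib-only sketch, not part of the project code): inside a venv, sys.prefix points at the environment while sys.base_prefix keeps the base installation path.

```python
import sys

def in_virtualenv() -> bool:
    # Inside a venv/virtualenv, sys.prefix points at the environment,
    # while sys.base_prefix still points at the base Python install.
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print(in_virtualenv())
```

Running this inside the activated venv should print True; from the system interpreter it prints False.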

4. Install the Required Dependencies

If you want to follow along with the walkthrough, this GitHub repository contains the code for this project. Please feel free to use and follow the code there.

To install all the libraries for the project, make sure you are in the root of the project and run the following command:

pip install -r requirements.txt

5. Run the API in Local Machine

The sample project we created in this walkthrough is based on FastAPI. FastAPI is a modern, fast (high-performance) web framework for building APIs with Python 3.6+ based on standard Python type hints. If you're interested in learning more about this cool framework, you can read this article written by its awesome author, Sebastián Ramírez. Take it from us, FastAPI 🤩 is the most efficient way of creating APIs in Python.

In order to run the FastAPI app, we can use Uvicorn ASGI server to start the app in our terminal. If you downloaded or cloned the code from GitHub (as mentioned in Point 4), you can follow the commands below.

Alright, let's change our current working directory to the app folder. Then, start the uvicorn server (hypercorn is another alternative):

cd app && uvicorn main:app --reload

Once we run it, we should be able to access the Swagger UI in our browser. Visit the link:

http://localhost:8000/docs
Simple FastAPI app CI/CD workflow using GitHub Actions

6. Run Unit Tests in Local Machine

In the project, we have a few unit tests that check whether the endpoints are working correctly. The unit test file is inside the app/tests folder and is named test_main.py. To learn more about why unit tests are important and how to write them, you can follow our other article here.

from fastapi.testclient import TestClient
from main import app

client = TestClient(app)


def test_main_resource():
    response_auth = client.get("/")
    assert response_auth.status_code == 200


def test_child_resource():
    response_auth = client.get("/api/v1/test")
    assert response_auth.status_code == 200

To run the unit tests, open the terminal and type:

pytest

Output in console:

Great, we have passed all the unit tests! Please note that the pytest library is required to run the tests.

7. Update requirements.txt

In case you have installed any new libraries or packages in the project, it's a good practice to update your requirements.txt by running the following freeze command:

pip freeze > requirements.txt

This is just to ensure we have the latest packages listed in our requirements.txt file, so that we can avoid complications when running the project on a server or any other machine.

8. Create GitHub Actions Workflow Directory

To create the CI/CD workflow in GitHub Actions, we need to create a .yml file in our repository. From our application root, create a folder named .github/workflows that will contain the GitHub action workflows.

Then, create main.yml inside the created folder (just an example — you can use any name you like for the .yml file). This file will contain all the instructions for our automated tests and the deployment process to AWS Lambda, driven by the code in our GitHub repository. You can use the commands below in the terminal as a guide:

cd path/to/root_repo
mkdir -p .github/workflows
touch .github/workflows/main.yml

Cool! Now we have the directory for GitHub Actions workflow.

9. Understanding the GitHub Actions Workflow

There are six main components in GitHub Actions:

  • Workflow — The automated procedure that we add to our repository; it can be triggered or scheduled by an event.
  • Events — A specific activity that triggers a workflow, for example when someone pushes a commit to a repository or opens a pull request.
  • Jobs — A series of steps that execute on the same runner. Jobs can run in parallel or sequentially, depending on our workflow objectives.
  • Steps — An individual task that runs commands in a job.
  • Actions — A set of standalone commands that gets executed on the runner; actions are combined into steps to create a job.
  • Runners — A server, hosted by GitHub or by yourself (self-hosted), that runs the workflow's commands. GitHub-hosted runners are based on Microsoft Windows, Ubuntu Linux, and macOS.

The workflow that we are going to build will consist of two main jobs:

  • Continuous Integration (CI)

The CI job will run the automated tests, package our FastAPI app into a Lambda deployment package, and upload the Lambda artifact to the GitHub server so that other jobs (in our case, the Continuous Deployment job) can use it.

  • Continuous Deployment (CD)

This job will only be executed when the CI job has completed successfully. The CD job needs to depend on the status of the CI job to make sure we only deploy the application once it has passed the CI part. Basically, this job downloads the Lambda artifact that was uploaded during the CI job and deploys it to AWS Lambda via AWS S3.

So our workflow will look like this in the .yml file.

name: CI/CD Pipeline

on:
  push:
    branches: [ main ]

jobs:

  continuous-integration:
    ....

  continuous-deployment:
    .....

For the detailed steps and commands inside each job, please stay along with me, okay?

Alright, let’s get our hands dirty by creating the workflow using GitHub Actions.

10. Continuous Integration (CI): Build Automated Test and Package Lambda

Here is the complete CI workflow in our main.yml file.

name: CI/CD Pipeline

on:
  push:
    branches: [ main ]

jobs:

  continuous-integration:
    runs-on: ubuntu-latest

    steps:
      # Step 1
      - uses: actions/checkout@v2

      # Step 2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.7
          architecture: x64

      # Step 3
      - name: Install Python Virtual ENV
        run: pip3 install virtualenv

      # Step 4
      - name: Setup Virtual env
        uses: actions/cache@v2
        id: cache-venv
        with:
          path: venv
          key: ${{ runner.os }}-venv-${{ hashFiles('**/requirements*.txt') }}
          restore-keys: |
            ${{ runner.os }}-venv-

      # Step 5
      - name: Activate and Install Dependencies into Virtual env
        run: python -m venv venv && source venv/bin/activate && pip3 install -r requirements.txt
        if: steps.cache-venv.outputs.cache-hit != 'true'

      # Step 6
      - name: Activate venv and Run Test
        run: . venv/bin/activate && pytest

      # Step 7
      - name: Create Zipfile archive of Dependencies
        run: |
          cd ./venv/lib/python3.7/site-packages
          zip -r9 ../../../../api.zip .

      # Step 8
      - name: Add App to Zip file
        run: cd ./app && zip -g ../api.zip -r .

      # Step 9
      - name: Upload zip file artifact
        uses: actions/upload-artifact@v2
        with:
          name: api
          path: api.zip

Let us break it down and have a look at each part of the workflow:

  • The name assigned to this workflow is CI/CD Pipeline.
  • The workflow is triggered whenever commits are pushed to the main branch of the repository.
  • The job defined in this workflow is continuous-integration.
  • The runner used in the workflow is ubuntu-latest (the Ubuntu Linux operating system).

These are the sequential series of steps defined in the CI workflow:

  • Step 1: Perform actions/checkout@v2, which checks out our repository and downloads it to the runner.
  • Step 2: Set up Python 3.7 using actions/setup-python@v2.
  • Step 3: Install the Python virtual environment (virtualenv) package.
  • Step 4: Cache dependencies using actions/cache@v2. This step speeds up workflows with many large dependencies by cutting down the time spent re-downloading them. For further explanation, please read the GitHub Actions documentation.
  • Step 5: Activate the virtualenv and install all the dependencies listed in requirements.txt.
  • Step 6: Run the unit tests. By the way, we need to activate the virtualenv again before running the tests, as GitHub Actions doesn't preserve the environment between steps.
  • Step 7: Package our Lambda by zipping all the dependencies in the venv's site-packages and placing the archive in our root directory. The zip file is named api.zip.
  • Step 8: Add the contents of our app folder to api.zip.
  • Step 9: Upload api.zip to the GitHub server as an artifact using actions/upload-artifact@v2. This enables the next job (in our case, Continuous Deployment) to retrieve the artifact, our api.zip Lambda package.
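Steps 7 and 8 can also be reproduced locally if you want to inspect the package before pushing. Here is a rough Python equivalent of the two zip commands (a stdlib sketch with illustrative paths, not part of the workflow itself): dependencies and app code both end up at the root of the archive.

```python
import zipfile
from pathlib import Path

def package_lambda(site_packages: Path, app_dir: Path, out_zip: Path) -> None:
    """Mirror Steps 7-8: zip the venv's site-packages, then append the
    app code, with both sets of files at the root of the archive."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in site_packages.rglob("*"):
            if f.is_file():
                zf.write(f, f.relative_to(site_packages))  # dependencies at root
        for f in app_dir.rglob("*"):
            if f.is_file():
                zf.write(f, f.relative_to(app_dir))  # app code at root
```

Because main.py ends up at the archive root rather than inside an app/ folder, Lambda can later import it directly as the main module.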

Phew! We have completed our CI workflow using GitHub Actions. Let's make some minor changes in our code and try to push them to the main branch. Make sure you're using the CI code above in the main.yml file to test the CI workflow. This is how the result will look when the workflow is triggered by our push event.

CI workflow GitHub Actions Result

You will see this output by going inside the Actions tab in your repository.

GitHub Actions Tab

Continuous Integration (CI) Done! ✔️

Let’s quickly move onto the Continuous Deployment part now. We are almost there!

11. Configuring GitHub Secrets, Amazon S3 and AWS Lambda

Before we create the CD workflow for our project, five things need to be done:

  • Add GitHub Secrets
  • Create S3 Bucket
  • Create a Lambda Function
  • Implement Mangum Handler
  • Update Lambda handler

1. Add GitHub Secrets

GitHub Secrets are used to store confidential information. In our case, we need to store our AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION.

To add the secrets, click on Settings on the repository page, then select Secrets from the menu on the left. Click the New repository secret button on the top right. This is the example output in GitHub after we have added the secrets.

GitHub Secrets

After we have added the secrets in our GitHub repository, we can now use the secrets variable in our main.yml workflow file.

2. Create Amazon S3 Bucket

Go to the AWS Management Console and log in with your AWS account. Then, proceed to Amazon S3 in the console and click on Create bucket.

AWS S3

Just give the bucket any name and click on Create bucket. The diagram below shows how to create the S3 bucket:

Create S3 Bucket Demo

Done, you have successfully created the S3 bucket. Let’s create our Lambda function then.

3. Create a Lambda Function

Go to the AWS Management Console and log in with your AWS account. Then, proceed to AWS Lambda in the console and click on Create function.

AWS Lambda

Then, choose Author from scratch and select Python 3.7 as the runtime. You will need to choose or create an execution role before creating the function. Once that's done, click on Create function.

Create Lambda Function

4. Implement Mangum Handler

In order to deploy our FastAPI app as a Lambda function in AWS, we need to use the Mangum library to wrap our API. Why do we need Mangum?

  • Mangum acts as an adapter between API Gateway and our application: it handles the API Gateway requests routed to our Lambda function and manages the responses sent back from the Lambda function to API Gateway.

To apply Mangum in our code, we just need to add the following code:

from mangum import Mangum
...
handler = Mangum(app=app)

In our main.py, this is how we implement Mangum, which provides the AWS Lambda handler:

from fastapi import FastAPI
from mangum import Mangum  # <---------- import Mangum library

from api.v1.api import router as api_router

app = FastAPI(title='Serverless Lambda FastAPI')

app.include_router(api_router, prefix="/api/v1")


@app.get("/", tags=["Endpoint Test"])
def main_endpoint_test():
    return {"message": "Welcome CI/CD Pipeline with GitHub Actions!"}


handler = Mangum(app=app)  # <----------- wrap the API with Mangum

5. Update Lambda Handler

For this part, we need to update the lambda handler in AWS Lambda Runtime settings.

By default, the handler is set to lambda_function.lambda_handler, and we need to update this value to match our FastAPI handler.

AWS Lambda Runtime Settings

Under Runtime Settings, click on Edit. Then, update the handler to main.handler (meaning Lambda will use the handler variable, our Mangum handler, inside main.py).

Update Lambda handler
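Conceptually, the handler string works like a dotted import path: everything before the last dot names the module, and the part after it names the attribute to call. A stdlib sketch of that resolution (illustrative only; the actual Lambda runtime does more than this):

```python
import importlib

def resolve_handler(handler_string: str):
    """Resolve a Lambda-style handler string such as 'main.handler':
    import the module named before the last dot, then fetch the
    attribute named after it."""
    module_name, _, attr = handler_string.rpartition(".")
    module = importlib.import_module(module_name)
    return getattr(module, attr)
```

So main.handler tells Lambda to import main.py from the archive root and invoke its handler object, which is our Mangum-wrapped app.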

12. Continuous Deployment (CD): Deploy Lambda

Let’s continue with our GitHub Actions workflow. So, here is the complete CD workflow in our main.yml file.

continuous-deployment:
  runs-on: ubuntu-latest
  needs: [continuous-integration]
  if: github.ref == 'refs/heads/main'

  steps:
    # Step 1
    - name: Install AWS CLI
      uses: unfor19/install-aws-cli-action@v1
      with:
        version: 1
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

    # Step 2
    - name: Download Lambda api.zip
      uses: actions/download-artifact@v2
      with:
        name: api

    # Step 3
    - name: Upload to S3
      run: aws s3 cp api.zip s3://<YOUR_S3_BUCKET_NAME>/api.zip
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

    # Step 4
    - name: Deploy new Lambda
      run: aws lambda update-function-code --function-name <YOUR_LAMBDA_FUNCTION_NAME> --s3-bucket <YOUR_S3_BUCKET_NAME> --s3-key api.zip
      env:
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

Here is the explanation:

  • The job defined in this workflow is continuous-deployment.
  • The runner used in the workflow is ubuntu-latest (the Ubuntu Linux operating system).
  • This job only runs when the continuous-integration build succeeds. This is achieved with needs: [continuous-integration], placed after the runner has been defined.
  • Check that the current branch is main by using if: github.ref == 'refs/heads/main'

These are the sequential series of steps defined in the CD workflow:

  • Step 1: Install the AWS CLI on the runner using unfor19/install-aws-cli-action@v1.
  • Step 2: Download the api.zip artifact from the GitHub server using actions/download-artifact@v2. The artifact name defined in this step must be the same as the name used when uploading the artifact. In our case, the artifact name is api.
  • Step 3: Upload api.zip to the Amazon S3 bucket we created.
  • Step 4: Deploy the api.zip uploaded to S3 into the AWS Lambda function we created.
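Before relying on Step 4, it can be worth sanity-checking the artifact locally: the handler module must sit at the archive root, and the unzipped package must stay under Lambda's size limit (250 MB unzipped at the time of writing; please verify against the current AWS documentation). A stdlib sketch of such a check (our own addition, not part of the workflow):

```python
import zipfile
from pathlib import Path

# Documented Lambda limit at the time of writing; re-check the AWS docs.
MAX_UNZIPPED_BYTES = 250 * 1024 * 1024

def check_lambda_artifact(zip_path: Path, handler_module: str = "main.py") -> bool:
    """Return True when the handler module is at the archive root and
    the total unzipped size is within Lambda's deployment limit."""
    with zipfile.ZipFile(zip_path) as zf:
        names = zf.namelist()
        unzipped = sum(info.file_size for info in zf.infolist())
    return handler_module in names and unzipped < MAX_UNZIPPED_BYTES
```

Catching an oversized or mis-packaged zip locally is much faster than waiting for a failed update-function-code call in the pipeline.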

P.S.: For each step that uses the AWS CLI, we need to include the AWS secret keys as environment variables. These are the GitHub Secrets we stored earlier. This is how it is done in the GitHub Actions .yml file:

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

So, this is how a successful build will look for the CD workflow once it is triggered by a push event.

CD workflow GitHub Actions

Continuous Deployment (CD) workflow is done! ✔️

13. Running the GitHub Actions CI/CD workflow

Here is the complete CI/CD workflow in our main.yml file:

name: CI/CD Pipeline

on:
  push:
    branches: [ main ]

jobs:

  continuous-integration:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.7
          architecture: x64

      - name: Install Python Virtual ENV
        run: pip3 install virtualenv

      - name: Setup Virtual env
        uses: actions/cache@v2
        id: cache-venv
        with:
          path: venv
          key: ${{ runner.os }}-venv-${{ hashFiles('**/requirements*.txt') }}
          restore-keys: |
            ${{ runner.os }}-venv-

      - name: Activate and Install Dependencies into Virtual env
        run: python -m venv venv && source venv/bin/activate && pip3 install -r requirements.txt
        if: steps.cache-venv.outputs.cache-hit != 'true'

      # Build the app and run tests
      - name: Build and Run Test
        run: . venv/bin/activate && pytest

      - name: Create Zipfile archive of Dependencies
        run: |
          cd ./venv/lib/python3.7/site-packages
          zip -r9 ../../../../api.zip .

      - name: Add App to Zipfile
        run: cd ./app && zip -g ../api.zip -r .

      - name: Upload zip file artifact
        uses: actions/upload-artifact@v2
        with:
          name: api
          path: api.zip

  continuous-deployment:
    runs-on: ubuntu-latest
    needs: [continuous-integration]
    if: github.ref == 'refs/heads/main'

    steps:
      - name: Install AWS CLI
        uses: unfor19/install-aws-cli-action@v1
        with:
          version: 1
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

      - name: Download Lambda api.zip
        uses: actions/download-artifact@v2
        with:
          name: api

      - name: Upload to S3
        run: aws s3 cp api.zip s3://<YOUR_S3_BUCKET_NAME>/api.zip
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

      - name: Deploy new Lambda
        run: aws lambda update-function-code --function-name <YOUR_LAMBDA_FUNCTION_NAME> --s3-bucket <YOUR_S3_BUCKET_NAME> --s3-key api.zip
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

Now, let’s push some changes to our main branch to see the result of our CI/CD workflow in GitHub Actions. We should see something like this in our GitHub Actions for the latest build workflow result.

CI/CD Detailed Build Succeeded

Awesome! All jobs succeeded. This means we have completed the full CI/CD pipeline for FastAPI to AWS Lambda.

This is extremely powerful. We can now deploy our API, bug fixes and any new feature requests to production in minutes rather than days, without any human dependency. Congratulations!

Summary

Implementing a CI/CD pipeline in our API development projects is incredibly powerful. It helps us increase our productivity and confidence without spending a lot of time running these mundane processes by hand.

In this article, we discussed the complete CI/CD workflow starting from creating a GitHub Repository until building a complete CI/CD workflow using GitHub Actions. Along the way, we also learned about running ASGI APIs, performing unit tests in our local machine, packaging our FastAPI to Lambda using Mangum, and configuring AWS Lambda and AWS S3 Services for our deployment.

We hope this article will help you build your own customized CI/CD Pipelines for your own awesome FastAPI projects.

Peace! ✌️
