DevOps with AWS Lambda: How to define a CI/CD pipeline to deploy functions

shotin
6 min read · Jun 1, 2020


Note: There is a newer version of this article.

I’ve deployed Python AWS Lambda functions (including external libraries such as numpy and pandas) using CodeCommit, CodeBuild, and CloudFormation, orchestrated by CodePipeline.

This setup makes it very easy to deploy Lambda functions, but it was a bit hard for me to set up, so I’m writing down the steps.

Proceed with the following steps:
1. Create a repository on CodeCommit
2. CLONE the repository and PUSH source code, build template, and deploy template files to the repository
3. Create an S3 bucket and a CloudFormation role before creating the pipeline
4. Create a pipeline in CodePipeline (this also creates a CodeBuild project)
5. Source code is deployed by CodePipeline automatically

1. Create a repository on CodeCommit

First of all, create a repository on CodeCommit. This part is easy.

Go to “CodeCommit” in the console and click on “Create repository”. Fill in the repository name and click on “Create”.

You get a repository on CodeCommit.
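
If you prefer the AWS CLI, a repository can also be created with one command. This is just a sketch of the console step above, using the same repository name as this article:

# Optional CLI equivalent of “Create repository” in the console
aws codecommit create-repository --repository-name TestRepository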

2. CLONE the repository and PUSH source code, build template, and deploy template files to the repository

Next, you can clone the repository, but you need to complete a couple of steps first:
1. Install and configure the AWS CLI and Git. I’ll skip this procedure, so please look it up if needed.
2. Generate HTTPS Git credentials for your IAM user. You can do this from the IAM user page in the console: find “HTTPS Git ~” at the bottom of the “Security credentials” tab and click on “Generate credentials”. A user ID, a password, and a download button are then displayed, so download the credentials CSV file. (You must not share this file with anyone.) You can also generate these credentials from the CLI, as sketched below.
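
A sketch of the CLI version of step 2; YOUR_IAM_USER is a placeholder for your own IAM user name:

# Sketch: generate HTTPS Git credentials for CodeCommit (YOUR_IAM_USER is a placeholder)
aws iam create-service-specific-credential \
  --user-name YOUR_IAM_USER \
  --service-name codecommit.amazonaws.com
# The response contains the service user name and password used for git over HTTPS.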

Preparation is now complete. You can copy the clone command from the repository page in the console, so copy it and execute it in whatever directory you like on your local computer.

Git will ask for the user ID and password if you have never configured them before, so enter them and you finally get a local repository. (Of course, it’s empty.)
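
The clone command copied from the console looks roughly like this; the region and repository name depend on your setup, so treat this as a sketch:

# Sketch: clone the empty CodeCommit repository over HTTPS (YOUR_REGION is a placeholder)
git clone https://git-codecommit.YOUR_REGION.amazonaws.com/v1/repos/TestRepository
cd TestRepository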

You create the files and folder as below:

TestRepository
├── buildspec.yml
├── src
│   ├── Index.py
│   └── requirements.txt
└── template.yml

And you write the code in each file as below:

  • Index.py
    - This is the Lambda function.
    - The imports are not strictly necessary; they are just there to confirm that the libraries can be used (a local smoke-test sketch follows these file listings).
import numpy as np
import pandas as pd


def lambda_handler(event, context):
    return 'Success!!'
  • requirements.txt
    - List the library names you want to use.
numpy
pandas
  • buildspec.yml
    - This defines the build: the runtime version, installation of external modules, the output S3 bucket for the packaged source code, and so on.
    - I once wrote the template file name as “template.YAML” in buildspec.yml while the file was actually named “template.YML”. This caused build errors again and again, and it took me a long time to figure out… so BE CAREFUL when configuring this file!
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
  build:
    commands:
      - sam build
      - export BUCKET=test-pipeline-output-bucket
      - sam package --s3-bucket $BUCKET --output-template-file outputtemplate.yml
artifacts:
  type: zip
  files:
    - template.yml
    - outputtemplate.yml
  • template.yml
    - This is the SAM template used by CodeBuild. You define the Lambda function’s handler, the source directory, and so on.
    - CodeBuild packages the source code using this template and outputs a new template file, which is deployed as a CloudFormation stack. The new template file name is defined in buildspec.yml as “outputtemplate.yml”.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Python file including external library
Resources:
  SampleFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: Index.lambda_handler
      Runtime: python3.7
      FunctionName: SamplePython
      CodeUri: ./src  # Directory where the source files are
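
Before pushing, you can sanity-check these files locally. This is a sketch that assumes you have the SAM CLI installed and the libraries from requirements.txt installed in your local Python environment:

# Sketch: validate the SAM template and smoke-test the handler locally
sam validate --template template.yml
pip install -r src/requirements.txt
python -c "import sys; sys.path.insert(0, 'src'); from Index import lambda_handler; print(lambda_handler({}, None))"
# Expected output: Success!!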

After you create the files as above, push them to the repository using the following commands in the local repository.

git add .
git commit -m "first commit"
git push

And then, you can find the files pushed to the CodeCommit repository on the console.
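
You can also confirm the push from the CLI; a quick sketch:

# Sketch: confirm the commit arrived in CodeCommit
aws codecommit list-branches --repository-name TestRepository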

3. Create an S3 bucket and a CloudFormation role before creating the pipeline

Create the S3 bucket you named in buildspec.yml. You don’t need any special configuration; just create it with the defaults.
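
A CLI sketch of this step; note that S3 bucket names are globally unique, so if you pick a different name it must also match the BUCKET variable in buildspec.yml:

# Sketch: create the artifact bucket referenced in buildspec.yml
aws s3 mb s3://test-pipeline-output-bucket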

Additionally, you have to create an IAM role that CloudFormation will use to access the resources it deploys. Attach the policies below (a CLI sketch for creating this role follows the policy document):

  • AWSLambdaExecute
  • Inline policy:
{
  "Statement": [
    {
      "Action": [
        "apigateway:*",
        "codedeploy:*",
        "lambda:*",
        "cloudformation:CreateChangeSet",
        "iam:GetRole",
        "iam:CreateRole",
        "iam:DeleteRole",
        "iam:PutRolePolicy",
        "iam:AttachRolePolicy",
        "iam:DeleteRolePolicy",
        "iam:DetachRolePolicy",
        "iam:PassRole",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:GetBucketVersioning"
      ],
      "Resource": "*",
      "Effect": "Allow"
    }
  ],
  "Version": "2012-10-17"
}
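
The same role can be created from the CLI. The role and file names below are placeholders I chose for illustration; the trust policy simply lets CloudFormation assume the role, and the inline policy is the JSON shown above saved to a file:

# Sketch: trust policy allowing CloudFormation to assume the role (saved as cfn-trust.json)
cat > cfn-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "cloudformation.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

# Create the role, attach AWSLambdaExecute, and add the inline policy from above (saved as inline-policy.json)
aws iam create-role --role-name cfn-lambda-deploy-role --assume-role-policy-document file://cfn-trust.json
aws iam attach-role-policy --role-name cfn-lambda-deploy-role --policy-arn arn:aws:iam::aws:policy/AWSLambdaExecute
aws iam put-role-policy --role-name cfn-lambda-deploy-role --policy-name cfn-deploy-inline --policy-document file://inline-policy.json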

4. Create pipeline in CodePipeline

Next, create a pipeline on CodePipeline.

Choose the repository created in step 1 in CodeCommit as the source provider.

Choose CodeBuild as build provider and click on “Create project”.

  • OS — Ubuntu
  • Runtime — Standard
  • Image — aws/codebuild/standard:2.0
  • Service Role — Whatever you want (you’ll use it later)
  • Buildspec name — buildspec.yml (same as repository)

Choose CloudFormation as deploy provider.

  • Action mode — Create or replace a change set
  • Stack Name & Change set name — Whatever you want (this stack doesn’t exist yet, but that’s fine)
  • Template — BuildArtifact / outputtemplate.yml (This template file is delivered from CodeBuild as I mentioned in step 2)
  • Capabilities — CAPABILITY_IAM, CAPABILITY_AUTO_EXPAND
  • Role name — Choose role created in step 3

Finally, you’ve created a pipeline. The release starts automatically (and fails at the build step), but it doesn’t matter.

Go to the IAM service role created in the build project step and attach the “AmazonS3FullAccess” policy. Then the build project can output the packaged files to S3 as configured in buildspec.yml. After this, click on “Release change” in CodePipeline. The pipeline should succeed.
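
Both of these actions have CLI equivalents. The role and pipeline names are placeholders; use the CodeBuild service role and the pipeline you created:

# Sketch: attach S3 full access to the CodeBuild service role, then trigger a new release
aws iam attach-role-policy \
  --role-name YOUR_CODEBUILD_SERVICE_ROLE \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws codepipeline start-pipeline-execution --name YOUR_PIPELINE_NAME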

Furthermore, edit your pipeline’s deploy stage. Add an action group (NOT an action) and choose CloudFormation as the deploy provider.

  • Input artifacts — BuildArtifact
  • Action mode — Execute a change set
  • Stack Name & Change set name — Same as the create-change-set action

The deploy stage then looks as below.
I wrongly added this change set with “Add action” beside Deploy. That puts the deploy stage in the order ExecuteChangeSet -> Deploy and causes an error. It took me a while to figure out, so BE AWARE to add it with “Add action group”.
(And don’t forget to save this change.)

Click on “Release change” again. Every step in the pipeline should now succeed.

5. Source code is deployed by CodePipeline automatically

Go to the Lambda service page and you can find the Lambda function with the name you set in template.yml.

Execute a test invocation; the result should look like this. If the external libraries failed to import, you would get an import error.
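
You can also invoke the function from the CLI instead of the console; this sketch assumes the function name SamplePython from template.yml:

# Sketch: invoke the deployed function and show its response
aws lambda invoke --function-name SamplePython response.json
cat response.json
# Expected content: "Success!!"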

Finally, you have set up the whole pipeline!! From now on, whenever you push code to CodeCommit, the source code is automatically deployed to Lambda.
