Unit Test Serverless Applications the DevOps Way - Part 2

Mahdi Azarboon
Jul 14, 2019

You can find the first part of this tutorial here. In this part, I'm going to explain how to create a DevOps pipeline based on AWS CodePipeline that runs your tests automatically.

Pipeline

Now that the application and unit tests are ready, let's see how we can automate unit testing and deployment with a CI/CD pipeline. AWS CodePipeline and its associated services make this easy. A pipeline consists of stages and runs them sequentially. Each stage consists of one or more actions, and you do the actual work by configuring those actions. You need to assign a role to each action, enabling it to perform its task.

If any stage fails, the whole pipeline stops. We are going to define three stages in our pipeline, each having one action:

  • Source stage: retrieves the latest version of your source code.
  • Build stage: installs dependencies, makes sure all your tests pass, and returns the build result (artifact) as output. The output of this stage is used as the input of the next stage.
  • Deploy stage: receives the artifact from the previous stage and deploys it via CloudFormation.

Now, let’s start building our pipeline:

  1. In later stages, you will need to link your pipeline to an existing CloudFormation stack. If you don't already have one, you can create a new stack by deploying your application. As an example, I create the stack lambda-unit-test:
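If you prefer the command line, you can create the stack by packaging and deploying your SAM template. This is a minimal sketch; the bucket name my-sam-artifacts and the template file name template.yaml are placeholders you should adapt to your project:

```
# Package local code, upload it to S3, and produce a deployable template
aws cloudformation package \
    --template-file template.yaml \
    --s3-bucket my-sam-artifacts \
    --output-template-file packaged.yaml

# Deploy the packaged template as a new stack named lambda-unit-test
aws cloudformation deploy \
    --template-file packaged.yaml \
    --stack-name lambda-unit-test \
    --capabilities CAPABILITY_IAM
```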

2. In later steps, you will need a role that grants your deploy action permission to execute its tasks. To create it, go to the IAM console and create a role with the following properties:

  • Trusted entity: AWS Service — CloudFormation.
  • Permissions: AWSLambdaExecute.
  • Role name: code-deploy-execution-role

Then open the role and add the following inline policy to it:
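The policy itself appears as a screenshot in the original post, so the exact statement isn't reproduced here. As a hedged sketch, an inline policy for this CloudFormation execution role typically allows it to read the build artifact from the pipeline's S3 bucket, along these lines (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::codepipeline-artifact-bucket/*"
    }
  ]
}
```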

3. Go to the CodePipeline console and create a new pipeline similar to Figure 8. I've named it sample-pipeline.

Figure 8: create a new pipeline in CodePipeline

4. Connect your pipeline to your source provider and your code (Figure 9). Then press Next.

Figure 9: Enabling the pipeline to access the code, which is hosted on GitHub

5. Before deploying your application, you need to build it (i.e. install its dependencies). CodeBuild helps you do that. First, add a Build stage to your pipeline, then add a build project (action) to the stage. Press the icon next to "Create Project" to open it in a new window; otherwise you might run into unexpected issues (Fig 10).

Figure 10: Adding the Build stage to the pipeline

6. Choose the right environment for your application (the Node.js runtime, in our case). If you are using an existing service role for CodeBuild, make sure it has the required permissions: SAM needs to upload the artifact to S3, so the role needs write permission (s3:PutObject).

We let the wizard create a service role for us, but we will need to add some policies to make it work. We'll get back to this point later on.

CodeBuild uses a buildspec.yml file, which includes the commands to build and test your application (Fig 11). You can add various phases to it (more information here). Once the tests pass, the AWS CLI packages the local files, uploads them to an S3 bucket, and returns a copy of the new template, cf-template.yaml. We return cf-template.yaml as the output of the Build stage and will use it as the input for the next stage.
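The buildspec in Figure 11 is shown as a screenshot; a minimal sketch along those lines might look like this (the runtime version and the artifact bucket name my-sam-artifacts are assumptions you should adapt):

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 10
    commands:
      - npm install
  build:
    commands:
      # Run the unit tests; a non-zero exit code fails the Build stage
      - npm test
      # Package local code and produce the deployable template
      - aws cloudformation package --template-file template.yaml --s3-bucket my-sam-artifacts --output-template-file cf-template.yaml

artifacts:
  files:
    # Returned as the output artifact of the Build stage
    - cf-template.yaml
```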

Press “Continue to CodePipeline”

Figure 11: build spec defines commands that should be executed and which artifacts should be returned
Figure 12: create a build project by specifying runtime settings, build commands and service role

7. Now your project is created and can be added to your Build stage. Press "Next" to proceed to the Deploy stage.

Figure 13: Create build stage and add a project to it.

8. Now you need to define the Deploy stage. Choose CloudFormation as the "deploy provider". For the action mode, choose "Create or replace a change set". Choose an existing CloudFormation stack (from step #1). You also need to specify a change set. Change sets help you understand how new changes will affect your stack. You can list the existing change sets of your stack with this command (change the stack name to match yours):

aws cloudformation list-change-sets --stack-name lambda-unit-test

Choose an available change set and use its ChangeSetName property. Or you can create a change set via the UI by following this tutorial. In case you don't see the change set as an option, just type it in the "Change set name" field.
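As a convenience (not required by the tutorial), you can narrow the output of the previous command down to just the names with a --query filter:

```
# Print only the ChangeSetName of each change set on the stack
aws cloudformation list-change-sets \
    --stack-name lambda-unit-test \
    --query 'Summaries[].ChangeSetName'
```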

Use the role name that you created in step #2. For the Template field, note that <inputArtifactName> is actually the "Output artifacts" of the previous stage (the Build stage). By default the artifact is named "BuildArtifact" (you can edit this later on), whereas TemplateFileName is the artifact that you defined in the buildspec file, cf-template.yaml. So fill out the Template field with BuildArtifact::cf-template.yaml

As for the Capabilities field: make sure to choose an option; otherwise, you may get an error later on. If your SAM (or CloudFormation) template includes named IAM resources, you need to choose CAPABILITY_NAMED_IAM. Otherwise, you can choose either of the two (but you must choose one). We don't have named IAM resources, so we choose CAPABILITY_IAM.
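For context, the same choice comes up when creating a change set from the CLI; a sketch using the stack and template names from earlier steps (the change set name pipeline-changes is a placeholder):

```
# Creating a change set for a template with unnamed IAM resources
aws cloudformation create-change-set \
    --stack-name lambda-unit-test \
    --change-set-name pipeline-changes \
    --template-body file://cf-template.yaml \
    --capabilities CAPABILITY_IAM
```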

Choose the role that you created in step #2. Press Next.

Figure 14: choose an existing stack and changeset, define input, and capabilities for your Deploy stage

9. Review the settings and create the pipeline. Upon creation, CodePipeline runs it. This can take a few minutes, and the pipeline fails at the Build stage:

Figure 15: our newly created pipeline fails at build stage

10. Click on "Details" to see the logs (Figure 16): CloudFormation is unable to upload the artifact to S3; access is denied. This is because our CodeBuild service role (which was created automatically) doesn't have permission to write to S3.

Figure 16: CloudFormation is unable to upload the artifact to S3

11. To edit the stage: in the left column, choose Build -> Build projects, and select the project. Open the "Build details" tab to see the service role associated with this build project (Fig 17).

Click on it to get redirected to its IAM page, then add the following inline policy. The inline policy grants your CodeBuild project just enough permission to write to S3 and to log data to CloudWatch. Remember the principle of least privilege: limit the resources and grant the minimum required permissions. Also make sure that you put /* after your bucket name; otherwise your operation will fail.
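This policy also appears only as a screenshot in the original, so the following is a hedged sketch of what it grants. The bucket name my-sam-artifacts is a placeholder; note the /* suffix on the bucket ARN, as mentioned above:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-sam-artifacts/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```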

Figure 17: the service role associated with the build project

12. After implementing the changes, rerun the pipeline. Eventually all stages should pass, and your functions should be live and working.

Figure 18: all stages and the pipeline pass

From now on, every time a change is pushed to your code, CodePipeline runs the pipeline, performs the tests, and deploys your application (if all tests pass). You can add more stages to your pipeline according to your needs.
