In the last part, we looked at how we can leverage Serverless stages to deploy our lambdas to different environments and manage the variables for each environment accordingly.

However, as mentioned, saving config files on local systems is not a very good practice. Neither is downloading them manually from a place like S3 every time you need to deploy.

The solution is to automate the entire process like this:

  1. Create a bucket on S3 with minimal permissions.
  2. Upload the config files for all the environments onto this bucket.
  3. Create a bash script which downloads the file from S3, uses it to deploy the lambda functions and then deletes it!

Simple, right? Let's get started!

For the base code, clone this repository from the last part and replace the region and role in serverless.yml as needed.

Setting up the S3 Bucket

We’ll be using the AWS CLI for all our interactions with S3, so make sure you have it set up locally.

First, we need to create a bucket with the command:

aws s3 mb s3://[ANY_UNIQUE_BUCKET_NAME]

My bucket’s name is “serverless-medium-creds”. Make sure the bucket name you choose is globally unique! In all subsequent commands, replace the bucket name I’m using with the one you chose.

Next, we need to upload our config files for the dev and staging environments to this bucket:

> aws s3 cp env.dev.json s3://serverless-medium-creds
> aws s3 cp env.staging.json s3://serverless-medium-creds

The above commands work as-is if you run them from the folder where the config files live. If not, include the full path to each file.
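If you end up with more environments, the uploads can be scripted. Here is a small dry-run sketch (my own addition, not from the original article) that prints each copy command so you can check the destinations before running them for real; the bucket name and file list are the ones used in this article, so replace them with yours:

```shell
#!/bin/sh
# Dry-run sketch: print the upload command for each config file.
# BUCKET and the file list are assumptions -- replace with your own.
BUCKET="serverless-medium-creds"
for f in env.dev.json env.staging.json; do
  echo "aws s3 cp $f s3://${BUCKET}/$f"
done
```

Dropping the `echo` turns this into a real batch upload.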

To verify that our files have been uploaded successfully, run:

> aws s3 ls s3://serverless-medium-creds
2020-05-26 14:22:11 137 env.dev.json
2020-05-26 14:23:09 149 env.staging.json

Cool! We can see our files in the S3 bucket!

Creating the Bash Script

The next step is to create the bash script in our Project folder.

Create a script file called deploy_lambdas.sh and make it executable:

> touch deploy_lambdas.sh
> chmod +x deploy_lambdas.sh

Delete all the config files from the project folder; we don’t need them anymore. Our project structure should now look like this:

- .gitignore
- deploy_lambdas.sh
- handler.js
- serverless.yml

Add the following to your deploy_lambdas.sh file:

#!/bin/sh
env=$1

# Safety checks: an environment argument must be provided and supported
if [ -z "$env" ] || ([ "$env" != "dev" ] && [ "$env" != "staging" ])
then
  echo "Please provide arguments. Usage: ./deploy_lambdas.sh [ENV_TO_DEPLOY]"
  echo "Supported environments: dev / staging"
  exit 1
fi

echo "Deployment started for environment: $env"

# Creating the necessary variables
fileName="env.${env}.json"
sourceLocation="s3://serverless-medium-creds/${fileName}"

# Downloading the config file from S3
echo "Downloading config file from S3"
aws s3 cp "$sourceLocation" "$fileName"

# Deploying the lambdas via Serverless
echo "Deploying lambdas"
serverless deploy --stage "$env"
echo "Lambdas deployed"

# Once done, delete the config file from the local machine
echo "Deleting local config file"
rm "$fileName"

echo "Deployment complete for environment: $env"
exit 0

Let’s understand the script before using it.

  1. The script takes the environment you want to deploy to as an argument. Example usage in our case: ./deploy_lambdas.sh dev.
  2. This argument is validated first: the script checks that an argument was provided and that it is one of the supported values. Since we’re supporting “dev” and “staging”, that’s the validation check I have added.
  3. The next step prepares the necessary variables. fileName is the name of our config file and sourceLocation is the file’s full S3 path. Be sure to replace the bucket name with the one you’re using!
  4. The command aws s3 cp $sourceLocation $fileName downloads the file from the bucket onto our local system.
  5. serverless deploy --stage $env deploys our lambda functions to the environment we passed to the script.
  6. Once the deployment is complete, we delete the local file using rm $fileName.
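If you later add more environments, the two-branch if in step 2 gets clumsy. One possible refactor (my own sketch, not part of the article’s script) moves validation into a function built on a case statement, so a new environment is a one-line change:

```shell
#!/bin/sh
# Sketch: validate the stage name with `case`; extend the pattern
# list (e.g. dev|staging|prod) to support more environments.
is_valid_env() {
  case "$1" in
    dev|staging) return 0 ;;
    *) return 1 ;;
  esac
}

if is_valid_env "dev"; then
  echo "dev is a supported environment"
fi
if ! is_valid_env "prod"; then
  echo "prod is not supported"
fi
```

In the deploy script, the safety check would then collapse to `is_valid_env "$env" || exit 1` plus the usage message.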

And that’s it!

Use the script by running ./deploy_lambdas.sh dev or ./deploy_lambdas.sh staging or any other environment of your choice!

You can use this script anywhere you handle your deployments from: your local machine, or CI/CD tools like Jenkins or AWS CodePipeline.
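One small tweak that helps in CI (an assumption of mine, not in the original script) is reading the bucket name from an environment variable with a fallback, so each pipeline can point at its own config bucket without editing the script:

```shell
#!/bin/sh
# Sketch: CONFIG_BUCKET is a hypothetical override variable; we fall
# back to the bucket name used in this article when it is unset/empty.
BUCKET="${CONFIG_BUCKET:-serverless-medium-creds}"
sourceLocation="s3://${BUCKET}/env.dev.json"
echo "Would download from: $sourceLocation"
```

A CI job would then export CONFIG_BUCKET before invoking the deploy script.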

In the next part, we’ll look at how we can package our lambda functions individually with the Serverless Framework!

PS: The source code for this part has been uploaded to GitHub for reference.

This article is a part of my 5 Article Series on the Serverless Framework!

Part 1: Serverless: Managing environment variables efficiently with stages

Part 2: Serverless: Managing config for different environments with S3 and Bash Scripts

Part 3: Serverless: Creating light and lean function packages

Part 4: Serverless: Breaking a large serverless.yml into manageable chunks

Part 5: Serverless: Reusing common configurations across functions
