Serverless Framework — Package Your Lambda Functions Easily
Nowadays, serverless is a rapidly growing technology in the cloud computing world. Many companies are moving from traditional architectures to either containerised or serverless architectures.
AWS is one of the most popular and powerful cloud platforms. It offers a wide range of services and has great support for serverless technologies.
AWS Lambda is a powerful service for running serverless workloads, especially in combination with other services such as API Gateway, DynamoDB, and S3.
In software development we all need external dependencies at some point. To deploy code with external packages to AWS Lambda, you normally create a zip file of your code, upload it to an S3 bucket, and then configure your Lambda function to point to that location. But what if you didn’t have to do any of that, and could instead focus on writing your code while a framework takes care of packaging and deploying it to AWS Lambda? Sounds interesting? If yes, then keep reading.
Pre-Requisites for the Serverless Framework
- Access Key and Secret Access Key of an IAM user
- Docker installed and running on your system
- AWS CLI
- Node
Let’s Get Started.
- Installing the Serverless Framework
Note: If you don’t already have Node on your machine, you’ll need to install it first. I suggest using the latest LTS version of Node.js.
npm install -g serverless
Creating your service locally
A service is like a project. It's where you define your AWS Lambda functions, the events that trigger them, and any AWS infrastructure resources they require, all in a file called serverless.yml.
We can create a service from a template. I’m going to use Python 3.
serverless create \
--template aws-python3 \
--name harsh-test \
--path harsh-test
The serverless create command will create a service. --template aws-python3 initialises our service with the Python 3 template, --name harsh-test sets the name of the service, and --path harsh-test creates a directory named harsh-test to store our code and some other files from the Serverless Framework.
The Serverless Framework comes with several pre-defined templates. You can read more about the other available templates here — https://www.serverless.com/framework/docs/providers/aws/guide/services/
This will create two files in our directory: handler.py and serverless.yml.
Creating Virtual Environment
We will create a virtual environment in the directory created by the Serverless Framework. You can read here about how and why to use virtual environments with Python.
virtualenv venv --python=python3
If you don’t have virtualenv installed, you can install it with pip3 install virtualenv.
Activate the Virtual Environment using the following command:
source venv/bin/activate
Let’s set up the function we want to deploy. Open handler.py in your favourite text editor and add the following lines:
# handler.py
import numpy as np

def main(event, context):
    a = np.arange(15).reshape(3, 5)
    print("Your numpy array:")
    print(a)

if __name__ == "__main__":
    main('', '')
This is a simple function using an example from the NumPy Quick Start. When working with an AWS Lambda function, you’ll need to define a handler function that accepts two arguments: event and context. You can read more at AWS about the Lambda function handler for Python.
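To make the event/context signature concrete, here is a minimal sketch of a Lambda-style handler invoked locally. The event payload and the returned response shape are illustrative assumptions (a real event depends on the trigger, e.g. API Gateway or S3), not part of the original example:

```python
# A minimal sketch of a Lambda-style handler (event payload is made up).
# `event` carries the trigger's payload as a dict; `context` exposes
# runtime metadata (request ID, remaining time, etc.).
def main(event, context):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Simulate an invocation locally: a dummy event dict and no context object.
response = main({"name": "Lambda"}, None)
print(response["body"])  # Hello, Lambda!
```

Returning a dict with statusCode and body mirrors what an API Gateway-backed handler would send back, but any JSON-serialisable value works for direct invocations.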
If we run python handler.py, it will run our main() function.
python handler.py
Ah, we haven’t installed numpy in our virtual environment yet. Let's install it now.
pip install numpy
I am using pip install numpy, but you can also write pip3 install numpy. While creating the virtualenv we specified that it should use python3, so inside the virtualenv pip will use python3 to install packages.
Let’s create a requirements.txt file.
pip freeze > requirements.txt
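pip freeze writes the exact versions installed in the virtualenv, which is what the packaging plugin will later install for Lambda. With only numpy installed, the file will look something like this (the version number here is illustrative, yours will differ):

```text
numpy==1.24.4
```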
Let’s run our code now.
python handler.py
Congratulations!! 👏 We have successfully run our Python code locally.
Deploying our serverless service
Our function is working locally, and it’s ready for us to deploy to Lambda. Open the serverless.yml file in your favourite text editor and write the following lines:
# serverless.yml
---
service: harsh-test

frameworkVersion: '2'

provider:
  name: aws
  runtime: python3.8

functions:
  hello:
    handler: handler.main
This is a basic service called harsh-test. It will deploy a single Python 3.8 function named hello to AWS, and the entry point for the hello function is the main function in the handler.py module.
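The handler: handler.main string follows Lambda's "module.function" convention. A simplified sketch of how such a string gets resolved (Lambda's real loader handles search paths and packages, but the idea is the same; the demo uses json.dumps since handler.py isn't importable here):

```python
# Resolve a "module.function" handler string into a callable,
# roughly the way Lambda's runtime locates your handler.
import importlib

def resolve_handler(handler_string):
    module_name, function_name = handler_string.rsplit(".", 1)
    module = importlib.import_module(module_name)
    return getattr(module, function_name)

# Demo with a standard-library function instead of our handler module.
dumps = resolve_handler("json.dumps")
print(dumps({"ok": True}))  # {"ok": true}
```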
Our last step before deploying is to add the serverless-python-requirements plugin. Let’s create a package.json file for saving our Node dependencies. The following command will create the package.json file; accept all the default values.
npm init
Now we will run one more command to install the serverless-python-requirements package:
npm install --save serverless-python-requirements
To configure our serverless.yml file to use the plugin, we'll add the following lines to it:
[...]

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: non-linux
The plugins section registers the plugin with the framework. In the custom section, we tell the plugin to use Docker when installing packages with pip. It will use a Docker container similar to the Lambda environment, so the compiled extensions will be compatible.
One last step before doing the deployment is to configure our IAM User’s keys.
aws configure
It will ask you for the Access Key ID and Secret Access Key of your IAM user. Configure the values properly.
Finally, we will deploy our function:
serverless deploy
Congratulations! 👏 You have successfully deployed your AWS Lambda function using the Serverless Framework.
Using Variables
So far we have created a basic Lambda function and deployed it with the Serverless Framework using one simple command.
Let’s take a step further and make our environment a bit more dynamic.
Consider a scenario where you want to deploy the same function to multiple environments. We will use some custom variables inside our serverless.yml file to deploy our code to multiple regions.
Open the serverless.yml file and add the following lines:
- Adding Custom Variables
[Previous content...]

custom:
  defaultStage: dev
  # Env-specific variables
  dev:
    region: us-east-1
    profile: dev
  staging:
    region: ap-south-1
    profile: staging
  pythonRequirements:
    dockerizePip: non-linux
In our serverless.yml file we have a key called custom. In it we have set defaultStage: dev, and we have also created a dev key specifying that we want to use the us-east-1 region and the dev profile, and similarly for the staging environment. You can read more about variables in the Serverless Framework here.
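What this per-stage lookup amounts to can be sketched in plain Python. The dict below mirrors the custom section above; it is only an illustration of the mapping, not Serverless Framework code:

```python
# Per-stage settings, mirroring the `custom` section of serverless.yml.
custom = {
    "defaultStage": "dev",
    "dev": {"region": "us-east-1", "profile": "dev"},
    "staging": {"region": "ap-south-1", "profile": "staging"},
}

def settings_for(stage):
    # The framework's ${self:custom.<stage>.region} lookup is essentially this.
    return custom[stage]

print(settings_for("staging")["region"])   # ap-south-1
print(settings_for("dev")["profile"])      # dev
```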
- Consuming the Variables in our Provider
[Existing content...]

provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, self:custom.defaultStage}
  region: ${self:custom.${self:provider.stage}.region}
  profile: ${opt:profile, self:custom.${self:provider.stage}.profile}

[Existing content...]
In our provider section we have written stage: ${opt:stage, self:custom.defaultStage}, which means we can pass the value of stage while running serverless deploy, or it will use the default stage. For example, if you run serverless deploy --stage staging, it will be translated to stage: staging. You can read more about the available options for the AWS provider here. The keys I am using in provider all come from that documentation; we are just making them dynamic.
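The fallback in ${opt:stage, self:custom.defaultStage} works like a "CLI option if given, else the default" lookup. A small illustrative sketch (not framework code):

```python
# ${opt:stage, self:custom.defaultStage}: use the --stage CLI option
# when provided, otherwise fall back to custom.defaultStage.
custom = {"defaultStage": "dev"}

def resolve_stage(cli_options):
    return cli_options.get("stage", custom["defaultStage"])

print(resolve_stage({"stage": "staging"}))  # staging  (serverless deploy --stage staging)
print(resolve_stage({}))                    # dev      (no --stage flag)
```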
- Now we will deploy our existing function to the staging environment.
We need to run aws configure again to configure the credentials for our staging environment.
aws configure --profile staging
You have to configure the Access Key ID and Secret Access Key of the staging environment's IAM user.
- The final serverless.yml file:
---
service: harsh-test

frameworkVersion: '2'

custom:
  defaultStage: dev
  dev:
    region: us-east-1
    profile: dev
  staging:
    region: ap-south-1
    profile: staging
  pythonRequirements:
    dockerizePip: non-linux

provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, self:custom.defaultStage}
  region: ${self:custom.${self:provider.stage}.region}
  profile: ${opt:profile, self:custom.${self:provider.stage}.profile}

functions:
  hello:
    handler: handler.main

plugins:
  - serverless-python-requirements
Deploy Our Function to Another Environment
Once you are done with the changes in the serverless.yml file, we just need to run one command and it will deploy our function to that environment.
serverless deploy --stage staging --profile staging
Note: In my case my environment name is staging and my AWS profile name is also staging, which is why the stage name and profile name are the same in our command. The --profile flag corresponds to the AWS profile set up while configuring AWS credentials.