AWS Lambda — Python Library Zappa Unzipped

Functions as a Service (FaaS) save a lot of money compared to a continuously running EC2 instance. However, the path to FaaS is not smooth. There are a handful of libraries that make life easier, but they have their own downsides. A few notable ones are Apex (http://apex.run/) and Zappa (Python specific). Very few blog posts explain how to create functions through commands, as most of them do it through the AWS UI. I would like to talk about this and other learnings from working with Zappa.

Python fits the Lambda philosophy better than most languages, owing to the brevity and simplicity built into the language.

Here are my learnings:

1. Choice of platform

AWS Lambda runs in its own world (a container). It is very important that the source code and libraries you zip up are compatible with the environment they will run in. If you build locally on your machine and assume it will work, I am sure you won't be lucky every time. It is always recommended to build inside a container.

The Dockerfile would be something like this:
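A minimal sketch, assuming a requirements.txt at the project root and the lambci/lambda build image, which aims to mirror the Lambda runtime; the image tag and paths are illustrative, and a virtualenv is created because Zappa expects to run inside one:

    # Build inside a Lambda-like image so any compiled wheels match the Lambda runtime
    FROM lambci/lambda:build-python3.7

    # Zappa expects an active virtualenv, so create one and put it on PATH
    RUN python -m venv /var/venv
    ENV VIRTUAL_ENV=/var/venv PATH=/var/venv/bin:$PATH

    WORKDIR /var/task

    # Install zappa and project dependencies inside the container, not on the host
    COPY requirements.txt .
    RUN pip install --upgrade pip && pip install zappa -r requirements.txt

    # Copy in the application source
    COPY . .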

And the corresponding docker-compose.yaml:
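A sketch, assuming the Dockerfile above sits next to the compose file; the service name, volume paths and credential mount are illustrative:

    version: "3"
    services:
      zappa:
        build: .
        volumes:
          - .:/var/task              # mount the project so the container sees current code
          - ~/.aws:/root/.aws:ro     # share AWS credentials with zappa/boto3

With this in place, zappa can be invoked from the project root with something like docker-compose run --rm zappa zappa deploy dev.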

2. Time Zone selection.

While deploying, if the image's TZ is different from the AWS Lambda region, your deployment fails, so it is important to tune your container's TZ to the AWS region.

RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
This link is of great help if you are running on OSX:
https://stackoverflow.com/questions/22800624/will-docker-container-auto-sync-time-with-the-host-machine
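In the Dockerfile this boils down to defining TZ and then applying the line above; the zone name here is only an example, so pick the one matching your target region:

    ENV TZ=America/New_York   # illustrative; match your AWS region
    RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone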

3. Separate Settings files for each Environment

It is always preferable to create a separate zappa settings file for each region/environment, like below. The other option is to generate zappa_settings.json on the fly using a Python script and feed it to zappa deploy.
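A sketch of what one such file, say zappa_settings_dev.json, might contain; the project name, region, runtime and bucket are placeholders:

    {
        "dev": {
            "app_function": "app.app",
            "project_name": "my-service",
            "runtime": "python3.7",
            "aws_region": "us-east-1",
            "s3_bucket": "my-service-zappa-dev"
        }
    }

You can then point zappa at the right file per environment via its -s/--settings_file option, e.g. zappa deploy dev -s zappa_settings_dev.json.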

4. Passing Environment variables to AWS.

AWS Lambda runs in its own container, which we do not have any access to. If you want to supply environment variables, do so by writing them into the settings key aws_environment_variables.
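In the zappa settings that looks roughly like this; the variable names and values are placeholders for whatever your code reads via os.environ:

    {
        "dev": {
            "aws_environment_variables": {
                "DB_HOST": "dev-db.internal.example.com",
                "LOG_LEVEL": "DEBUG"
            }
        }
    }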

5. AWS Roles and permissions

Zappa creates its own CloudFormation template, which includes the roles. It is very unfortunate that we can't feed our own template to it, because in any practical case we will be using many other AWS resources that need explicit policies to access. So it is advisable to create your own CloudFormation script and feed that role and S3 bucket information to Zappa by changing the zappa settings.
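A sketch of the relevant zappa settings, assuming the role and bucket were created out of band by your own CloudFormation stack; the names are placeholders, and manage_roles set to false is meant to stop Zappa from creating its own role:

    {
        "dev": {
            "manage_roles": false,
            "role_name": "my-service-lambda-role",
            "s3_bucket": "my-service-deployments"
        }
    }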

6. Deployment

Prefer to do it through a makefile or with the command below.
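The commands themselves, assuming the per-environment settings files from point 3; the stage and file names are placeholders:

    # first-time deploy of the dev stage
    zappa deploy dev -s zappa_settings_dev.json

    # subsequent code changes to an already-deployed stage
    zappa update dev -s zappa_settings_dev.json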

On the downside:

  1. On the CI/CD side, Zappa scores very few marks; for that matter, so do most of the serverless libraries.
  2. Zappa deploy is not a two-step process. It zips the source code along with the libraries, puts the archive in S3, then grabs it from S3 and runs it in an AWS container. For all practical purposes, deployment should be a two-step process: 1. build, 2. deploy. Sorry, I could not find any way to separate these steps, and zappa deploy won't take a zip file as input.
  3. If you try to build in a local virtualenv, there is no clean way to zip the source and libraries. At times it will pull packages from local caches, and some modules were not shipped, resulting in "no module" issues. I recommend using a container; it is always cleaner.