Flask + Serverless — API in AWS Lambda the easy way

Michal Haták
Jan 1, 2019 · 4 min read

If you need to quickly deploy a small API, or you have decided to migrate your codebase to take advantage of AWS Lambda, you can use a powerful combo of Flask and the Serverless framework. In fact, this works for any WSGI application, such as Django.

There are, of course, other ways to do this, from packaging and uploading your app yourself to using Zappa. The process gets a bit trickier once you realize that some dependencies are not compatible (because Lambda runs on Linux) and you have to handle that too. We are going to look at a basic setup of the Serverless framework together with two amazing plugins that make it a breeze.

Let's say you already have an AWS account. The second thing you need is the Serverless framework installed — it's darn easy and you can follow the two-step guide here.

Now it's time to get our hands dirty and make it happen. But first we need a Flask app. Let's build a simple random-quote API. It's going to be a very simple example, but the process is much the same for larger apps. Also, I will be using virtualenv, but it works with Pipenv too :)

# create new dir and jump in
$ mkdir quotes
$ cd quotes/
# create virtualenv, activate it
$ virtualenv venv -p python3
$ . venv/bin/activate.fish
# install Flask and freeze requirements
$ (venv) pip install Flask
$ (venv) pip freeze > requirements.txt

And that's it, we have the foundation now.


For the API itself, let's say we have a mighty service like the one below, no rocket science.
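A minimal sketch of such a service (assuming a module app.py that exposes a Flask object named app, which is what the serverless.yml below expects; the quotes themselves are just illustrative) could look like this:

# app.py: a minimal random-quote API (illustrative sketch, not the author's exact code)
import random

from flask import Flask, jsonify

app = Flask(__name__)

# A few sample quotes; swap in your own data source
QUOTES = [
    "Simplicity is the soul of efficiency.",
    "Talk is cheap. Show me the code.",
    "Premature optimization is the root of all evil.",
]


@app.route("/")
def random_quote():
    return jsonify({"quote": random.choice(QUOTES)})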

Now it's time to wire it up with the Serverless framework (SLS). Because we created our API first, we can't really use the sls create command — but that is often exactly the situation when you have existing code you just want to deploy or migrate. To declare an SLS service, we need to create a serverless.yml file (in our project root) manually. The file will look like this:

service: quotes

provider:
  name: aws
  runtime: python3.6
  stage: dev
  region: us-east-1
  memorySize: 128

That's roughly the minimum you need to specify to have a declared service, although it doesn't reference our API (function) yet. Before we do that, let's install two SLS plugins. The first makes Lambda understand WSGI (the protocol Flask, Django and friends speak); the second makes SLS pack our Python requirements into the deployment package.

$ sls plugin install -n serverless-wsgi
$ sls plugin install -n serverless-python-requirements

Those two commands do the job, and we can see that SLS has registered both plugins in our serverless.yml file.

Note: The serverless-wsgi plugin should be able to pack the requirements too, but I have found that the second plugin is more configurable and can even pack them inside Docker (useful if they need compilation and you are running a non-Linux system). The plugin also supports Pipenv, which we are using on one of our projects.

service: quotes

provider:
  name: aws
  runtime: python3.6
  stage: dev
  region: us-east-1
  memorySize: 128

plugins:
  - serverless-wsgi
  - serverless-python-requirements

custom:
  wsgi:
    app: app.app
    packRequirements: false

functions:
  app:
    handler: wsgi.handler
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'

You can see a new section (custom) which holds the configuration for those plugins. The wsgi part says where your app lives and turns off the plugin's own packing of requirements (serverless-python-requirements handles that). The last part (functions) declares what our service contains. We can have more functions within one service and also require specific permissions. In this case, we are simply saying that all HTTP requests will be served through the WSGI handler provided by the plugin we installed.

Local development

Before we deploy our API, we can verify that everything works locally. Because the serverless-wsgi plugin is smart, we can simply run sls wsgi serve to get a local environment up and running :)

[Screenshot: local env. included]
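To sanity-check it, you can hit the local server with a short script. This assumes serverless-wsgi's default port of 5000; adjust if you pass a different one with -p:

import requests

# sls wsgi serve listens on http://localhost:5000 by default (assumed; override with -p)
resp = requests.get("http://localhost:5000/")
print(resp.status_code, resp.json())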

Deploy time!

It is as easy as typing one command. Run sls deploy in your terminal and SLS will do the job. The first deployment takes a bit longer due to the initial setup, but every subsequent one is fast enough.

[Screenshot: deploy & test]
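Once the deploy finishes, SLS prints the generated API Gateway endpoint for the service. A quick smoke test against it could look like this (the URL below is only a placeholder; use the endpoint from your own deploy output):

import requests

# Placeholder URL: replace with the endpoint printed by `sls deploy`
API_URL = "https://<api-id>.execute-api.us-east-1.amazonaws.com/dev/"

resp = requests.get(API_URL)
print(resp.status_code, resp.json())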

And that's it. We have our service up & running with the power of AWS Lambda behind it.

Final Notes

  • you can easily make serverless-python-requirements use Docker to pack requirements that need compilation (see the config snippet after this list)
[Screenshot: logs from our API]
  • in most cases, you can fit into the AWS free tier — which is nice, especially for prototypes
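As a sketch of that first point, serverless-python-requirements exposes a dockerizePip option you can add to the custom section of serverless.yml; setting it to non-linux builds the requirements inside Docker only when you are not developing on Linux:

custom:
  wsgi:
    app: app.app
    packRequirements: false
  pythonRequirements:
    dockerizePip: non-linux  # build requirements in Docker when on macOS/Windows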

I hope this helps you get started with Lambda and that everything was clear enough. If you liked the article, you can follow me on Twitter.
