Serverless your Machine Learning Model with Pycaret and AWS Lambda

Rafał Bodziony
Published in Analytics Vidhya · 6 min read · Aug 8, 2021

A Machine Learning Engineer’s life is not simple. We start with incomplete requirements and data, design many experiments, and finally build a highly efficient and scalable inference process. More and more is being said about serverless in the context of Machine Learning. We want efficient, state-of-the-art solutions that are additionally flexible and easy to implement. Impossible? I suspect so, but a strategy that focuses on creating profitable solutions for the business gives us a significant advantage during the design process. Serverless architecture seems ideal for Machine Learning because we get:

  • No administration over infrastructure
  • Automatic resource scaling
  • Payment only for what we use

If you are convinced by these three basic arguments — keep going. In this article, we will build a simple HTTP API based on serverless architecture. For this purpose, we will use:

  • A model built with PyCaret, an AutoML library
  • The Serverless Framework, to automate the architecture build
  • AWS Lambda and API Gateway, to provide an endpoint for our solution

Let’s go!

Long story short

Let’s assume that the CEO of a growing startup, whose infrastructure is based entirely on AWS, wants to be our client. The business value of `AirRent` is helping landlords carry out the rental process on Airbnb. They want a quick proof of concept to check whether it is possible to estimate the rental price of a flat, plus a maintenance fee, for real estate in Amsterdam. At the moment, the client doesn’t care about the model itself, only about the costs and how quickly the PoC can be delivered. We take the job and draw up a minimum plan:

  • Create a simplified Machine Learning experiment with EDA, Feature Engineering, Model Selection, and Explainability based on AutoML solutions
  • Use the Serverless Framework to develop the entire pipeline for a serverless machine learning architecture
  • Implement an AWS Lambda function with a dedicated container for making price predictions

Full source code is available on GitHub.

PyCaret AutoML

PyCaret Modules

If you are not familiar with PyCaret, a library designed for low-code machine learning, I highly recommend checking the articles by Moez Ali, the main creator of this great framework.

Automate Exploratory Data Analysis

Let’s assume that we would like to download the data and create a simple dashboard for Exploratory Data Analysis. For this purpose, one of the best solutions is Pandas-Profiling.

The script downloads data from insideairbnb.com, saves it, and generates a basic EDA report. Now we can analyze basic information about the data and decide which parts are worth using later in the experiment. Goes pretty fast, right?

Output from Pandas Profiling in HTML widget

We can conclude that many of the variables are useless. After all, a typical user doesn’t know the number of reviews or reviews_per_month, so we have to delete this information from the data. Additionally, the price has extreme values between 0–25 and 500–10,000. Since our focus is the PoC, we decided to drop those extreme records. Furthermore, we should ask whether latitude and longitude are good enough features to describe position in feature space. In my opinion, no, which is why we will prepare three additional attributes based on trigonometric functions (inside base_processing).
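The exact transform lives in base_processing in the repo; a common choice for deriving three trigonometric attributes from latitude/longitude, and a plausible sketch of it, is to project the coordinates onto the unit sphere:

```python
import numpy as np
import pandas as pd

def add_position_features(df: pd.DataFrame) -> pd.DataFrame:
    """Derive three trigonometric position features from latitude/longitude.

    Projecting the coordinates onto a unit sphere yields bounded features
    (each in [-1, 1]) that describe position better than raw degrees. This
    mirrors the idea of base_processing, though the exact transform in the
    repo may differ.
    """
    lat = np.radians(df["latitude"])
    lon = np.radians(df["longitude"])
    out = df.copy()
    out["pos_x"] = np.cos(lat) * np.cos(lon)
    out["pos_y"] = np.cos(lat) * np.sin(lon)
    out["pos_z"] = np.sin(lat)
    return out
```

Because the three coordinates lie on a sphere, Euclidean distance between two rows now reflects actual geographic proximity, which distance-based models can exploit.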

Automate Machine Learning Model Development

We’ve come to the modeling part; as I said, we will use a semi-automated mechanism from the PyCaret library. First, we drop unnecessary columns and create multiple features inside the setup function. I really encourage you to look at the PyCaret documentation and play with the different settings. To clarify: we create polynomial and trigonometric features and interactions between numeric columns, combine rare categorical levels, and at the end select the most important features with the Boruta algorithm. Finally, we compare multiple models, choose the best one, optimize its hyperparameters, and save the final model to S3. Of course, we want some level of understanding, so we will generate a few plots to interpret our model.

Model Interpretation

Not so good… our model cannot achieve acceptable results, and furthermore it sometimes underestimates the price of apartments. Interestingly, of the five most important features, three were generated manually. The power of human thinking is invaluable. But remember, this is only an example on a very unusual and challenging dataset, with little work put in.

After seeing the first results, the CEO decides to deploy the solution anyway, because he has a presentation with investors. A startupper’s life is not an easy one 😂

Interpretation of the first experiment results

Serverless Framework and AWS

Serverless Framework

Serverless Framework is an open-source tool for automating the deployment of cloud infrastructure. If you are not familiar with it, feel free to look at the available courses and tutorials. Basically, the framework makes it possible to prototype and develop cloud applications really fast; compare it to training a Deep Learning model on a GPU instead of CPUs. Below I describe the architecture diagram for our model. Because a Lambda layer can only hold ~250 MB, I decided to create a container with a custom image where PyCaret is installed. Why not? With the Serverless Framework it’s fairly easy to deploy a Lambda with a custom container, no matter whether we want to use PyCaret, TensorFlow, or PyTorch.

Architecture Diagram

Lambda handler

The main lambda handler contains 4 lines of code — isn’t it beautiful? We decided to use aws_lambda_powertools to handle custom input/output rules. In the beginning, the PyCaret model is loaded from S3. Next, we create data models to validate incoming events. Finally, the main handler performs the inference.
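Stripped of the aws_lambda_powertools wiring and the S3 download, the handler’s shape is roughly the following. The stub model and its price formula are invented for illustration; in the real function the PyCaret pipeline is loaded once at cold start:

```python
import json

class _StubModel:
    """Stand-in for the PyCaret pipeline loaded from S3 at cold start."""

    def predict(self, features: dict) -> list:
        # Hypothetical formula, purely for illustration.
        return [50 + 2 * features.get("accommodates", 1)]

# Loaded at module import time, so warm invocations reuse it.
model = _StubModel()

def handler(event, context):
    """Lambda entry point: parse the request body, run inference, return JSON."""
    payload = json.loads(event["body"])
    prediction = model.predict(payload)
    return {
        "statusCode": 200,
        "body": json.dumps({"price": prediction[0]}),
    }
```

In the real handler, aws_lambda_powertools data models replace the bare json.loads call, rejecting malformed events before they reach the model.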

Docker container

As I said, a Lambda layer can only hold about 250 MB, so we have to create a custom Docker container with PyCaret preinstalled. Thanks to containerization, we can include all the necessary libraries and custom code. Remember to put the lambda handler inside the LAMBDA_TASK_ROOT directory. We don’t need to build and push the container to ECR ourselves; the Serverless Framework will do it for us (if we have the appropriate permissions).
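A minimal Dockerfile for such an image, assuming the handler from the repo lives in a file called handler.py, could look like this:

```dockerfile
# AWS's public base image for Python Lambdas; it defines LAMBDA_TASK_ROOT.
FROM public.ecr.aws/lambda/python:3.8

# Preinstall PyCaret into the image, side-stepping the ~250 MB layer limit.
RUN pip install --no-cache-dir pycaret

# The handler module must live inside LAMBDA_TASK_ROOT.
COPY handler.py ${LAMBDA_TASK_ROOT}

# module.function that the Lambda runtime invokes.
CMD ["handler.handler"]
```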

Serverless Configuration File

It’s fairly easy to build infrastructure with Serverless. We have two main sections of interest:

  • provider — holds all stack-level information; here we define the runtime, environment variables, the IAM policy, and the Docker image name that will be pushed to ECR.
  • functions — defines the Lambda functions; we can set the timeout, memorySize, and the events that wake our Lambda up.
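Putting the two sections together, a plausible serverless.yml for this stack might look like the following. The service name, region, bucket, and image names are all made up for illustration; the repo’s actual file may differ:

```yaml
service: airrent-price-api

provider:
  name: aws
  region: eu-west-1                      # hypothetical region
  ecr:
    images:
      pycaret-image:
        path: ./                         # directory containing the Dockerfile
  environment:
    MODEL_BUCKET: airrent-models         # hypothetical bucket holding the model
  iam:
    role:
      statements:
        - Effect: Allow
          Action: s3:GetObject
          Resource: arn:aws:s3:::airrent-models/*

functions:
  predict:
    image:
      name: pycaret-image                # references provider.ecr.images above
    timeout: 30
    memorySize: 2048
    events:
      - httpApi:
          path: /predict
          method: post
```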

Finally, after local testing (I’m using the AWS Toolkit for PyCharm), we simply type sls deploy and that's it! The stack should deploy successfully.

Summary

In the end, we can test our Lambda inside the AWS console. The first call may take longer because the model has to be downloaded from S3, but subsequent calls have a billed duration of around 250–300 ms. This means we can call our Lambda 100,000 times and pay around $3. I think it is worth doing!
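As a sanity check on that figure, here is the back-of-envelope arithmetic. The rates are the public x86 Lambda prices at the time of writing, and the memory size is an assumption; the exact total scales linearly with the memory you configure:

```python
# Public x86 Lambda rates (us-east-1) at the time of writing.
GB_SECOND_PRICE = 0.0000166667    # USD per GB-second of compute
REQUEST_PRICE = 0.20 / 1_000_000  # USD per request

def lambda_cost(calls: int, duration_s: float, memory_gb: float) -> float:
    """Total USD cost for `calls` invocations of given duration and memory."""
    compute = calls * duration_s * memory_gb * GB_SECOND_PRICE
    requests = calls * REQUEST_PRICE
    return compute + requests

# 100,000 calls at ~300 ms with an assumed 2 GB function:
print(round(lambda_cost(100_000, 0.3, 2.0), 2))  # → 1.02
```

Even with a much larger memory setting (a PyCaret container may well need more than 2 GB), 100,000 calls stay in the low single dollars, consistent with the ~$3 estimate above.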

Maybe the model is not good enough to push to a production environment, but with more data and more analysis we could create a better model that gives reliable price forecasts. Importantly, we used a semi-automated process to deploy a Machine Learning model with minimum effort and a very good price per prediction (a good metric for machine learning).

Thank you for reaching the end of the article! I think serverless could help in many projects where we have to deploy a Machine Learning model to do one specific task. After this article, you should have a basic understanding of how to deal with large Lambda layers, the Serverless Framework, and PyCaret’s functionality.

If you liked it, leave a clap or tweet at @BodzionyRafal to say hello ✌️

Data Scientist focused on solving real-world problems with Machine Learning. Recent research includes Time Series Forecasting and Image Recognition