Publishing a TensorFlow model on AWS Lambda to make a sellable API

In Data Science, a model that works in production is a real achievement. Suppose you want to sell that model to a financial business or a promising startup through an API subscription: AWS Lambda and API Gateway are an excellent choice. Once the model runs on a serverless architecture, deploying it to production and scaling it out to thousands of users is largely taken care of. Unlike deploying on an EC2 instance, it costs you nothing when there is no usage.

Why can't I use TFLite with AWS Lambda?

TFLite has a small footprint, so what is the problem with using it from Python in a Lambda function? It comes down to native builds for different processors, OS platforms, and customized kernels. AWS Lambda runs on its own customized Linux, so the pre-built TFLite packages from the community don't work there.

What can I do to make one of them work?

Let's take TensorFlow as the case. Build it yourself to fit your needs; that's all you have to do. Building TensorFlow Lite from source on the Amazon Linux platform produces a binary compatible with the AWS Lambda runtime. Building a native, cross-platform library from source on my own PC was a nightmare back in 2000. Now, with Docker and the community, you are lucky: it is done easily within 15 minutes. Let's create a Dockerfile:

FROM amazonlinux
WORKDIR /tflite
RUN yum groupinstall -y development
RUN yum install -y python3.7
RUN yum install -y python3-devel
RUN pip3 install numpy wheel pybind11
RUN git clone --branch v2.3.0 https://github.com/tensorflow/tensorflow.git
RUN sh ./tensorflow/tensorflow/lite/tools/make/download_dependencies.sh
RUN sh ./tensorflow/tensorflow/lite/tools/pip_package/build_pip_package.sh
RUN pip3 install tensorflow/tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.3.0-cp37-cp37m-linux_x86_64.whl
CMD tail -f /dev/null
docker build -t tflite_amazonlinux .
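
The wheel is built inside the image, so it still has to be copied out before it can go into the Lambda layer. A minimal sketch, assuming a container name tflite_build and a local layer folder (both names are mine, not from the build above); the CMD tail -f /dev/null at the end of the Dockerfile keeps the container alive so you can copy from it:

docker run -d --name tflite_build tflite_amazonlinux
docker cp tflite_build:/tflite/tensorflow/tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.3.0-cp37-cp37m-linux_x86_64.whl ./layer/
docker rm -f tflite_build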

A Pre-built TFLite library for AWS Lambda:

Simplify CLI offers a tool to create a serverless project and manage the deployment and its layers gracefully. Now, let's create a Lambda function with “simplify-cli”.

npm install -g simplify-cli         # install serverless framework
mkdir tensorflow-lite               # create a project folder
cd tensorflow-lite                  # enter this project folder
simplify-cli init -t python         # generate a python project
  • numpy (1.19.1)
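
For orientation, the generated project ends up with roughly this layout. Only the configuration file, the src folder, and the layer folder are implied by the settings and commands in this article; the exact structure produced by simplify-cli may differ:

tensorflow-lite/
  .env     # deployment configuration (the block below)
  src/     # function code: main.py exposing the handler (FUNCTION_HANDLER=main.handler)
  layer/   # layer content: the tflite_runtime wheel and its dependencies,
           # deployed with "simplify-cli deploy --layer --source layer"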
### - Application Deployment
DEPLOYMENT_ENV=demo
DEPLOYMENT_BUCKET=tensorflow-deployment-37216
DEPLOYMENT_REGION=eu-central-1
DEPLOYMENT_ACCOUNT=your-aws-account
DEPLOYMENT_PROFILE=your-aws-profile
### - Application StackName
PROJECT_NAME=TensorFlowTest
### - Backend Serverless Lambda
FUNCTION_NAME=detectObject
FUNCTION_RUNTIME=python3.7
FUNCTION_HANDLER=main.handler
FUNCTION_SOURCE=src
simplify-cli deploy                           # deploy the detectObject function
simplify-cli deploy --layer --source layer    # deploy the python layer from the layer folder
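
For completeness, here is a minimal sketch of what src/main.py behind FUNCTION_HANDLER=main.handler could look like. It assumes the tflite_runtime wheel is available through the layer and that a model file named model.tflite (a hypothetical name) is bundled with the function code; the input/output format is an assumption as well:

import json
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the model once per container so warm invocations reuse it.
interpreter = tflite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def handler(event, context):
    # Expect the input tensor as a JSON array in the request body (assumed format).
    body = json.loads(event.get("body") or "{}")
    data = np.array(body["input"], dtype=np.float32).reshape(input_details[0]["shape"])
    interpreter.set_tensor(input_details[0]["index"], data)
    interpreter.invoke()
    result = interpreter.get_tensor(output_details[0]["index"])
    return {"statusCode": 200, "body": json.dumps({"prediction": result.tolist()})}

Loading the interpreter outside the handler keeps the model load in the cold start, so warm calls only pay for inference.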

Leave a comment if you need a version of “tflite_runtime” for NodeJS on AWS Lambda.

Follow my articles



Dinh-Cuong DUONG

(MSc) Cloud Security | Innovator | Creator | FinTech CTO | Senior Architect.