Exposing your data science project to the world — Flask with Waitress!

Teena Jain
Brillio Data Science
6 min read · Mar 8, 2019

In this article, I’ll show how to expose your ML model through an API for integration with your web or any other application using open source libraries — Flask and Waitress.

Flask and Waitress: production-ready model deployment for Windows

“Jack of all trades, master of none, though oft times better than master of one.”

One of the common pain points we have come across in organizations is the last-mile delivery of data science applications. One common delivery vehicle is to create business intelligence (BI) reports. But the option that is very useful, and neglected more often than not, is to create APIs and provide seamless integration with other applications within or outside the company. This requires a basic understanding of machine learning, server-side programming and front-end applications.

In this article, the focus will be on the last mile of delivering insights, i.e. model deployment, which is where data scientists most often get stuck.

What & Why of Flask?

Flask is a web framework. This means Flask provides you with tools, libraries and technologies that allow you to build a web application. This web application can be a few web pages, a blog, a wiki, or grow as big as a web-based calendar application or a commercial website.

Why is Flask a good web framework choice?

Building an app with Flask is a lot like writing standard Python modules, except some functions have routes attached to them. This makes it easy for a beginner to get started, because there is very little boilerplate code needed to get a simple app up and running.

It’s easier to show than tell, so let’s start with a really simple “Hello, world!” web application with Flask.

WOW! BUILDING MY FIRST FLASK APP

Building simple “Hello World” application with Flask
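
In rough outline, the app looks something like this (a minimal sketch; the exact code in the original screenshot may differ slightly, and the function name hello is illustrative):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def hello():
        # Respond to requests on the root URL with a simple greeting
        return "Hello, world!"

    if __name__ == "__main__":
        # Start Flask's built-in development server (default port 5000)
        app.run()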

Make sure to install the Flask library before running the above code.

Have you realized that you have already created your first Flask app? 😄

Still thinking…? Let me explain.

Just save the above code snippet as hello.py (or something similar), paste the below command into the command prompt (on a Windows machine) and hit Enter.
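
Assuming the file is saved as hello.py and ends with the app.run() call shown above, the command is simply:

    python hello.py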

Now head over to http://127.0.0.1:5000/ in your browser, and you should see your hello world greeting.

Flask app running on localhost port 5000 (the default port)

Yes, it was that simple!!

Note: Make sure not to name your application flask.py, because this would conflict with Flask itself.

So what happened behind the scenes?

Flask includes a very simple built-in web server, which is good enough for development and testing. With it you can make your app accessible on your local machine without having to set up other services and make them play together nicely. However, it is only meant to be used by one person at a time, and is built this way.

When running a web app in production, you want it to be able to handle multiple users and many requests, without making people wait noticeable amounts of time for pages to load.

What now? Well, no need to be confused. All is fine, you just need a production ready web server which can serve your Flask app.

A Production Stack

If you want to run Flask in production, be sure to use a production-ready web server and let your app be handled by a WSGI application server such as Gunicorn (Unix) or Waitress (Windows).

You will find several tutorials on Google for deploying a Flask app in production using Gunicorn, but Gunicorn does not run on Windows.

There are a number of ways to deploy a Flask app on Windows, but my favorite is Waitress, as it scales well and can handle multiple requests at a time.

Waitress WSGI Server

What is it?

Waitress is a pure-Python WSGI server. At first glance it might not appear to be much different from many others; however, its development philosophy separates it from the rest. It aims to ease the production (and development) burden that web servers place on Python web-application developers. Waitress achieves this by neutralizing issues caused by differences in platform (e.g. Unix vs. Windows), interpreter (CPython vs. PyPy) and Python version (2 vs. 3).

Why should you consider using it?

  • It is a very lean, pure-Python solution.
  • It supports HTTP/1.0 and HTTP/1.1 (Keep-Alive).
  • It comes ready to be deployed for production with a wide array of platform support.
  • It is framework-independent in its nature.
  • It runs on Windows and Unix.
  • It supports Python versions 2 and 3.

Isn’t this GREAT!!

Now let's get our hands dirty.

All set to build and deploy more complex app!!

Setup:

Install Flask and Waitress by running the following commands in the command prompt (assuming pip is available):
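
    pip install flask
    pip install waitress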

Data-set and model training

Now let's start by creating a really simple data science model. Here, we take the classic iris data-set and create a linear model that predicts petal length from petal width.

As this is a data science article, I assume the reader already has basic data science packages like sklearn, pandas etc. installed. Now that we have all the required packages, we can start building a model.

  • First, we create a script that loads the data and trains the model. The script below does that:
Python code to load data and train linear model
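
A sketch of what that script could look like (the column names come from sklearn's built-in iris data-set; the exact code in the original screenshot may differ slightly):

    import pandas as pd
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LinearRegression

    # Load the classic iris data-set into a pandas DataFrame
    iris = load_iris()
    df = pd.DataFrame(iris.data, columns=iris.feature_names)

    # Train a linear model that predicts petal length from petal width
    model = LinearRegression()
    model.fit(df[["petal width (cm)"]], df["petal length (cm)"])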

Save it as predict_petal_length_controller.py

Now that we have trained the model, we can try testing it.

  • Set up a data frame containing one element — a petal width of 2. Then, run the model we saved on our input data to get the result.
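
For example (a sketch that reuses the model object trained above):

    # A one-row data frame holding the petal width we want a prediction for
    test_input = pd.DataFrame({"petal width (cm)": [2]})

    # Ask the trained model for the corresponding petal length
    print(model.predict(test_input))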

The output will be

array([5.54343902])

Awesome! Unfortunately, to do this we had to manually alter the Python code to pass '2'. What if the people running our model don't want to learn Python? Or maybe don't want to learn to code at all?

In that case, we can create a user-defined function and call that function in our Flask app.

  • Add the below code to the existing predict_petal_length_controller.py:
User defined function to predict petal length
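
A sketch of such a function (the name predict_petal_length is assumed; it simply wraps the model trained above):

    def predict_petal_length(petal_width):
        # Wrap the user-supplied value in a one-row data frame and predict
        features = pd.DataFrame({"petal width (cm)": [float(petal_width)]})
        return model.predict(features)[0]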

Your final predict_petal_length_controller.py will look like this:

predict_petal_length_controller.py
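
Putting the pieces together, a sketch of the complete module (the model is trained once at import time, so the Flask app can simply import and call predict_petal_length):

    import pandas as pd
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LinearRegression

    # Load the iris data-set and train the model once, at import time
    iris = load_iris()
    df = pd.DataFrame(iris.data, columns=iris.feature_names)

    model = LinearRegression()
    model.fit(df[["petal width (cm)"]], df["petal length (cm)"])

    def predict_petal_length(petal_width):
        # Wrap the user-supplied value in a one-row data frame and predict
        features = pd.DataFrame({"petal width (cm)": [float(petal_width)]})
        return model.predict(features)[0]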

Deploying model with Flask & Waitress

Finally, we need to set up our Flask app and run it using waitress.

  • To do that, create a new Python file, save it as "WebApp.py" and place it in the same folder where predict_petal_length_controller.py is located. Now copy the below code into that file.
Python code to build Flask App running with Waitress server
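
A sketch of what WebApp.py might contain (the route name matches the URL used below; returning an explicit message when petal_width is missing is an assumption on my part):

    from flask import Flask, request
    from waitress import serve

    from predict_petal_length_controller import predict_petal_length

    app = Flask(__name__)

    @app.route("/")
    @app.route("/predict_petal_length")
    def predict_petal_length_endpoint():
        # Read the petal_width query parameter from the URL
        petal_width = request.args.get("petal_width")
        if petal_width is None:
            # No parameter supplied, so explain what is missing
            return "Error: please provide a petal_width query parameter", 400
        # Run the model and return the prediction as plain text
        return str(predict_petal_length(petal_width))

    if __name__ == "__main__":
        # Serve the Flask app with Waitress instead of the development server
        serve(app, host="0.0.0.0", port=8000)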
  • Now run WebApp.py (for example, with python WebApp.py from the command prompt).
  • To hit your API, open the browser at http://127.0.0.1:8000/ (8000 is the port we opened in Waitress; you can change it). Now you should see:

This is expected! The error code tells us exactly why, though: we're missing the petal width, and the model can't run on nothing. So let's add our petal width parameter. This is done by adding a '?' at the end of the address bar, followed by the parameter name, an equals sign, and the value of that parameter. Let's pass a petal width of 2. Now our browser is pointing at http://127.0.0.1:8000/predict_petal_length?petal_width=2

BOOM! In your browser you should see the predicted petal length (about 5.54).

This is the same output we got from the Python IDE. Congratulations! You've made your first Python model and deployed it using Flask, served by Waitress, in a production-ready way.

Now you can use your model as an API and integrate it with any other application, or create a UI (simple with Flask) for users to interact with your model in the browser.

Thank you for making it this far. Try it yourself and comment below if you face any challenges or have any feedback. Happy learning!!
