Beyond Jupyter Notebooks

Part 2: Non-ML Model deployment with Streamlit and Docker

Greg Jan
Nerd For Tech
4 min read · Feb 3, 2021


Welcome to Part 2. In this article you will learn how to deploy a very simple non-machine-learning model with Streamlit and build a container with Docker, building on the work from Part 1.

More precisely, we will:

  • Create a web app serving our model
  • Containerize our web app

We will not create production-ready code for our non-machine-learning model or test it, since those steps were covered in Part 1.

Streamlit app

Streamlit is an open-source Python library that makes it easy to create and share beautiful, custom web apps for machine learning and data science (their own words!). Personally, I find it very convenient for quickly building a good-looking interactive app. The downsides are scalability (think high volume and high throughput of data) and complexity (for those cases I would use Dash instead).

First, let’s install Streamlit into our conda environment:
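With the conda environment from Part 1 activated, the install is a single pip command (conda-forge also packages Streamlit, but pip is the simplest route):

```shell
# Install Streamlit into the currently active conda environment
pip install streamlit
```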

Now we can write our code for the app itself:

As you can see, this is much simpler and neater than a Flask app (see Part 1). It breaks down into four parts:

  1. Import streamlit and load our Modeler class (again see Part 1 for the details)
  2. Set the title of your app
  3. Make a slider to select a value x
  4. Print the output of our predict function from the Modeler class with x as input. Streamlit implicitly uses a callback whose role is to automatically call predict whenever the input component’s value changes.

Is that it? Yes, pretty much! To launch the app, just run in your terminal:
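Assuming the app code above is saved as app.py (the filename used later in the Dockerfile's CMD line), launching it is one command:

```shell
# Start the Streamlit development server; it prints a local URL to follow
streamlit run app.py
```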

So now you should see something like this:

Follow the first URL and it will open your app:

I believe 20² = 400 so the app is working as expected.

Containerizing with Docker

Now all we need is a Dockerfile to configure our container (assuming Docker is installed). You can place the following file at the root of your project folder:
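A sketch of the Dockerfile, reconstructed from the description below (the exact file is in the GitHub repo linked at the end; the Streamlit server configuration lines here are a typical setup and an assumption on my part):

```dockerfile
FROM python:3.7-slim

# Update the base image with the latest packages
RUN apt-get update && apt-get upgrade -y && rm -rf /var/lib/apt/lists/*

# Set up Streamlit on the container (headless config assumed)
RUN mkdir -p /root/.streamlit && \
    printf "[server]\nheadless = true\nport = 8501\n" > /root/.streamlit/config.toml

# Install the same Python library versions as the local environment
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy all the project files and folders into the container
COPY . /app
WORKDIR /app

EXPOSE 8501

# Launch the web app
CMD ["streamlit", "run", "app.py"]
```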

The base image used here is python:3.7-slim (which is Debian-based). The RUN commands update the base image with the latest packages; as you can see, the first ones are dedicated to setting up Streamlit on the container. Two critical lines in the Dockerfile copy the requirements.txt file and install all the Python libraries it lists. This means the container will use the same library versions as your local Anaconda environment. To create this file, run the following in a terminal (with your Anaconda environment activated):
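The standard way to export the environment's exact package versions is pip freeze:

```shell
# Write the active environment's exact package versions to requirements.txt
pip freeze > requirements.txt
```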

The COPY line then copies all the files and folders into the container. Finally, the CMD line launches the web app by running Streamlit on the app.py script.

OK, now we are all set. Let’s build that container image:
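From the project root, the build is one command (the image name streamlitapp is my placeholder; pick any name you like):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t streamlitapp .
```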

It may take a bit of time depending on the speed of your connection. But once it is done you can launch the container:
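Running the container just needs the port mapping, since Streamlit listens on 8501 by default (same placeholder image name as in the build step):

```shell
# Map Streamlit's default port 8501 in the container to the host
docker run -p 8501:8501 streamlitapp
```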

And check your app by typing localhost:8501 in your browser.

Again 50² = 2500 seems about right.

Now we could push this container image to any cloud vendor such as GCP, Azure, or AWS, or use Streamlit’s own deployment service, Streamlit sharing, and make it accessible to the world. That is not covered here.

Even though our model was very simple, we ended up with a nice-looking and functional web app. Once again, Jupyter Notebooks were not used at all; instead, we used production code and configuration files.

You can access all the code in my GitHub repo: https://github.com/GregoireJan/xstreamlit

In the next article we will build our first machine learning model, together with its Flask web app!

Here are the parts of the series Beyond Jupyter notebooks:
