Show Off Your Machine Learning Project on the Web Part 2: Dockerize Your Streamlit App

A Docker a day keeps your boss away.

Lu-Hsuan Chen
Analytics Vidhya
2 min read · Jun 29, 2020


Visit Part 1 and Part 3 by clicking the links.

Use this friend link to share this post with unlimited access.

TL;DR You can get the example project here.

Dockerize your app and say goodbye to environment setup time!

Motivation

So you have finally finished your machine learning project and built a Streamlit app to present it, but it feels tedious to create a virtual environment every time you move to a new computer. What’s worse, sometimes your virtual environment breaks because you are using a different OS.

Fortunately, we can use Docker to ease the pain of setting up and deploying environments, and in the process you create a microservice that is well suited for scaling.

Goal of This Part

In this part, I will show how to create a working Streamlit + OpenCV Docker image and demonstrate some basic Docker usage.

Dive in with an Example

Long story short, let me show the Dockerfile of my Streamlit + OpenCV machine learning project:
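
(The full Dockerfile lives in the example repository linked above, and the line numbers mentioned in the reminders below refer to that file. The sketch here is only a minimal reconstruction of its shape, assuming the Streamlit script is named app.py and the Python dependencies are listed in requirements.txt; both are placeholder names.)

    # A minimal sketch of a Streamlit + OpenCV Dockerfile (not the exact file from the repo).
    FROM python:3.7-slim

    MAINTAINER Lu-Hsuan Chen

    # Example environment variable (an assumption, not from the original file).
    ENV PYTHONUNBUFFERED=1

    # System library that OpenCV needs at import time; without libsm6 the
    # container can crash with "segmentation fault (core dumped)".
    RUN apt-get update && \
        apt-get install -y libsm6 libxext6 libxrender-dev && \
        rm -rf /var/lib/apt/lists/*

    WORKDIR /app

    # requirements.txt (placeholder name) should pin streamlit and opencv-python.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Streamlit needs these config files to start non-interactively inside a container.
    RUN mkdir -p /root/.streamlit && \
        printf '[general]\nemail = ""\n' > /root/.streamlit/credentials.toml && \
        printf '[server]\nheadless = true\nenableCORS = false\n' > /root/.streamlit/config.toml

    COPY . .

    EXPOSE 8501

    # app.py is a placeholder for your Streamlit entry point.
    CMD ["streamlit", "run", "app.py"]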

Some Dockerfile Commands, Simplified

  1. FROM: Get a base image, just as you need an OS as the basis of your application.
  2. MAINTAINER: Record the author of this image.
  3. ENV: Set an environment variable to a certain value.
  4. RUN: Execute commands while building the image.
  5. EXPOSE: Declare the port(s) the container listens on at runtime.
  6. WORKDIR: Set the working directory.
  7. COPY: Copy new files/directories into the image.
  8. CMD: Provide the default command to run when the container starts.

For more information, visit the official Dockerfile reference.

Some Reminders about This Dockerfile

  1. The purpose of lines 13~16 is to let Streamlit run normally in Docker; otherwise, Streamlit will fail to launch.
  2. The purpose of line 24 is to let OpenCV run normally in Docker. I ran into an issue where, if you do not install libSM.so.6 in the image and then import OpenCV, the container silently crashes with the message segmentation fault (core dumped) when executed. You can see this kind of issue here.

Build and Execute This Container

Use the following commands to build and execute this container:
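
(The original post embeds these commands as a gist; the sketch below assumes the image is tagged streamlit-opencv-app and that the Dockerfile sits in the current directory, both placeholder choices.)

    # Build the image from the Dockerfile in the current directory.
    docker build -t streamlit-opencv-app .

    # Run the container in the background and publish Streamlit's port 8501 to the host.
    docker run -d --name streamlit-opencv -p 8501:8501 streamlit-opencv-app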

After that, type localhost:8501 in your browser's address bar, and the content of the app will show up.

To shut down the container, type:
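
(Again a sketch rather than the author's exact commands, reusing the container name assumed above.)

    # Stop and remove the container started above.
    docker stop streamlit-opencv
    docker rm streamlit-opencv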

Wrapping Up

In this part, I showed the Docker configuration of this project and described two problems I ran into while building and running the container.

Originally published at https://cuda-chen.github.io on June 29, 2020.

If you have any thoughts and questions to share, please contact me at clh960524[at]gmail.com. Also, you can check my GitHub repositories for other works. If you are, like me, passionate about machine learning, image processing, and parallel computing, feel free to add me on LinkedIn.
