How to make a Dockerfile, build a Docker Image, and run a Docker Container — Part 2 of 2
This how-to article is about making a Dockerfile, building a Docker Image, and running a Docker Container.
If you haven’t already installed Docker on your system, please follow Part 1 of 2 of this series to install it. There, I’ve spent a good amount of time explaining why we need Docker, especially for a Data Science model, so revisit it if you need a refresher.
What I covered:
- Pre-requisites
a. Virtual Environment
b. Python file (.py) & API of a Data Science Model
c. requirements.txt
- Making a Dockerfile
- Building Docker Image
- Running Docker Container
- What are you waiting for?
- How to reach FIO Labs:
- References
As you can see from the diagram above, when the Dockerfile is built it becomes a Docker Image, and when we run the Docker Image it becomes a Docker Container. To understand these three terms, keep reading. From here on, we assume you’ve already built your Data Science model or project.
Pre-requisites
For this demo, we need the following files ready:
- Virtual Environment for your project
- Python file & API of a Data Science Model
- requirements.txt
Virtual Environment
When you are creating a project, be sure to isolate the project’s dependencies by working in a Virtual Environment. You need pip installed for that. To create a Virtual Environment for your project, follow the three steps below:
1. Go to your project folder
cd myproject/
2. Install the package which helps you to create a virtual environment
pip install pipenv
3. Create a virtual environment.
pipenv shell
This will create and activate a virtual environment specifically for your Data Science model/web app.
Python file (.py) & API of a Data Science Model
Since this article doesn’t concentrate on how to create a Data Science model, I’m not going to discuss the steps involved in much detail. In short, the dataset fed into the model contains employee profiles of a large company, where each record is an employee, and the model predicts which employees have left and which have stayed. The API for this model is built using FastAPI.
You can copy the model and API code from here.
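To give you a feel for what that code looks like, below is a minimal sketch of such a FastAPI service. It is not the exact code from the link above: the model file name, feature names, and endpoint are illustrative assumptions, so adapt them to your own project.
# A minimal sketch of an attrition-prediction API (illustrative only).
# Assumes a trained classifier saved as model.pkl and example feature names.
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI()
model = joblib.load("model.pkl")  # hypothetical path to the trained model

class Employee(BaseModel):
    satisfaction_level: float      # illustrative features; the real model's
    average_monthly_hours: int     # inputs are defined in the linked code
    time_spend_company: int

@app.post("/predict")
def predict(employee: Employee):
    features = [[employee.satisfaction_level,
                 employee.average_monthly_hours,
                 employee.time_spend_company]]
    prediction = int(model.predict(features)[0])
    return {"left": prediction}    # 1 means the employee left, 0 means stayed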
requirements.txt
Once you’re done building all the models for your project, go to the project folder and run the following command (inside the virtual environment, so only your project’s packages are captured) to generate the requirements.txt file:
pip freeze > requirements.txt
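For a FastAPI-based project like this one, the generated file will look something like the snippet below. The exact packages and versions depend on what you installed, so treat these entries as illustrative only:
fastapi==0.63.0
pandas==1.1.5
scikit-learn==0.24.1
uvicorn==0.13.4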
Making a Dockerfile
“Docker can build images automatically by reading the instructions from a Dockerfile. A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.”
On to the interesting part, let’s start coding!
Go to your project folder and create a Dockerfile:
touch Dockerfile
Now, in the Dockerfile, write the following code.
# Use the official image as a parent image.
FROM tiangolo/uvicorn-gunicorn:python3.6-alpine3.8
A valid Dockerfile must always start with a FROM instruction. It initializes a new build stage and sets the Base Image for subsequent instructions. Here, we’re using an Alpine-based image, which is recommended because it is small in size. More about Docker base images can be found in the Docker documentation.
LABEL MAINTAINER="Vidya P <vidya@fiolabs.ai>"
You can add labels to your image to help organize images by project, record licensing information, aid in automation, or for other reasons. Here, I am adding myself as the MAINTAINER of this Dockerfile.
# Make directories suited to your application
RUN mkdir -p /home/project/app
WORKDIR /home/project/app
The RUN instruction executes any command in a new layer on top of the current image and commits the results. The resulting committed image is used for the next step in the Dockerfile. In the lines above, we’re creating a directory for our application to run in.
The WORKDIR instruction sets the working directory for the instructions that follow it (RUN, COPY, CMD, and so on). This makes sure we’re running our application in the intended directory.
# Copy and install requirements
COPY requirements.txt /home/project/app
RUN pip install --no-cache-dir -r requirements.txt
The COPY instruction copies new files or directories from the source (in our case, the requirements.txt file) and adds them to the filesystem of the container at the path /home/project/app. Once the required file is copied, we install all the packages listed in requirements.txt using the RUN instruction.
# Copy contents from your local to your docker container
COPY . /home/project/app
Save the Dockerfile.
Full code for the Dockerfile:
# Use the official image as a parent image.
FROM tiangolo/uvicorn-gunicorn:python3.6-alpine3.8

LABEL MAINTAINER="Vidya P <vidya@fiolabs.ai>"

# Make directories suited to your application
RUN mkdir -p /home/project/app
WORKDIR /home/project/app

# Copy and install requirements
COPY requirements.txt /home/project/app
RUN pip install --no-cache-dir -r requirements.txt

# Copy contents from your local to your docker container
COPY . /home/project/app
That concludes making the Dockerfile. Now let’s build it into an image.
Building Docker Image
Navigate to the folder where your Dockerfile is located and run:
docker build -t myimage ./
The output will show each step of the Dockerfile being executed in order and, if everything goes well, end with a message saying the image was built and tagged successfully.
Running Docker Container
Start a container based on your new image.
docker run -d --name mycontainer -p 80:80 myimage
- -d runs the container in detached mode, i.e. in the background.
- --name specifies a name with which you can refer to your container in subsequent commands, in this case mycontainer.
- -p 80:80 maps port 80 of your host to port 80 inside the container. The syntax for -p is HOST_PORT:CONTAINER_PORT.
Your API is now ready and can be accessed at http://127.0.0.1/docs.
Below are some screenshots to give you a glimpse of the interactive documentation UI that FastAPI provides.
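If you’d rather verify the API from code instead of the browser, here is a quick smoke test using Python’s requests library. The /predict endpoint and payload reuse the illustrative assumptions from the API sketch earlier, so adjust them to match your own code:
import requests

# The interactive docs should be served at /docs (expect a 200 status code).
print(requests.get("http://127.0.0.1/docs").status_code)

# Call the illustrative /predict endpoint from the earlier sketch.
payload = {
    "satisfaction_level": 0.4,
    "average_monthly_hours": 220,
    "time_spend_company": 5,
}
print(requests.post("http://127.0.0.1/predict", json=payload).json())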
Congratulations! You’ve successfully learned how to make a Dockerfile, build a Docker Image, and run a Docker Container. Let us know if this how-to article helped your Data Science project. If you have any questions, please leave a comment below and we will try to answer them.
What are you waiting for?
We believe FIO Labs never fails to keep its promise when it comes to providing quality services. Our enterprise expertise and industry leadership mean you’re in safe hands.
If you are interested in learning more about what we do at FIO Labs or have some questions about this page, feel free to send us a message to contact@fiolabs.ai — we’d love to hear from you.
How to reach FIO Labs:
Leave a comment below | Book a FREE 30-min session for our on-going Pro Bono Services or Fill in our LinkedIn Form | Contact Us | About FIO Labs | Blog
References
- Dockerfile reference. (2020, April 24). Retrieved from https://docs.docker.com/engine/reference/builder/
- Grootendorst, M. (2019, August 30). How to Deploy a Machine Learning Model. Retrieved from https://towardsdatascience.com/how-to-deploy-a-machine-learning-model-dc51200fe8cf