Deploy Deep Learning Model Using Docker, Tensorflow Serving, Nginx and Flask (Part 3)

Ashish Kumar (TheCyPhy) · Aug 30, 2020 · 6 min read

Hello curious person, it's now time for the final showdown. We have covered a lot so far; now it's time to end our journey and reach the final destination. In this part we will create the two remaining services, i.e. TensorFlow Serving and Nginx.

First, let's set up TensorFlow Serving.

Create a folder named “tensorflow-serving” inside the “services” folder, and inside it create a file named “model.config”. This file is required because we are going to serve two deep learning models from the same Docker container. The “model_config_list” block in “model.config” describes the two models we are going to deploy:

model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model/"
    model_platform: "tensorflow"
  }
  config {
    name: "detect_model"
    base_path: "/models/detect_model/"
    model_platform: "tensorflow"
  }
}

As we can see, we need two trained models. I have already trained them, and you can download them from this link. Now create a file named “Dockerfile.tf”, which will set up our TensorFlow Serving service.

FROM tensorflow/serving
COPY ./my_model/ /models/my_model/
COPY ./detect_model/ /models/detect_model/
COPY ./model.config /models/

Our “Dockerfile.tf” is pretty simple: it pulls the TensorFlow Serving image and copies the trained models and the config file into the container. Please note, I am running the container as the root user for the reason I mentioned in my previous post; for security purposes, it is recommended to run containers as a non-root user.
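Once the container is up, each model becomes addressable by the name declared in “model.config”. As a minimal sketch, assuming TensorFlow Serving publishes its default REST port 8501, the status URL for each model could be built like this (the helper name and host are illustrative assumptions):

```python
# Assumption: TF Serving's default REST API port (8501) is published.
def model_status_url(host, model_name, port=8501):
    """Build the TF Serving REST URL that reports a model's version status."""
    return f"http://{host}:{port}/v1/models/{model_name}"

# The two models declared in model.config become addressable by name:
print(model_status_url("localhost", "my_model"))      # http://localhost:8501/v1/models/my_model
print(model_status_url("localhost", "detect_model"))  # http://localhost:8501/v1/models/detect_model
```

Opening either URL in a browser (with the container running) should report the model version state.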

We are done with our TensorFlow Serving service. It's time to build the final service, i.e. Nginx.

Create a folder named “nginx” inside the “services” folder. The Nginx service will act as the web server for our web application. We could use Flask directly as the web server, but Flask is not suited to handling all types of requests; for better performance and reliability, Nginx is recommended.

Create a file named “nginx.conf” inside the “nginx” folder. This file configures Nginx and also provides the HTTPS connection to our web application.

upstream classify {
    server web:5000;
}

server {
    listen 443 ssl;
    ssl_certificate /certificate/cert.pem;
    ssl_certificate_key /certificate/key.pem;
    error_page 497 https://$host:1337$request_uri;

    location / {
        proxy_pass http://classify;  # upstream name defined above
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host:1337;
        proxy_redirect off;
    }
}

server {
    listen 80;
    server_name classify;
    location / {
        return 301 https://$host:1337$request_uri;
    }
}

As we can see from the “ssl_certificate” and “ssl_certificate_key” directives, we need certificates to provide an HTTPS connection to our web application. So let us create them. I am going to create the certificates on a Windows system. On Linux or Mac the instructions are the same; just skip the two ‘set’ commands at the beginning.

First, download and install OpenSSL for Windows from this link and add the OpenSSL bin folder to your environment variables. Then open a command prompt in any folder and run the commands below:

set RANDFILE="some folder location\.rnd"
set OPENSSL_CONF="path to\openssl.cfg"
openssl req -x509 -newkey rsa:4096 -nodes -out cert.pem -keyout key.pem -days 365

Two files, “cert.pem” and “key.pem”, will be created in your current folder. Copy them into the “nginx” folder; they enable the HTTPS connection for our web application. Please note: because these certificates are self-signed, your web browser will not trust them, and you will get a warning when you open the web application. You can ignore the warning and proceed.

Lastly, we will create “Dockerfile.nginx” for our Nginx service. It overrides the default Nginx configuration with our own and copies in the certificates we created.

FROM nginx:1.17-alpine
RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
COPY cert.pem /certificate/
COPY key.pem /certificate/

We are done with all the services, i.e. Web, TensorFlow Serving and Nginx. Now we have to create a docker-compose file that sets up all these services together and allows each service to communicate with the others.

Create a file named “docker-compose.yml”

Now let us understand our “docker-compose.yml” line by line.

At line 1, we declare the docker-compose version; in our case it is 3.

Lines 4–13 create the tensorflow-serving service. At lines 5–7 we specify the Dockerfile and its location, and at line 9 we specify the model.config file.

Lines 15–23 create the web service. At line 20 we declare a volume named “sql_data”. It is a persistent volume that stores the “data2.db” file, because we don't want to lose our users' information every time we restart the containers. At line 21 we use Gunicorn to serve our Flask app. Gunicorn is a Python WSGI HTTP server and is the recommended way to serve a Flask application.

Lines 24–29 create the nginx service. At line 29 we specify the port exposed to the user, in our case 1337. You can change it to any allowed value as long as it is not already used by another service on your system.

At line 32, we create our persistent volume named “sql_data”.
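In case the embedded gist does not render, here is a sketch of what “docker-compose.yml” could look like, reconstructed from the description above. The build contexts, the model server flag and the exact Gunicorn command line are assumptions, not the author's verbatim file:

```yaml
version: "3"

services:
  tensorflow-serving:
    build:
      context: ./tensorflow-serving
      dockerfile: Dockerfile.tf
    # Assumed flag: point the model server at the multi-model config file
    command: --model_config_file=/models/model.config

  web:
    build:
      context: ./web/flask
      dockerfile: Dockerfile.flask
    volumes:
      - sql_data:/app/database   # persists data2.db across restarts
    # Assumed entry point: Gunicorn serving the Flask app on port 5000
    command: gunicorn --bind 0.0.0.0:5000 manage:app

  nginx:
    build:
      context: ./nginx
      dockerfile: Dockerfile.nginx
    ports:
      - "1337:443"   # user-facing HTTPS port

volumes:
  sql_data: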

Now we are done with the “docker-compose.yml”. The file really looks clean and simple.

Let us now check our folder structure to confirm that every file is in the right place:

services
|---docker-compose.yml
|---nginx
| |---cert.pem
| |---Dockerfile.nginx
| |---key.pem
| |---nginx.conf
|---tensorflow-serving
| |---detect_model
| |---my_model
| |---Dockerfile.tf
| |---model.config
|---web
| |---flask
| | |---database
| | |---static
| | | |---index.png
| | |---Template
| | | |---base.html
| | | |---index.html
| | | |---login.html
| | | |---result.html
| | | |---signup.html
| | | |---start.html
| | |---app.py
| | |---database.py
| | |---dbmodels.py
| | |---Dockerfile.flask
| | |---manage.py
| | |---model.py
| | |---requirements.txt

Note: inside the folders “detect_model” and “my_model” you have to place the object detection model files and the classification model files respectively, which you can download from this link. If your folder structure looks the same, well done: you have completed this project successfully.

Now we have to get this project running. To do that, just go inside the “services” folder and execute one command from the command prompt:

docker-compose up

I am attaching a video below where you can see our web app in action.

In the video above we can see our classification model in action, but what about the object detection model?

We can call our object detection model and get the prediction result back via an API call. So let us create “api.py” to do that.

Let us understand our “api.py” file.

Lines 1–3 import the required Python packages.

Lines 5–8 read the input image, send it to the object detection model and get the prediction results back.

Lines 10–18 annotate the input image with the bounding boxes and detected classes, then save the annotated image.
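In case the gist does not render, here is a minimal sketch of what “api.py” could look like. The endpoint URL, the box format (normalized [ymin, xmin, ymax, xmax], as used by the TF Object Detection API) and the helper names are assumptions; the real annotation step would use a drawing library such as OpenCV or PIL:

```python
import json
import urllib.request

# Assumed endpoint: TF Serving's REST predict route for the model
# named "detect_model" in model.config, on the default port 8501.
TF_SERVING_URL = "http://localhost:8501/v1/models/detect_model:predict"

def predict(instances):
    """POST image data to TensorFlow Serving and return the predictions list."""
    payload = json.dumps({"instances": instances}).encode("utf-8")
    req = urllib.request.Request(
        TF_SERVING_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["predictions"]

def to_pixel_box(box, width, height):
    """Convert a normalized [ymin, xmin, ymax, xmax] box
    to pixel (left, top, right, bottom) coordinates."""
    ymin, xmin, ymax, xmax = box
    return (int(xmin * width), int(ymin * height),
            int(xmax * width), int(ymax * height))

if __name__ == "__main__":
    # Read the input image, request predictions, then draw each box
    # (e.g. with cv2.rectangle) and save the annotated image.
    preds = predict([[[0, 0, 0]]])  # placeholder instance; real code sends image pixels
    print(preds)
```

Running it requires the tensorflow-serving container to be up; `to_pixel_box` is the piece that maps the model's normalized boxes onto the image before drawing.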

For example if the Input Image is as below

Input Image

We will get below image as Output

Output Image

This model is not perfect and needs some more training, but we built a full-fledged web application that classifies images and also detects objects in them.

It was quite a long journey with so much learning, but we were persistent and made it to the end. We have reached our final destination. I hope this article was useful to you and taught you some new things. Keep your curiosity alive and keep moving ahead.

I will be back with more interesting articles like this. Until then, enjoy your day :)

Extra Reading:

If you want to learn how to train an object detection model, you can refer to this link.

Ashish Kumar (TheCyPhy): I write and share interesting articles. Follow me if you are an avid reader. Connect with me at https://topmate.io/ashish_kumar17