Deploying Classification Model using Flask

manohar krishna · Published in Analytics Vidhya · Nov 16, 2020

In my previous article, I’ve described the process of building an Image Classification model using Fast.ai. In this article, let’s see how to deploy it on a web application made out of Flask.


A quick recap of the model we built: it classifies vehicles into emergency and non-emergency. Now that the model is built, what next? How do you want to use your trained model? A simple answer is to build a web application where you can pass in an image and ask for a prediction. Two popular Python-based options for web development are Flask and Django. I chose Flask since it is lightweight compared to Django, and because my deployment only needs simple request/response handling in text form. Django ships with a lot of functionality this web app doesn't need, and everything the app requires can be done just as easily in Flask.

So this article starts where the last one ended, i.e., after exporting the model to a “.pkl” file. We will build a web app that runs the trained model in the backend to get predictions. So let’s get started…

The two key files in the web application are:

  1. predict_app.py — which contains the back-end Flask server code and the logic to predict results using the trained model
  2. predict.html — which contains the front-end HTML code to view results on the web page
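
Together with the “.pkl” file exported in the previous article, they end up in a layout roughly like this (the top-level folder name is arbitrary; the “static” folder is where the HTML page lives so Flask can serve it, as described in the run instructions further down):

project-folder/
├── predict_app.py
├── av_cv.pkl          <- exported fastai learner from the previous article
└── static/
    └── predict.html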

** If you don’t already have Flask installed in your environment, you can install it with “pip install flask”
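
Before wiring in the model, it can help to confirm the installation works. The few lines below are just a minimal sanity check; the file name hello.py and the route are placeholders, not part of the actual app:

# hello.py -- a minimal Flask sanity check (hypothetical file, not part of the app)
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Flask is installed and running!"

# run with: export FLASK_APP=hello.py && flask run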

The following code is from predict_app.py

# -*- coding: utf-8 -*-
"""
Created on Sun Nov 1 05:03:08 2020
@author: manohar
"""
import base64
import io
import json
from pathlib import Path

import numpy as np
from PIL import Image
from skimage import transform
from flask import Flask, jsonify, request

import fastai
from fastai import *
from fastai.vision import *
from fastai.callbacks import *

app = Flask(__name__)


def get_model():
    # load the exported fastai learner from the current working directory
    global learn
    learn = load_learner('.', file='av_cv.pkl')
    print(" * Model loaded!")


print(" * Loading fastai model...")
get_model()


@app.route("/predict", methods=["POST"])
def predict():
    # the request body is JSON of the form {"image": "<base64-encoded image>"}
    message = request.get_json(force=True)
    encoded = message['image']
    decoded = base64.b64decode(encoded)
    # fastai v1 helper that reads the bytes into an Image ready for learn.predict
    img = open_image(io.BytesIO(decoded))

    # For a Keras model, the preprocessing would look like this instead
    # (these lines are picked up again in the Keras notes below):
    # img = Image.open(io.BytesIO(decoded))
    # img = np.array(img).astype('float')/255
    # img = transform.resize(img, (512, 512, 3))
    # img = np.expand_dims(img, axis=0)
    # tf.keras.backend.set_session(sess)
    # prediction = model.predict(img)

    # learn.predict returns (class, class index, probability tensor); keep the probabilities
    prediction = np.array(learn.predict(img)[-1])
    print(prediction)

    response = {
        'prediction': {
            'Emergency': str(prediction[0]),
            'Non_Emergency': str(prediction[1])
        }
    }
    return jsonify(response)
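
A quick note on what is happening there: in fastai v1, learn.predict returns a tuple of (predicted category, class index, probability tensor), which is why the code takes the last element ([-1]) and then indexes into it. The probabilities follow the class order the learner was trained with, so it is worth confirming that order once with a quick interactive check like this (not part of the app itself):

# quick interactive check of what learn.predict returns (fastai v1)
pred_class, pred_idx, probs = learn.predict(img)
print(pred_class)          # the predicted Category
print(probs)               # tensor of per-class probabilities
print(learn.data.classes)  # class order, i.e. which index maps to which label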

The above code is written for a model built with Fast.ai, though much the same structure works for a model built with Keras: import the required libraries, use the commented-out code for inference, and set the correct image size. You may also need to start a TensorFlow session:

sess = tf.Session()
tf.keras.backend.set_session(sess)
graph = tf.get_default_graph()

and then place the entire code of the predict function within the graph as shown below

def predict():
    global sess
    global graph
    with graph.as_default():
        message = request.get_json(force=True)
        # ... the rest of the predict logic stays inside this block
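
Putting those pieces together, here is a rough sketch of what the Keras variant of predict_app.py might look like. It is only an illustration under a few assumptions: the model is saved as “model.h5”, it expects a 512×512 input, and it has a single sigmoid output giving the probability of a non-emergency vehicle (which is what the commented-out 1 - prediction[0][0] logic suggests). Adjust those details to match your own Keras model.

# Hypothetical Keras (TF1-style) version of predict_app.py, mirroring the commented-out lines above.
import base64
import io

import numpy as np
import tensorflow as tf
from PIL import Image
from flask import Flask, jsonify, request
from skimage import transform

app = Flask(__name__)

# TF1-style session and graph setup, as described above
sess = tf.Session()
tf.keras.backend.set_session(sess)
model = tf.keras.models.load_model('model.h5')   # assumed file name for the trained Keras model
graph = tf.get_default_graph()


@app.route("/predict", methods=["POST"])
def predict():
    global sess
    global graph
    with graph.as_default():
        message = request.get_json(force=True)
        decoded = base64.b64decode(message['image'])
        img = Image.open(io.BytesIO(decoded)).convert('RGB')   # ensure 3 channels
        img = np.array(img).astype('float') / 255              # scale pixels to [0, 1]
        img = transform.resize(img, (512, 512, 3))             # assumed input size
        img = np.expand_dims(img, axis=0)                      # add the batch dimension
        prediction = model.predict(img)                        # e.g. shape (1, 1) for a sigmoid output

        # assuming a single sigmoid output giving P(non-emergency),
        # as the original commented-out code implies
        response = {
            'prediction': {
                'Emergency': str(1 - prediction[0][0]),
                'Non_Emergency': str(prediction[0][0])
            }
        }
        return jsonify(response)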

The following code is from predict.html

<!DOCTYPE html>
<html>
<head>
    <title>deeplizard predict image app</title>
    <style>
        * {
            font-size: 30px;
        }
    </style>
</head>
<body>
    <input id="image-selector" type="file">
    <button id="predict-button">Predict</button>
    <p style="font-weight:bold">Predictions</p>
    <p>Emergency: <span id="emergency-prediction"></span></p>
    <p>Non Emergency: <span id="non_emergency-prediction"></span></p>
    <img id="selected-image" src=""/>

    <script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>

    <script>
        let base64Image;

        // preview the chosen image and keep only the raw base64 payload
        // (the data-URL prefix is stripped before sending to the server)
        $("#image-selector").change(function() {
            let reader = new FileReader();
            reader.onload = function(e) {
                let dataURL = reader.result;
                $('#selected-image').attr("src", dataURL);
                base64Image = dataURL.replace(/^data:image\/[a-z]+;base64,/, "");
                console.log(base64Image);
            }
            reader.readAsDataURL($("#image-selector")[0].files[0]);
            $("#emergency-prediction").text("");
            $("#non_emergency-prediction").text("");
        });

        // POST the base64 image to the Flask /predict endpoint and show the scores
        $("#predict-button").click(function(event){
            let message = {
                image: base64Image
            }
            console.log(message);
            $.post("http://localhost:5000/predict", JSON.stringify(message), function(response){
                $("#emergency-prediction").text(response.prediction.Emergency);
                $("#non_emergency-prediction").text(response.prediction.Non_Emergency);
                console.log(response);
            });
        });
    </script>
</body>
</html>

Let’s get the application running

Once the two files are ready, place them in a folder as in the layout shown earlier: I usually keep the HTML files in a folder named “static” inside the main folder. Through your terminal, cd into the main folder and enter the following command to tell Flask which application to run

export FLASK_APP=predict_app.py

** On Windows, use set instead of export

Then enter the following command to run the application (host 0.0.0.0 makes it reachable from other machines on the network)

flask run --host=0.0.0.0

Enter the following URL in your web browser to access the application

http://localhost:5000/static/predict.html

** Use localhost if the app is running on your local machine, or the server’s IP address if it is hosted on a server
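
If you want to sanity-check the /predict endpoint without the web page, a small Python client like the one below can post a base64-encoded image directly. The file name and image path are placeholders; point it at any test image you have:

# test_client.py -- quick check of the /predict endpoint (hypothetical helper, not part of the app)
import base64
import requests

# placeholder path; use any local test image
with open("test_vehicle.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

response = requests.post("http://localhost:5000/predict", json={"image": encoded})
# prints something like {'prediction': {'Emergency': '...', 'Non_Emergency': '...'}}
print(response.json())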

The final application looks like this

All you need to do is browse to the image you want to classify and click “Predict”.

BOOM!! The probability scores for the vehicle being an emergency or non-emergency vehicle are displayed.

And that is how you perform model deployment using Flask!
