Building a React.js front-end to interact with a Machine Learning model

Prakharpandey
Published in DeepKlarity
4 min read · Oct 8, 2020

This blog is part of the Clean or Messy Classifier series. Check out the entire series for a detailed walkthrough of deploying a machine learning classifier as a web app using various frameworks.

Why a web app for machine learning models?

If you have been learning Machine Learning for a while and have built some really cool models, from detecting objects to recognizing voices, you probably want to showcase your work and effort. But showing someone your Jupyter Notebook isn’t really what you had in mind. Creating a web app that computes results from your model and displays them to the end user is a great way to give your ML project a boost.

In this article, we discuss the following approaches for displaying a model’s predictions in the frontend:
1) The model is loaded on a backend server.
2) The model is loaded as a TFJS model in the frontend itself, removing the need for a backend server.

About the Model

For the backend-server approach, we are using a fastai classifier model for prediction. For predicting results in the frontend only, a Keras model converted to a TFJS model is used. The model itself is a simple image classifier that predicts the cleanliness of the surroundings in an image.

About the Frontend

The frontend is divided into three components, discussed below:
1) Uploading an image to the server.
2) Taking an image in real time using the device’s camera and uploading it to the server.
3) Taking an image in real time and predicting with a TFJS model loaded in the frontend itself.
For creating the frontend, we are using React, an open-source JavaScript library for building user interfaces. If you are new to React, you can refer to this link for setting up and getting started.

By uploading an image to the server

To display a machine learning model’s results on a frontend webpage, our first approach was to upload an image on the client side and make an Axios POST request to fetch the results from the backend server where the model is loaded.
Making an Axios POST request:

const { apiUrl } = window['config']
axios.post(apiUrl, fd)
  .then(res => {
    ...
  })

'apiUrl' is a key imported from config.js whose value is the backend URL to pass to Axios. In this case we used a deployed backend API server, but a localhost server can also be used temporarily; just make sure the localhost server is running.
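The `fd` in the snippet above is the FormData object carrying the image. A self-contained sketch of the full upload flow (written with the browser’s built-in fetch for illustration; the Axios call in the article is equivalent, and the 'file' field name and the response shape are assumptions that must match your backend):

```javascript
// Hypothetical helper: wrap the selected image in a multipart form body.
// The field name 'file' is an assumption; use whatever key the backend reads.
function buildFormData(imageFile) {
  const fd = new FormData();
  fd.append('file', imageFile);
  return fd;
}

// Sketch of the upload itself, mirroring the Axios POST above:
async function uploadImage(imageFile, apiUrl) {
  const res = await fetch(apiUrl, { method: 'POST', body: buildFormData(imageFile) });
  return res.json(); // assumed shape: e.g. { clean: ..., confidence: ... }
}
```

In a React component, `imageFile` would typically come from a file input’s `event.target.files[0]`.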

By taking an image in real-time using the device's camera and uploading it to the server

For this approach, we had our image classifier model loaded on a backend server. To work with the camera, we made a ‘camera’ component and used react-html5-camera-photo to add a camera feature to the webpage.
Note: if you are following the above link, be aware that it saves the image as a base64 DataURI.
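Since the library hands back a base64 DataURI rather than a file, one option before uploading is to convert it into a Blob so it can travel as a regular multipart upload. A minimal sketch of that conversion (the helper name is ours, not the library’s):

```javascript
// Convert a base64 DataURI (e.g. "data:image/png;base64,....") into a Blob
// that can be appended to FormData like an ordinary file.
function dataUriToBlob(dataUri) {
  const [header, base64] = dataUri.split(',');
  const mime = header.match(/data:(.*);base64/)[1]; // e.g. "image/png"
  const bytes = atob(base64);                       // decode the base64 payload
  const buf = new Uint8Array(bytes.length);
  for (let i = 0; i < bytes.length; i++) buf[i] = bytes.charCodeAt(i);
  return new Blob([buf], { type: mime });
}
```

Alternatively, as described later, the DataURI string can be sent as-is and decoded on the backend.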

return (
  <div>
    {
      (dataUri)
        ? <ImagePreview dataUri={dataUri}
            isFullscreen={isFullscreen}
          />
        : <Camera
            onTakePhoto={(dataUri) => { handleTakePhoto(dataUri); }}
          />
    }
    <div><h3>The room is clean: {clean}</h3></div>
    <div><h3>Confidence: {confidence}</h3></div>
  </div>
)
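The JSX above relies on a `handleTakePhoto` callback and a `dataUri` piece of state. A minimal sketch of how they might fit together; `setDataUri` and `postPhoto` are hypothetical stand-ins for the component’s React state setter and the Axios POST shown earlier:

```javascript
// Hypothetical handler wired to <Camera onTakePhoto={...}> above.
// setDataUri: the component's state setter (e.g. from useState).
// postPhoto: the function that POSTs the capture to the backend.
function handleTakePhoto(dataUri, setDataUri, postPhoto) {
  setDataUri(dataUri);  // re-render: <ImagePreview> replaces <Camera>
  postPhoto(dataUri);   // send the captured photo for prediction
}
```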

The Axios POST request is similar to the one in the previous approach.
On the backend server, we handled both a regular image file and a DataURI while building this project. You can refer to this link to learn how to handle both cases in the backend.

Taking an image in real-time and predicting with a TFJS model loaded in the frontend itself

After setting up the camera and predicting via a backend server, we noticed a delay in fetching the response from the backend. The idea now was to get rid of the API server and load the model into the frontend itself. This is where TensorFlow.js (TFJS) came in really handy. TFJS is a library for machine learning in JavaScript that lets ML models run directly in the browser or in Node. We converted our Keras model to TFJS with the following command in the terminal:
tensorflowjs_converter --input_format keras \
path/to/my_model.h5 \
path/to/tfjs_target_dir

Loading the model to React:-

const model = await tf.loadLayersModel('/tfjs/model.json')

Note: the function 'loadLayersModel' accepts a URL for the model as an argument, but in our case the model lived on our local machine. To load a TFJS model from your machine, make sure it is saved inside the public directory so React serves it as a static file.
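Once the model is loaded, prediction happens entirely in the browser. A hedged sketch of what that can look like; the 224×224 input size, the division by 255, and the ['clean', 'messy'] class order are all assumptions that must match how the Keras model was trained:

```javascript
// (assumes: import * as tf from '@tensorflow/tfjs')
const CLASSES = ['clean', 'messy']; // assumed label order; check your model

// Pure helper: turn the model's output scores into a label + confidence.
function interpretScores(scores) {
  let best = 0;
  for (let i = 1; i < scores.length; i++) if (scores[i] > scores[best]) best = i;
  return { label: CLASSES[best], confidence: scores[best] };
}

// Browser-side prediction; imgEl is an <img>, <canvas>, or <video> element.
async function predict(model, imgEl) {
  const scores = tf.tidy(() => {
    const x = tf.browser.fromPixels(imgEl)   // HxWx3 tensor from the element
      .resizeNearestNeighbor([224, 224])     // input size is an assumption
      .toFloat()
      .div(255.0)                            // scale pixels to [0, 1]
      .expandDims();                         // add the batch dimension
    return model.predict(x).dataSync();      // copy scores out before dispose
  });
  return interpretScores(Array.from(scores));
}
```

`tf.tidy` disposes the intermediate tensors after the scores have been copied out, which matters when predicting repeatedly on camera frames.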

This is how the end result was displayed in the frontend when using the TFJS model:

The code for this project is available here.

If you have any questions, or ideas about what else could be added or improved in the project, please share them.
