ML Ops in Machine Learning — Model Containerizing for MLOps

Sreeprakash Neelakantan
Nov 9, 2021


The term MLOps is often used in the context of the operation and maintenance of machine learning models. MLOps is a set of practices that aims to deploy and maintain machine learning models in production reliably. The prime objective of MLOps is to ensure that models are always ready for production by automating model deployment, monitoring the model’s accuracy and optimizing parameters. This ensures that the models are accurate enough for their designated use case and provide the desired ROI.

In order to achieve these objectives, one must be aware of a few basics. One should know how to prepare datasets for training, be able to configure a pipeline for building a model from scratch or tuning an existing one, know how to monitor accuracy metrics and have a strategy in place to deal with outlier predictions.

I am ready!

Here is my little demonstration on how to train a model, save it as a binary object, and then containerize it for production use cases. Such a container has the algorithm’s coefficients baked in, so the computational load and time required to train the model are eliminated from production pipelines.

I am diving directly into the code and a quick walkthrough, assuming that you already have a rudimentary understanding of Docker containers and of Python libraries for machine learning.

Training Phase: train-save.py

Create a file called train-save.py that will complete the training phase and save the trained model object as a file. I already have the needed libraries installed, using Python’s pip, in a folder just above this one. I am not using Anaconda in this case, so the setup stays completely independent and portable.
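
Here is a minimal sketch of what train-save.py could look like; it assumes scikit-learn’s built-in Iris dataset, a logistic regression classifier and joblib for serialization (the dataset, algorithm and file names are illustrative choices, not prescribed ones).

# train-save.py -- minimal sketch: train a classifier on the Iris dataset
# and persist the fitted model object to the data folder.
import os

import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the Iris dataset and hold out a small test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train the classifier.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))

# Save the fitted model as a binary object under data/.
os.makedirs("data", exist_ok=True)
joblib.dump(model, "data/model.pkl")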

The above code, when executed, will save the model object in a folder called data.

Load and Test the Model: load.py

Let us now make another little script to test the saved model.
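
A possible sketch of load.py, assuming the model was serialized to data/model.pkl as above and that the four Iris feature values arrive as command-line arguments:

# load.py -- minimal sketch: load the saved model and predict the Iris
# species from four feature values passed on the command line.
import sys

import joblib

# Load the serialized model produced by train-save.py.
model = joblib.load("data/model.pkl")

# Expect four numeric arguments: sepal length, sepal width,
# petal length, petal width (in cm).
features = [float(value) for value in sys.argv[1:5]]

prediction = model.predict([features])[0]
species = ["setosa", "versicolor", "virginica"]
print("Predicted class:", species[prediction])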

Test the script on the command line by specifying the Iris flower features.
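
For example, with illustrative feature values:

python load.py 5.1 3.5 1.4 0.2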

Ready to Containerize

Dockerfile
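
A minimal Dockerfile for this setup could look like the following, assuming a slim Python base image and a requirements.txt that lists scikit-learn and joblib:

FROM python:3.9-slim

WORKDIR /app

# Install the runtime dependencies (scikit-learn, joblib) listed in
# requirements.txt (assumed to sit next to the Dockerfile).
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the scoring script and the pre-trained model object saved earlier.
COPY load.py .
COPY data/ ./data/

# The container predicts from the feature values passed as arguments.
ENTRYPOINT ["python", "load.py"]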

Build the Docker Image
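
Something along these lines, with iris-predictor as an illustrative image name:

docker build -t iris-predictor .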

Test the Image
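
The container can then be invoked with the same feature arguments as load.py (again, the image name and feature values are illustrative):

docker run --rm iris-predictor 5.1 3.5 1.4 0.2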

Such an image gets pushed to a registry for production use. Though this is a very tiny demonstration, it is an approach that can be scaled and adapted to almost any use case. Happy automation!
