Deploy Your Machine Learning Model on Docker — Part 1
Store your machine learning model, expose it as an API, build a simple interface for API testing, and containerise your ML model.
As data scientists, our main responsibilities are processing data and developing and improving machine learning models. The popular belief is that data processing is the most time-consuming part of a project and that model accuracy is the key to a data product's success. However, as the industry transitions “from the age of discovery to the age of implementation” (AI Superpowers: China, Silicon Valley, and the New World Order — Kai-Fu Lee), the picture has become much bigger: the focus has shifted from building models to serving them to users, and from model performance to business value. One well-known example is that Netflix never used the models from the winners of its $1 million prize, despite the significant performance boost these recommendation engines provided (Netflix Never Used Its $1 Million Algorithm Due To Engineering Costs — WIRED).