Building microservices on Azure to support use cases in Machine Learning

How we built a microservice architecture that supports use cases in Machine Learning

--

Identification of the right use case

A production-level microservice implementation starts with a well-defined use case. It has to address a specific business problem, and applying machine learning to it should measurably improve an identifiable metric such as productivity.

Architecture overview

Using Azure for building an ecosystem of microservices

How to manage REST API endpoints

We used Azure API Management to manage policies, authentication, and backend URLs, abstracting the key credentials and associated overhead away from the consumers of the microservice platform.
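To illustrate what this abstraction looks like from the consumer's side, here is a minimal sketch of calling a scoring endpoint fronted by API Management. The gateway URL, the /score route, and the subscription key are hypothetical placeholders; the Ocp-Apim-Subscription-Key header is the standard way API Management identifies a subscriber, while backend routing and backend credentials are handled by APIM policies rather than the caller.

```python
import json
import urllib.request

# Hypothetical values for illustration only -- substitute your own
# APIM gateway URL and subscription key.
APIM_BASE_URL = "https://contoso-ml.azure-api.net"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def build_scoring_request(payload: dict) -> urllib.request.Request:
    """Build a POST request to a (hypothetical) /score endpoint behind APIM.

    The consumer only needs the gateway URL and a subscription key;
    APIM policies take care of routing to the backend and injecting
    any backend credentials.
    """
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url=f"{APIM_BASE_URL}/score",
        data=body,
        headers={
            # Standard APIM subscription header
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_scoring_request({"text": "hello"})
# Sending it would be: urllib.request.urlopen(req)
```

The key point is that rotating the backend or its credentials requires no change on the consumer's side, only an APIM policy update.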

Managing Machine Learning workloads across environments

In our context, the environments spanned Dev, UAT, and Production. Dev was purely for developers to experiment with different use cases; UAT was the staging environment used for bug testing as well as security penetration testing; Production was the final endpoint exposed to end users.
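One way to keep these environments cleanly separated is a small per-environment configuration map, sketched below. The gateway URLs are hypothetical placeholders; the idea is simply that each environment resolves to its own API Management gateway and purpose.

```python
# Hypothetical per-environment configuration, for illustration only.
ENVIRONMENTS = {
    "dev": {
        "gateway": "https://contoso-ml-dev.azure-api.net",
        "purpose": "developer experimentation",
    },
    "uat": {
        "gateway": "https://contoso-ml-uat.azure-api.net",
        "purpose": "staging: bug testing and penetration testing",
    },
    "prod": {
        "gateway": "https://contoso-ml.azure-api.net",
        "purpose": "final endpoint exposed to end users",
    },
}

def gateway_for(env: str) -> str:
    """Resolve the APIM gateway URL for a given environment name."""
    try:
        return ENVIRONMENTS[env]["gateway"]
    except KeyError:
        raise ValueError(f"Unknown environment: {env!r}")
```

Keeping this lookup in one place means a deployment script never hard-codes a URL for the wrong environment.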

Keeping them Agile

We adopted an Agile approach in our delivery framework, which ensured that regular deliverables were given to the machine learning developers. At the end of each sprint cycle, the Dev endpoint was supposed to be ready, and no further changes were allowed to it. From there, the artefacts committed for the sprint in the Dev environment would be staged onto the UAT environment and subsequently released live on Production.
