Why is experiment tracking essential in ML? (Part 3/3)

Prevision.io · Dec 9, 2021

by Mathurin Aché

This blog post series is composed of three parts.

In this post, we are going to answer the following question:

  • How can I benefit from experiment tracking and the other advantages included in the Prevision.io platform while continuing to build my experiments outside the platform and/or with third-party solutions?

If you use another environment to train your models and wish to benefit from the experiment tracking features offered by Prevision.io, follow these steps:

1. You load and prepare data in your environment, e.g. in a Kaggle notebook or on Google Colab.

2. You train one or more models in your environment (Prevision.io notebooks, a Kaggle notebook, or Google Colab) and export them in ONNX format.

For each model, this step includes converting it from the scikit-learn format to ONNX, the format the Prevision.io platform expects. A minimal sketch of steps 1 and 2 is shown below.
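To make steps 1 and 2 concrete, here is a minimal sketch that loads a dataset, trains a scikit-learn model, and exports it to ONNX with the skl2onnx package. The file name, target column, and hyperparameters are placeholders to adapt to your own project.

```python
# Minimal sketch of steps 1 and 2: load data, train a scikit-learn model,
# then export it to ONNX. File name and target column are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Step 1: load and prepare the data (e.g. in a Kaggle or Colab notebook).
# We assume all features are numeric here to keep the example short.
df = pd.read_csv("train.csv")                      # hypothetical file
X = df.drop(columns=["target"]).astype("float32")  # hypothetical target column
y = df["target"]
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Step 2: train a model and check it quickly on the validation split.
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("validation accuracy:", model.score(X_valid, y_valid))

# Declare the input signature expected at inference time, then convert to ONNX.
initial_types = [("float_input", FloatTensorType([None, X_train.shape[1]]))]
onnx_model = convert_sklearn(model, initial_types=initial_types)
with open("my_external_model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
```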

3. You upload the data, the models, and the configuration files using the user interface.

Configuring your external model

Process to import your external model

Configuring your external experiment

For each external model, you need to provide a name, a YAML file with the feature configuration, and an ONNX file containing the model (a hypothetical sketch of the YAML file is shown below).

You can import as many models as you want.
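As an illustration only, such a YAML feature-configuration file might look like the sketch below. The key names and structure here are assumptions, not the actual Prevision.io schema; refer to the platform documentation for the exact format expected at import time.

```yaml
# Purely illustrative: the key names below are hypothetical,
# not the official Prevision.io schema (see the documentation).
name: external-churn-model
target: churn
features:
  - name: age
    type: numeric
  - name: country
    type: categorical
  - name: monthly_charges
    type: numeric
```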

To go further: external model import relies on the standardized ONNX format, and most standard ML libraries provide a module for exporting to it.

After a few minutes, you obtain a dashboard with all your models.

Now you can evaluate your experiment.

External model information

External model feature importance

External model confusion matrix

External model metrics

Good news: once imported, your external models still benefit from the same insightful analytics available for internally trained models.

4. You upload the data, the models, and the related configuration files using the SDK (Python or R).
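The same import can be scripted. The sketch below assumes the previsionio Python SDK; the method names used for client initialization, dataset upload, and external experiment creation are assumptions that may differ between SDK versions, so check the SDK reference for the exact signatures.

```python
# Hedged sketch of importing an external model through the Python SDK.
# Method names below are assumptions, not verified signatures; consult the
# previsionio SDK reference for the exact API in your version.
import previsionio as pio

# Connect to the platform (URL and token are placeholders).
pio.client.init_client(
    prevision_url="https://<your-instance>.prevision.io",
    token="<your-master-token>",
)

project = pio.Project.from_name("my-project")                    # assumed helper
dataset = project.create_dataset("holdout", file_name="holdout.csv")

# Register the externally trained model: a name, the ONNX file,
# and the YAML feature-configuration file.
experiment = project.create_external_experiment(                 # assumed method
    name="external-churn-model",
    holdout_dataset=dataset,
    target_column="churn",
    external_models=[("rf_onnx", "my_external_model.onnx", "features.yaml")],
)
```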

5. Once your imported model is deployed, you can use it periodically (every hour, every day, every month, …).

To deploy the model, refer to the paragraph explaining how to deploy an experiment in article 2 of this series, or to the documentation here: https://previsionio.readthedocs.io/fr/latest/studio/deployments/index.html
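As an illustration of step 5, the sketch below scores data against a deployed model on a fixed schedule. The endpoint URL, authentication scheme, and payload format are placeholders; the actual values for your deployment are shown on its page in Prevision.io and in the documentation.

```python
# Hedged sketch: periodically call a deployed model's REST endpoint.
# URL, credentials, and payload schema are placeholders to replace with
# the values shown on your deployment's page.
import time
import requests

DEPLOYMENT_URL = "https://<your-deployment>.prevision.io/predict"  # placeholder
HEADERS = {"Authorization": "Bearer <your-api-key>"}                # placeholder

def score_latest_batch():
    payload = {"age": 42, "country": "FR", "monthly_charges": 59.9}  # example features
    response = requests.post(DEPLOYMENT_URL, json=payload, headers=HEADERS, timeout=30)
    response.raise_for_status()
    print("prediction:", response.json())

if __name__ == "__main__":
    # Run once an hour; in production you would rather use cron or a scheduler.
    while True:
        score_latest_batch()
        time.sleep(3600)
```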

Conclusion

In this guide, we went through the whole experiment tracking process using Prevision.io.

As we have seen, it is essential for a data scientist to document every iteration across all stages of a data science project: from data ingestion to feature engineering, model selection, and hyperparameter tuning, with access to in-depth visual analysis, all the way until the model is deployed and in production.

Originally published at https://content.prevision.io.
