What is Aim?
Aim is an open-source tool for AI experiment comparison. As compute resources grow and models become more complex, more experiments are run than ever before. Aim lets you deeply inspect thousands of hyperparameter-sensitive training runs.
What is XGBoost?
XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. XGBoost provides parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can scale to problems with billions of examples.
How to use Aim with XGBoost?
Check out end-to-end Aim integration examples with multiple frameworks here. In this tutorial, we are going to show how to integrate Aim and use AimCallback in your XGBoost code.
AimCallback is imported from aim.xgboost and passed to xgb.train as one of the callbacks. The Aim session will be opened and closed by the AimCallback, and the metrics and hyperparameters tracked by XGBoost, as well as the system measurements, will be passed to Aim.
What does it look like?
After running the experiment, run the aim up command in the aim_logs directory to start the Aim UI. When first opened, it shows the dashboard page.
To explore the run, we should:
- Choose the run of interest.
- Select the metrics to explore.
- Divide the view into charts by metric.
The above steps are shown in the gif below.
This is what the final result looks like.
It is as easy as that to analyze the runs and the system resource usage.