An end-to-end example of the Aim logger used with the XGBoost library

Khazhak Galstyan
May 17 · 2 min read
XGBoost and Aim

What is Aim?

Aim is an open-source tool for AI experiment comparison. With more resources and more complex models, more experiments are run than ever. Aim is used to deeply inspect thousands of hyperparameter-sensitive training runs.

What is XGBoost?

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, MPI, Dask) and can solve problems with billions of examples.

How to use Aim with XGBoost?

Check out end-to-end Aim integration examples with multiple frameworks here. In this tutorial, we are going to show how to integrate Aim and use AimCallback in your XGBoost code.

AimCallback is imported from aim.xgboost and passed to xgb.train as one of the callbacks. The Aim session is opened and closed by the AimCallback, and the metrics and hyperparameters stored by XGBoost, as well as the system measurements, are passed to Aim.

What does it look like?

After running the experiment, run the aim up command in the aim_logs directory to start the Aim UI. When first opened, it shows the dashboard page.

Aim UI dashboard page

To explore the run, we should:

  • Choose the xgboost_test experiment.
  • Select the metrics to explore.
  • Divide into charts by metric.

The above steps are shown in the gif below.

This is what the final result looks like.

As easy as that, we can analyze the runs and the system usage.

Learn More

If you find Aim useful, support us by starring the project on GitHub. Join the Aim community and share your use cases and how we can improve Aim to suit them.


A super-easy way to record, search and compare 1000s of AI training runs.