Hyperparameter Tuning For XGBoost
May 15, 2022 · 16 min read
Grid Search Vs Random Search Vs Bayesian Optimization (Hyperopt)
Grid search, random search, and Bayesian optimization are common techniques for tuning machine learning model hyperparameters. This tutorial covers how to tune XGBoost hyperparameters using Python. You will learn:
- What are the differences between grid search, random search, and Bayesian optimization?
- How to use grid search cross-validation to tune the hyperparameters for the XGBoost model?
- How to use random search cross-validation to tune the hyperparameters for the XGBoost model?
- How to use Bayesian optimization Hyperopt to tune the hyperparameters for the XGBoost model?
- How to compare the results from grid search, random search, and Bayesian optimization Hyperopt?
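Before diving into the XGBoost examples, the core difference between the first two techniques can be sketched with a toy objective function in plain Python (the function and parameter ranges below are illustrative stand-ins, not values from this tutorial): grid search evaluates every combination of a fixed grid, while random search samples the same number of points from continuous ranges.

```python
import itertools
import random

random.seed(0)

# Toy objective standing in for a cross-validated model score;
# the tutorial itself scores an XGBoost model with cross-validation.
def objective(learning_rate, max_depth):
    return -((learning_rate - 0.1) ** 2) - 0.01 * ((max_depth - 5) ** 2)

# Grid search: exhaustively evaluate every combination on a fixed grid.
lr_grid = [0.01, 0.05, 0.1, 0.3]
depth_grid = [3, 5, 7]
grid_best = max(itertools.product(lr_grid, depth_grid),
                key=lambda p: objective(*p))

# Random search: draw the same budget of points from continuous ranges,
# so values between the grid points can also be tried.
budget = len(lr_grid) * len(depth_grid)
samples = [(random.uniform(0.01, 0.3), random.randint(3, 7))
           for _ in range(budget)]
random_best = max(samples, key=lambda p: objective(*p))

print("grid search best:  ", grid_best)
print("random search best:", random_best)
```

Bayesian optimization (as implemented in Hyperopt) goes one step further: instead of sampling blindly, it uses the scores of past trials to decide which hyperparameters to try next, which is covered later in the post.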
Resources for this post:
- Video tutorial for this post on YouTube
- Python code is at the end of the post. Click here for the notebook.
- More video tutorials on hyperparameter tuning
- More blog posts on hyperparameter tuning
Let’s get started!