Grid search and random search are outdated. This approach outperforms both.

Ali S
7 min read · Feb 8, 2023

If you’re a data scientist, there is a good chance you have used “Grid Search” to fine-tune the hyperparameters of your model. This is a standard approach available in scikit-learn: it takes a set of candidate values for every hyperparameter as a search space, tries every possible combination of those values, and picks the combination that yields the best cross-validation performance. Or you may have used Random Search, which samples combinations from the search space at random a predefined number of times.
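To make the two approaches concrete, here is a minimal sketch of both on a toy dataset, using scikit-learn's `GridSearchCV` and `RandomizedSearchCV` (the model, dataset, and parameter values below are illustrative choices, not from the article):

```python
# Minimal sketch: grid search vs. random search in scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=0)

# Candidate values for each hyperparameter (the "search space")
param_grid = {"n_estimators": [50, 100], "max_depth": [3, 5, 7]}

# Grid search: evaluates every combination (2 x 3 = 6 candidates)
grid = GridSearchCV(RandomForestClassifier(random_state=0),
                    param_grid, cv=3).fit(X, y)

# Random search: samples a fixed number of candidates from the space
rand = RandomizedSearchCV(RandomForestClassifier(random_state=0),
                          param_grid, n_iter=4, cv=3,
                          random_state=0).fit(X, y)

print(grid.best_params_)
print(rand.best_params_)
```

Note that grid search fit 6 candidate settings here while random search fit only the 4 it sampled; that difference is exactly what the next section is about.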

Computation Power and Time

Applicable to Grid Search

The main problem with grid search is that it is expensive in both time and compute. Because it evaluates every possible combination of hyperparameter values, the number of model fits grows multiplicatively with the number of hyperparameters and the number of candidate values for each one, and is then multiplied again by the number of cross-validation folds. With many hyperparameters and wide value ranges, the search quickly becomes prohibitively slow for some models and datasets.

We’ve all been there!
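The multiplicative blow-up is easy to see with a hypothetical (illustrative, not from the article) XGBoost-style search space:

```python
# Hypothetical search space: the number of fits grows multiplicatively.
from math import prod

search_space = {
    "n_estimators": [100, 200, 500, 1000],   # 4 values
    "max_depth": [3, 5, 7, 9, 11],           # 5 values
    "learning_rate": [0.01, 0.05, 0.1],      # 3 values
    "subsample": [0.6, 0.8, 1.0],            # 3 values
}
cv_folds = 5

candidates = prod(len(v) for v in search_space.values())  # 4*5*3*3 = 180
total_fits = candidates * cv_folds                        # 180 * 5 = 900
print(candidates, total_fits)
```

Just four hyperparameters with a handful of values each already means 900 model fits under 5-fold cross-validation; adding one more hyperparameter with 5 candidate values multiplies that to 4,500.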

Limited and Biased Search

Applicable to both Grid and Random Search

But wait! What if the best hyperparameters for the model are not even in your search space? What if the best max_depth for that XGBoost model is 9, but you searched…
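This limitation can be sketched with a small illustrative example (decision tree and values chosen for brevity, not from the article): grid search can only ever return a value that is in the search space, so an optimum outside it is simply never evaluated.

```python
# Illustrative: if the best max_depth were 9, a grid of [3, 5, 7]
# could never find it -- the search only sees what it is given.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, random_state=0)

grid = {"max_depth": [3, 5, 7]}  # 9 is not in the search space
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      grid, cv=3).fit(X, y)

print(search.best_params_)  # necessarily one of 3, 5, or 7
```

Random search has the same blind spot: it samples from the space you define, so a poorly chosen space biases both methods toward suboptimal results.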


Ali S

2x Top Writer in Artificial Intelligence | Helping Data Scientists Up Their Game!