Published in featurepreneur
HyperParameter Tuning and Its Types

Machine learning is the way you predict future values based on historical data. There are many different machine learning models, and beyond their underlying algorithms they differ in the configuration settings they expose. These settings, which are chosen before training rather than learned from the data, are called hyperparameters.

Hyperparameter tuning is the process of choosing good values for these settings while we build machine learning models. Unlike model parameters (such as learned weights), hyperparameters are defined by us and can be adjusted according to the programmer's wish. Models can have many hyperparameters, so finding the best combination can be treated as a search problem. There are various methods of tuning:

  • Grid SearchCV
  • Random SearchCV
  • Bayesian Search
  • Evolutionary Search
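To make the idea concrete: in scikit-learn, every hyperparameter is an argument you pass when constructing a model, before any training happens (the values below are purely illustrative):

```python
from sklearn.ensemble import RandomForestClassifier

# Hyperparameters are set by us *before* training; the model never
# learns them from the data the way it learns its internal parameters.
model = RandomForestClassifier(
    n_estimators=200,  # number of trees in the forest (hyperparameter)
    max_depth=5,       # maximum depth of each tree (hyperparameter)
    random_state=42,
)
```

Tuning is simply the search for the combination of such arguments that yields the best validation performance.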

Grid SearchCV

Grid Search is an effective method for tuning the hyperparameters of supervised learning models and improving their generalization performance. With Grid Search, we try all possible combinations of the parameter values of interest and keep the best one. It is great for spot-checking combinations that are known to perform well generally. However, the number of fits grows multiplicatively with every parameter you add, so it is not well suited to large combinations of parameters.
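As a minimal sketch, scikit-learn's GridSearchCV can exhaustively try every combination of an SVM's C and kernel on the iris dataset (the grid values here are illustrative choices, not recommendations):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Every combination in this grid is tried: 3 values of C x 2 kernels
# = 6 candidate models, each evaluated with 5-fold cross-validation.
param_grid = {
    "C": [0.1, 1, 10],
    "kernel": ["linear", "rbf"],
}

search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)
print(round(search.best_score_, 3))
```

Note how adding a third parameter with, say, 5 values would multiply the number of fits by 5; this multiplicative growth is exactly why Grid Search struggles with large spaces.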

Random SearchCV:

Random Search defines the search space as a bounded domain of hyperparameter values and randomly samples points in that domain. It is great for discovering hyperparameter combinations that you would not have guessed intuitively, although reaching a good result can require many iterations. When the number of parameters is particularly high and their magnitudes of influence are imbalanced, Random Search is usually the better choice, since it spends its budget sampling many distinct values of every parameter instead of exhaustively pairing a few.
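A minimal sketch using scikit-learn's RandomizedSearchCV: instead of a fixed grid, we give each parameter a sampling distribution and draw a fixed budget of random points from it (the ranges below are illustrative assumptions):

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Each of the n_iter trials samples one value per parameter from
# these distributions, so every trial tests a new value of *both*
# parameters -- useful when one parameter matters far more than the other.
param_distributions = {
    "C": loguniform(1e-2, 1e2),
    "gamma": loguniform(1e-4, 1e1),
}

search = RandomizedSearchCV(
    SVC(), param_distributions, n_iter=20, cv=5, random_state=42
)
search.fit(X, y)

print(search.best_params_)
```

The n_iter budget, not the size of the space, controls the cost, which is what makes Random Search practical for high-dimensional searches.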

Bayesian Search

Bayesian optimization builds a probabilistic model of the function mapping hyperparameter values to the objective evaluated on a validation set. Given the hyperparameter settings already tried, this surrogate model predicts the objective value at settings not yet evaluated, and the next setting to try is chosen where the model expects the most improvement. It is a more advanced search strategy, best suited to large parameter spaces where each evaluation is expensive, and it provides a probabilistically principled method for global optimization.
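As a hand-rolled sketch of the idea (real projects would more likely use a library such as scikit-optimize's BayesSearchCV), we can tune an SVM's C on iris by fitting a Gaussian process to the scores observed so far and picking the next candidate with an upper-confidence-bound rule. The search range and the number of iterations are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

def objective(log_c):
    # The expensive function we want to maximize: validation accuracy
    # as a function of the hyperparameter (here, log10 of C).
    return cross_val_score(SVC(C=10 ** log_c), X, y, cv=5).mean()

# Seed the surrogate with a few random evaluations.
tried = list(rng.uniform(-2, 2, size=3))
scores = [objective(c) for c in tried]

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(7):
    # Fit the probabilistic (surrogate) model to all observations so far.
    gp.fit(np.array(tried).reshape(-1, 1), scores)
    # Acquisition: pick the candidate with the best mean + uncertainty bonus,
    # trading off exploitation (high mean) against exploration (high std).
    candidates = np.linspace(-2, 2, 200).reshape(-1, 1)
    mean, std = gp.predict(candidates, return_std=True)
    next_c = float(candidates[np.argmax(mean + 1.96 * std)])
    tried.append(next_c)
    scores.append(objective(next_c))

best = tried[int(np.argmax(scores))]
print(f"best C = {10 ** best:.3f}, CV accuracy = {max(scores):.3f}")
```

The key property is that each new trial is informed by every previous one, so far fewer model fits are needed than with grid or random search.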

Evolutionary Search:

Genetic algorithms provide a powerful but often overlooked technique for hyperparameter tuning. They borrow basic concepts from evolution by natural selection, Charles Darwin's theory, to optimize arbitrary functions. Hyperparameters are equivalent to genes in biological systems. The algorithm works in three major steps, mutation, recombination, and replacement, which are applied recursively until the parameters converge.
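The three steps above can be sketched as a tiny genetic algorithm that evolves two decision-tree hyperparameters on iris. The population size, gene ranges, and mutation rate are illustrative assumptions:

```python
import random

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
random.seed(0)

def fitness(genes):
    # Each individual's "genes" are its hyperparameter values.
    max_depth, min_samples_leaf = genes
    clf = DecisionTreeClassifier(
        max_depth=max_depth, min_samples_leaf=min_samples_leaf, random_state=0
    )
    return cross_val_score(clf, X, y, cv=5).mean()

def random_genes():
    return [random.randint(1, 10), random.randint(1, 10)]

population = [random_genes() for _ in range(8)]

for generation in range(5):
    # Selection: keep the fittest half of the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[:4]
    children = []
    while len(children) < 4:
        # Recombination: each gene of the child comes from one of two parents.
        a, b = random.sample(survivors, 2)
        child = [random.choice(pair) for pair in zip(a, b)]
        # Mutation: occasionally replace one gene with a fresh random value.
        if random.random() < 0.3:
            child[random.randrange(2)] = random.randint(1, 10)
        children.append(child)
    # Replacement: children take the places of the culled individuals.
    population = survivors + children

best = max(population, key=fitness)
print(best, round(fitness(best), 3))
```

Each generation repeats the mutation/recombination/replacement cycle described above, so good gene combinations gradually dominate the population.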

Conclusion:

These are the main types of hyperparameter tuning. Every algorithm has its own pros and cons, so choose wisely for your requirement: Grid Search for small, well-understood spaces; Random Search for large or imbalanced ones; Bayesian Search when each evaluation is expensive; and Evolutionary Search for complex, non-smooth objectives.
