4 Hyper-Parameter Tuning Techniques and Limitations

Popular hyper-parameter tuning techniques that every Data Scientist should know

Sivasai Yadav Mudugandla
The Startup


Image from SigOpt

Introduction

Wikipedia states that “Hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm.”

One of the most challenging parts of the ML workflow is finding the best hyper-parameters for a model. The performance of an ML model depends directly on its hyper-parameters: careful tuning generally produces a better model, although tuning alone does not guarantee improvement. Tuning hyper-parameters can be tedious and complicated, and it is often more of an art than a science.

Hyper-parameters

Hyper-parameters are the parameters used to control the behaviour of the algorithm while building the model. They cannot be learned through the normal training process; they must be assigned before training the model begins.

A sample list of hyper-parameters, by Dr. Mukesh Rao
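
To make the distinction concrete, here is a minimal sketch (assuming scikit-learn and a RandomForestClassifier purely as an illustrative example; neither is prescribed by the article). The hyper-parameters are fixed when the estimator is constructed, while the model's own parameters are learned during fit().

```python
# Minimal sketch: hyper-parameters (n_estimators, max_depth) are chosen by us
# before training; the learned parameters (the individual trees) come from fit().
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Illustrative synthetic dataset (an assumption, not from the article)
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

model = RandomForestClassifier(
    n_estimators=100,   # hyper-parameter: number of trees
    max_depth=5,        # hyper-parameter: maximum depth of each tree
    random_state=42,
)
model.fit(X, y)         # learned parameters are estimated here, from the data
```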

Table of Contents

  1. Traditional or Manual Tuning
  2. Grid Search
  3. Random Search
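
As a quick preview of the Grid Search and Random Search entries listed above, the sketch below uses scikit-learn's GridSearchCV and RandomizedSearchCV. The estimator, parameter grid, and synthetic dataset are illustrative assumptions, not taken from the original article.

```python
# Minimal sketch contrasting Grid Search (exhaustive) and Random Search (sampled).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

# Illustrative synthetic dataset (an assumption, not from the article)
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

param_grid = {"n_estimators": [50, 100, 200], "max_depth": [3, 5, None]}

# Grid Search: exhaustively tries every combination in param_grid.
grid = GridSearchCV(RandomForestClassifier(random_state=42), param_grid, cv=3)
grid.fit(X, y)
print("Grid Search best params:", grid.best_params_)

# Random Search: samples a fixed number of combinations (n_iter) at random.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions=param_grid,
    n_iter=5,
    cv=3,
    random_state=42,
)
rand.fit(X, y)
print("Random Search best params:", rand.best_params_)
```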
