How to Find the Best Hyperparameter for Any Model

Rina Mondal
3 min read · Dec 20, 2023


Hyperparameter tuning is a very common term in the data science realm and a critical step in developing machine learning models. Hyperparameters are choices about the algorithm that we set rather than learn: they are not learned from the training data, but are decided ahead of time.

We have to try different hyperparameter values for the problem at hand and figure out which ones work best, because there is no way to learn them directly from the data.

Visit my other article to learn about the types of hyperparameters.

Ways to choose hyperparameters:

  1. Choose the hyperparameters that work best on the whole dataset: This is not a good idea. It may produce perfect results on the data the model was trained on but poor results on unseen test data.
  2. Split the data into train and test sets: Choose the hyperparameters that work best on the test set. This is better, but tuning repeatedly against the test set means its score no longer tells us how the model will do on truly new data.
  3. Split the data into training, validation, and test sets: Train the algorithm with many choices of hyperparameters on the training set, evaluate each on the validation set, and choose the hyperparameters that perform best on the validation set. This is the preferable way of choosing hyperparameters (a minimal sketch is given after this list).
Choosing Hyperparameters

4. Cross-validation: In this method, we still hold out a test set to use at the very end. For the rest of the data, rather than splitting it into a single training and validation partition, we split it into several folds; one fold is used for evaluation and the others are used for training. We then cycle through the folds, changing which fold acts as the validation set each time.

Cross validation

In this example we use five-fold cross-validation: train the algorithm with one set of hyperparameters on the first four folds and evaluate its performance on fold five, then retrain on folds one, two, three, and five and evaluate on fold four, and so on through all the folds, averaging the results. This is quite effective for small datasets, but for large datasets it is used less often because the extra training runs make it more expensive (a second sketch, using cross-validation, follows below).
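Here is a minimal sketch of option 3 in code. It assumes scikit-learn, the iris dataset, and a k-nearest-neighbours classifier with n_neighbors as the hyperparameter being tuned; these are only placeholders, and any model, dataset, and hyperparameter grid could be substituted.

```python
# Minimal sketch: pick a hyperparameter using a train/validation/test split.
# Assumes scikit-learn; the dataset, model, and candidate values are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# First carve out a held-out test set, then split the rest into train/validation.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)

best_k, best_score = None, -1.0
for k in [1, 3, 5, 7, 9]:                      # candidate hyperparameter values
    model = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    score = model.score(X_val, y_val)          # evaluate on the validation set only
    if score > best_score:
        best_k, best_score = k, score

# Touch the test set exactly once, with the chosen hyperparameter.
final_model = KNeighborsClassifier(n_neighbors=best_k).fit(X_rest, y_rest)
print("best k:", best_k, "test accuracy:", final_model.score(X_test, y_test))
```

And a similar sketch of five-fold cross-validation for choosing the same hyperparameter, again assuming scikit-learn and k-NN purely as an illustration; cross_val_score handles rotating which fold is used for validation.

```python
# Minimal sketch: pick a hyperparameter with 5-fold cross-validation.
# Assumes scikit-learn; the dataset, model, and candidate values are illustrative.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Hold out a test set to use at the very end.
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

best_k, best_score = None, -1.0
for k in [1, 3, 5, 7, 9]:
    # cv=5 splits the remaining data into five folds and cycles the validation fold.
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X_rest, y_rest, cv=5)
    if scores.mean() > best_score:
        best_k, best_score = k, scores.mean()

final_model = KNeighborsClassifier(n_neighbors=best_k).fit(X_rest, y_rest)
print("best k:", best_k, "test accuracy:", final_model.score(X_test, y_test))
```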

Hyperparameter tuning helps improve model performance and, to an extent, avoid overfitting.

Read about the difference between training, validation, and test datasets.

Explore Data Science Roadmap.

Visit my YouTube Channel where I explain Data Science topics for free.

If you found this guide helpful, why not show some love? Give it a clap 👏, and if you have questions or topics you’d like to explore further, drop a comment 💬 below 👇


Rina Mondal

I have 8 years of experience and I have always enjoyed writing articles. If you appreciate my hard work, please follow me; only then can I continue my passion.