Classification of Neural Network Hyperparameters
By network structure, learning and optimization, and regularization effect
A major challenge when working with deep learning (DL) algorithms is setting and controlling hyperparameter values. The process of finding good values is technically called hyperparameter tuning or hyperparameter optimization.
Hyperparameters control many aspects of DL algorithms.
- They determine the time and computational cost of running the algorithm.
- They define the structure of the neural network model.
- They affect the model’s prediction accuracy and generalization capability.
In other words, hyperparameters control the behavior and structure of neural network models. So, it is important to understand what each hyperparameter does and how the hyperparameters can be classified (see the chart). A short sketch follows to make the three groups concrete.
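To illustrate where these hyperparameters show up in practice, here is a minimal sketch assuming a TensorFlow/Keras workflow (an assumption for illustration, not a prescription of this article). The layer sizes, dropout rate, learning rate, and batch size are purely example values.

```python
# Minimal sketch (assuming TensorFlow/Keras): where common hyperparameters
# appear when defining, compiling, and training a small network.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(20,)),                 # input dimension (fixed by the data)
    layers.Dense(64, activation="relu"),       # hidden units, activation -> network structure
    layers.Dropout(0.3),                       # dropout rate -> regularization effect
    layers.Dense(1, activation="sigmoid"),     # output layer structure
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),  # optimizer, learning rate -> learning and optimization
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Training-time hyperparameters (X_train and y_train are placeholder data):
# model.fit(X_train, y_train, epochs=20, batch_size=32)  # epochs, batch size -> learning and optimization
```

Each commented line points at one of the three groups in the chart: structure, learning and optimization, or regularization.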
Important facts about hyperparameters
Before introducing and classifying neural network hyperparameters, I want to list the following important facts about hyperparameters. To learn the differences between parameters and hyperparameters in detail with examples, read my “Parameters Vs Hyperparameters: What is the difference?” article.