Parameters vs. Hyperparameters: What Is the Difference?

Discussed with four different examples

Rukshan Pramoditha
Data Science 365


Image by Andrew Martin from Pixabay

When training ML and DL models, you have probably often come across the terms "parameters" and "hyperparameters". In some books and technical documentation, the two terms are used interchangeably. However, there is a significant difference between them that is worth discussing.

Parameters vs. Hyperparameters

Both parameters and hyperparameters are closely associated with the model training process, but they play two different roles in it.

Parameters

Parameters are variables that allow the model to learn the rules from the data. They are updated by the algorithm during training; we do not set their optimal values ourselves. Instead, the parameters learn their values from the data, although in some cases we need to initialize them with sensible values at the start. Once the optimal values for the parameters are found, training is complete, and the model is ready to make predictions on unseen data.
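As a minimal sketch of this idea (a plain-Python toy example, not from the article), consider fitting a line y = w*x + b with gradient descent. The weight w and bias b are parameters: we only initialize them, and the training loop updates them from the data until they settle on the values that generated it.

```python
# Toy training data generated from y = 3x + 2
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0, 5.0, 8.0, 11.0, 14.0]

w, b = 0.0, 0.0        # parameters: initialized, then learned from the data
lr = 0.05              # learning rate (chosen by us, not learned)
for _ in range(5000):
    # Gradients of the mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w   # the algorithm updates the parameters...
    b -= lr * grad_b   # ...we never set their final values ourselves

print(round(w, 2), round(b, 2))  # converges toward w ≈ 3, b ≈ 2
```

Note that we never told the model the slope is 3 or the intercept is 2; those values were recovered from the data by the update rule.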

Hyperparameters

Hyperparameters are also variables, but they control how the model is trained. Therefore, they can control the values that the parameters end up with. In other words, the optimal values of the parameters depend…
