Neural Networks Bias And Weights

Understanding The Two Most Important Components

Farhad Malik
FinTechExplained


This article aims to provide an overview of what bias and weights are. Weights and biases are possibly the most important concepts in a neural network. When inputs are transmitted between neurons, the weights are applied to the inputs, and the result is passed into an activation function along with the bias.
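To make that concrete, here is a minimal Python sketch of a single neuron (the function names and example values are my own, purely illustrative): each weight scales its input, the bias shifts the weighted sum, and the result is passed through a sigmoid activation.

```python
# A minimal sketch of how a single neuron combines inputs, weights
# and a bias before applying an activation function.
import numpy as np

def sigmoid(z):
    """Squash the weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(inputs, weights, bias):
    """Weighted sum of the inputs plus the bias, passed through the activation."""
    z = np.dot(weights, inputs) + bias   # weights scale the inputs, bias shifts the sum
    return sigmoid(z)

# Example: three inputs, three weights, one bias (values chosen arbitrarily)
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
b = 0.25
print(neuron_output(x, w, b))
```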

If you want to understand what neural networks are, then please read:

What Are Weights?

This is an example neural network with two hidden layers, plus an input layer and an output layer. Each synapse has a weight associated with it.
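As a rough illustration of that architecture, the sketch below builds a hypothetical network with an input layer, two hidden layers and an output layer, where every synapse corresponds to one entry in a layer's weight matrix. The layer sizes and the ReLU activation are assumptions made purely for this example.

```python
# Illustrative forward pass through a network with an input layer,
# two hidden layers and an output layer; every synapse is one entry
# in a weight matrix.
import numpy as np

rng = np.random.default_rng(seed=1)

def relu(z):
    return np.maximum(0.0, z)

layer_sizes = [3, 4, 4, 1]          # input, hidden 1, hidden 2, output
weights = [rng.normal(size=(m, n))  # one weight per synapse between layers
           for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(m) for m in layer_sizes[1:]]

def forward(x):
    a = x
    for W, b in zip(weights, biases):
        a = relu(W @ a + b)         # weights and bias applied at each layer
    return a

print(forward(np.array([1.0, 0.5, -0.2])))
```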

Weights are the coefficients of the equation you are trying to solve. Negative weights reduce the value of an output.

When a neural network is trained on the training set, it is initialised with a set of weights. These weights are then optimised during training, producing the optimum weights.
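As a simple, hypothetical illustration of that process, the sketch below initialises the weights of a single linear neuron randomly and then optimises them with gradient descent on a mean squared error loss over toy data. The data, learning rate and number of iterations are all made up for the example.

```python
# Hypothetical training sketch: random initial weights are gradually
# optimised by gradient descent on a single linear neuron.
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy data generated from target weights [2, -3] and bias 0.5
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -3.0]) + 0.5

w = rng.normal(size=2)   # initial weights (random)
b = 0.0                  # initial bias
lr = 0.1                 # learning rate

for _ in range(200):
    y_pred = X @ w + b
    error = y_pred - y
    # Gradients of the mean squared error with respect to w and b
    grad_w = 2 * X.T @ error / len(y)
    grad_b = 2 * error.mean()
    w -= lr * grad_w     # nudge the weights towards the optimum
    b -= lr * grad_b

print("learned weights:", w, "learned bias:", b)
```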
