What is Loss in Neural Nets? Are the cost function and loss function the same?

vinodhkumar baskaran
3 min readApr 15, 2020


Let's demystify it. Come on!!!


As a refresher, let's talk about neural networks and how they are trained. Let's be super quick ...

The training process begins now … 1 … 2 … 3

  • Assume we have a dataset D = { (xi, yi) }
  • Next, initialize the weights (Wij) randomly
  • For each xi in the dataset D:

1) Pass xi forward through the network. → Forward Prop

2) Compute the loss: [yi − ŷi]

3) Compute all the derivatives (gradients) using the chain rule and memoization.

4) Update the weights (Wij) from the end of the network to the start. → BackProp

  • Repeat the loop till convergence, i.e., till the new weights Wij are approximately equal to the old Wij.

“Note: The weights are modified/updated using a function called an Optimization Function/Optimizer.”
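The steps above can be sketched in a few lines. This is a minimal illustration, assuming a single linear layer with squared-error loss; the names (W, lr, the synthetic dataset) and the hyperparameter values are all illustrative, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # the xi's
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                          # the yi's

W = rng.normal(size=3)                  # initialize weights randomly
lr = 0.1                                # learning rate (step size)

for epoch in range(200):
    y_hat = X @ W                             # 1) forward prop
    loss = np.mean((y - y_hat) ** 2)          # 2) compute the loss
    grad = -2 * X.T @ (y - y_hat) / len(X)    # 3) gradient via chain rule
    W_new = W - lr * grad                     # 4) update the weights
    if np.allclose(W_new, W):                 # repeat till Wij new ≈ Wij old
        break
    W = W_new
```

Here the "optimizer" is plain gradient descent: the update rule `W - lr * grad` is what modifies the weights each pass.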

That was the quick refresher!

Now, let us discuss loss and loss functions in neural networks.

Let's say you are on top of a hill and your task is to climb down. How will you do that?

Should you jump, or should you find a path that leads smoothly down to the ground? If you chose option one, then sorry, you are busted!!!

If your choice is option two, then that's your answer for how the loss function is used. Yes, believe me, you have answered it…

Gradient descent
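That hill-descending intuition is gradient descent: repeatedly step against the slope until you reach the bottom. A minimal one-dimensional sketch, where the function f(x) = x² (minimum at x = 0), the starting point, and the step size are all illustrative choices:

```python
# Gradient descent on f(x) = x**2; its derivative f'(x) = 2*x is the slope.
# Stepping against the slope walks "downhill" toward the minimum at x = 0.
x = 5.0          # starting position on the hill
lr = 0.1         # step size (learning rate)
for _ in range(100):
    grad = 2 * x
    x = x - lr * grad
# x is now very close to the minimum at 0
```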

Technically speaking,

“Loss” helps us understand how much the predicted value differs from the actual value.

The function used to calculate the loss is called the “loss function”.

Loss function is a method of evaluating “how well your algorithm models your dataset”. If your predictions are totally off, your loss function will output a higher number. If they’re pretty good, it’ll output a lower number. As you tune your algorithm to try and improve your model, your loss function will tell you if you’re improving or not.

“Loss functions are helpful to train a neural network”

Are the loss function and cost function the same?

Well, “Yes”, but actually “No”.

Yes, cost function and loss function are often treated as synonyms and used interchangeably, but they are “different”.

A loss function/error function is for a single training example/input. A cost function, on the other hand, is the average loss over the entire training dataset.

The optimization strategies aim at “minimizing the cost function”.
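The distinction is easy to see in code. A small sketch using squared error (the sample values are illustrative): each example gets its own loss, and the cost is their average over the dataset.

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])   # actual values
y_pred = np.array([2.5,  0.0, 2.0, 8.0])   # predicted values

losses = (y_true - y_pred) ** 2   # loss: one value per training example
cost = losses.mean()              # cost: average loss over the dataset
print(cost)                       # 0.375
```

Minimizing the cost (the average) is what drives all the per-example losses down together.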

Hope you're clear with the terminology…

Various types of loss functions are discussed here. Have a look at that as well for a better understanding.

Thanks for reading!!! :-)

