Neural Networks Intuition

Explanation of a simple neural network

Ashish Ohri
4 min read · Feb 15, 2019

Overview

Do neural networks amaze you and make you wonder how they work? Well, you are at the right place. In this article, you will gain a basic understanding of how a neural network works, in the simplest way possible!

Neural networks work in a way loosely inspired by the networks of neurons in our brains, through which information is transferred. As models, neural networks achieve high accuracy in most cases when it comes to making a machine learn to perform a task.

Artificial neural networks transfer information through the network, applying mathematical transformations at each layer. This information originates at the input layer and ends up at the final, or output, layer. The output depends on the task we want the neural network to achieve (e.g., detecting handwritten digits, predicting house prices in an area, building a bot that can play games).

A simple neural network with one hidden layer (img src: https://en.wikipedia.org/wiki/Artificial_neural_network)

A neural network consists of an input layer, hidden layers and an output layer as shown in the figure above.

Forward propagation

Feed Forward

The layers in a neural network (shown as layer (n) and layer (n+1) above) contain numerical data that helps in predicting the output. Imagine the arrows as the weights (θ(n)). The weights are among the most essential parts of a neural network: they hold numerical values which, when combined with the previous layer (layer (n)), transfer the information to the next layer (layer (n+1)). Weights exist between each pair of adjacent layers, and they alter the information before it is passed forward so as to move the network toward the desired output. Starting at the input layer, the information is passed on with mathematical modification (matrix multiplication of theta with the layer's values) until it reaches the final layer, where the output is captured. The weights are the components on which the so-called training takes place.
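The forward pass described above can be sketched with NumPy. This is a minimal example, not the article's exact network: the layer sizes, random weights, and sigmoid activation are all assumptions made for illustration.

```python
import numpy as np

def sigmoid(z):
    # Squash each value into (0, 1); a common activation function
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, theta1, theta2):
    # Each step is a matrix multiplication of theta with the
    # previous layer, followed by the activation
    hidden = sigmoid(theta1 @ x)       # layer (n) -> layer (n+1)
    output = sigmoid(theta2 @ hidden)  # hidden layer -> output layer
    return hidden, output

# Illustrative sizes: 3 inputs, 4 hidden units, 1 output
rng = np.random.default_rng(0)
theta1 = rng.normal(size=(4, 3))  # weights between input and hidden layer
theta2 = rng.normal(size=(1, 4))  # weights between hidden and output layer

x = np.array([0.5, -0.2, 0.1])
hidden, output = forward(x, theta1, theta2)
```

Each arrow in the figure corresponds to one entry of a theta matrix; multiplying by theta mixes every unit of the previous layer into every unit of the next.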

Error Calculation

The output received at the final layer is checked against the correct output that should have been the result. The error is calculated using a cost function, and this error is then used to adjust the model through back propagation.
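The article does not name a specific cost function; as one common example, here is the mean squared error, which measures how far the network's output is from the correct output:

```python
import numpy as np

def mse_cost(prediction, target):
    # Half the mean squared difference between the network's
    # output and the correct output
    return 0.5 * np.mean((prediction - target) ** 2)

# If the network outputs 0.8 but the correct answer is 1.0:
cost = mse_cost(np.array([0.8]), np.array([1.0]))  # 0.5 * 0.2**2 ≈ 0.02
```

The closer the prediction is to the target, the smaller the cost; a perfect prediction gives a cost of zero.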

Backpropagation

Back Propagation

The calculated errors are then used to update the weights, propagating this time in the reverse direction. The errors are used to compute a delta for each layer of weights. Delta carries numerical information about the changes to be made to the values of theta; for the scope of this article, consider delta to simply be the derivative of the cost function with respect to the weights. The deltas are computed so that changing theta moves the network's final output closer to the desired output. This adjustment of theta values is where the learning happens, and it is known as refining the model.
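For a network with sigmoid activations and a squared-error cost, the deltas described above can be sketched as follows. This continues the two-layer example from earlier; the function name and shapes are assumptions, not the article's notation.

```python
import numpy as np

def backward(x, hidden, output, target, theta2):
    # Delta at the output layer: the error times the derivative
    # of the sigmoid activation, which is output * (1 - output)
    delta_out = (output - target) * output * (1 - output)
    # Delta at the hidden layer: propagate the output delta
    # backwards through theta2, then apply the sigmoid derivative
    delta_hidden = (theta2.T @ delta_out) * hidden * (1 - hidden)
    # Gradients: the changes to be made to each theta matrix
    grad_theta2 = np.outer(delta_out, hidden)
    grad_theta1 = np.outer(delta_hidden, x)
    return grad_theta1, grad_theta2
```

Note how the error flows in reverse: the output-layer delta is computed first, and the hidden-layer delta is derived from it through theta2.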

Putting it together

The three steps above (forward propagation, error calculation and back propagation) are repeated again and again. With each repetition we get closer to the desired output, which can be verified by monitoring the error (the cost function). If the error decreases steadily as the steps repeat, we can conclude that our machine is getting closer to learning the task at hand.
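Putting the three steps together, a complete toy training loop might look like the sketch below. The single training example, layer sizes, and learning rate are all invented for illustration; a real model would train on many examples.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A made-up single training example
x = np.array([0.5, -0.2, 0.1])
target = np.array([1.0])

rng = np.random.default_rng(0)
theta1 = rng.normal(size=(4, 3))
theta2 = rng.normal(size=(1, 4))
alpha = 0.5  # learning rate (an assumed value)

costs = []
for step in range(200):
    # 1. Forward propagation
    hidden = sigmoid(theta1 @ x)
    output = sigmoid(theta2 @ hidden)
    # 2. Error calculation (squared-error cost)
    costs.append(0.5 * np.sum((output - target) ** 2))
    # 3. Back propagation: compute deltas and refine the thetas
    delta_out = (output - target) * output * (1 - output)
    delta_hidden = (theta2.T @ delta_out) * hidden * (1 - hidden)
    theta2 -= alpha * np.outer(delta_out, hidden)
    theta1 -= alpha * np.outer(delta_hidden, x)
```

Printing the `costs` list after the loop shows the steady decrease in error that the paragraph above describes.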

Real World Applications

Google's self driving car (img src: https://www.theverge.com/2015/5/15/8610667/google-self-driving-car-public-testing)

Neural networks are used today across many sectors, and their applications continue to expand: face detection, self-driving cars, disease prediction, stock price prediction, robotics, and much more. Machine learning is a fresh, ripe field to step into, and it opens up a wonderful world of possibilities.
