Forward and Back Propagation in an ANN: Neural Networks Using TensorFlow 2.0, Part 2

Nivesh
4 min read · Apr 5, 2020

This post gives the reader a gist of the mathematics behind an Artificial Neural Network. I have broken the topic down into the following contents for your ease. The discussion in this post is limited to supervised learning. The entire learning process can be divided into two stages, i.e., training and testing.

Contents:

1. Preliminaries

(a). Perceptron

(b). Activation Functions

(c). Loss Functions

2. Forward Propagation

3. Back Propagation

1. Preliminaries

Neural networks are biologically inspired algorithms for pattern recognition. Viewed another way, a neural network is a graph whose nodes are connected to each other by edges. It is estimated that 463 exabytes of data will be created each day globally by 2025. This growing availability of data has amplified what deep learning can achieve.

(a). Perceptron:

A perceptron is the smallest unit of a neural network; it transforms its inputs into an output. It is a mathematical model of a biological neuron. It can have one or more inputs, each with its own weight. Perceptrons are connected by edges that feed the output of one perceptron as input to others.

Fig. 1 illustrates a perceptron model. X1, X2, X3 are inputs (features, independent variables, etc.). Each input is multiplied by its corresponding weight, the weighted inputs are summed together with a bias term, and the sum is passed through an activation function to produce the output.
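As a minimal sketch of this computation in TensorFlow 2.x, the following snippet computes a single perceptron's output; the input values, weights, and bias below are illustrative placeholders, not values from the post:

```python
import tensorflow as tf

# Illustrative inputs X1, X2, X3 and one weight per input (placeholder values)
x = tf.constant([1.0, 2.0, 3.0])   # inputs (features)
w = tf.Variable([0.5, -0.3, 0.8])  # weights W1, W2, W3
b = tf.Variable(0.1)               # bias term

# Weighted sum: z = W1*X1 + W2*X2 + W3*X3 + b
z = tf.reduce_sum(w * x) + b

# Pass the sum through an activation function (sigmoid here) to get the output
y = tf.sigmoid(z)

print(y.numpy())
```

In a real network a layer applies this same pattern to many perceptrons at once as a matrix multiplication, but the per-neuron arithmetic is exactly the weighted sum shown above.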
