
Layers in Neural Networks

Amit Singh Rathore · Published in Nerd For Tech · Feb 17, 2021


Layers are logical collections of nodes/neurons. At the highest level, every ANN has three types of layers: an input layer, one or more hidden layers, and an output layer.

Different layers perform different transformations on their inputs, and some layers are better suited to certain tasks than others:

CNN (convolutional): images
RNN (recurrent): time series and other sequences
Dense: multi-class classification
Linear: regression

In the hidden layers, the network learns different aspects of the data while minimizing the cost function. In this blog, we will discuss some common types of layers and their usage.
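As a warm-up, here is a minimal sketch (assuming PyTorch; the layer sizes are arbitrary) of how an input layer, hidden layers, and an output layer stack into a small feed-forward network:

```python
import torch
import torch.nn as nn

# A tiny feed-forward network: input -> hidden -> hidden -> output.
model = nn.Sequential(
    nn.Linear(10, 32),   # input layer: 10 input features, 32 hidden units
    nn.ReLU(),
    nn.Linear(32, 16),   # hidden layer
    nn.ReLU(),
    nn.Linear(16, 3),    # output layer: e.g. scores for 3 classes
)

x = torch.randn(8, 10)   # a batch of 8 samples, 10 features each
logits = model(x)        # shape: (8, 3)
```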

Dense or Fully Connected Layer

The brute force layer of a Machine Learning model

Fully connected layers are those in which every input from one layer is connected to every activation unit of the next layer. A dense layer projects its input into a different dimensional space, each output being a learned weighted combination of all inputs. In most popular machine learning models, the last few layers are fully connected (Dense) layers, which combine the features extracted by previous layers to form the final output. They are also commonly used as the output layer for multi-class classification problems.
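For illustration, a minimal sketch (assuming PyTorch; the sizes are made up) of a dense layer that maps a 128-dimensional feature vector to 10 class scores:

```python
import torch
import torch.nn as nn

# nn.Linear is a fully connected (Dense) layer: every input feature
# is connected to every output unit via a learned weight.
dense = nn.Linear(in_features=128, out_features=10)

x = torch.randn(32, 128)   # a batch of 32 feature vectors
scores = dense(x)          # shape: (32, 10), e.g. one score per class
```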

Batch Normalization Layer

Batch Normalization accelerates convergence by reducing internal covariate shift across batches. The batch norm layer normalizes the incoming activations so that, within each batch, each feature has a mean of 0 and a standard deviation of 1: it subtracts the batch mean and divides by the batch standard deviation, then applies a learnable scale (gamma) and shift (beta).
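A minimal sketch, assuming PyTorch's nn.BatchNorm1d (the feature size is arbitrary), showing the layer and, roughly, the normalization it performs in training mode:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(num_features=64)   # one mean/std pair per feature
x = torch.randn(32, 64)                # a batch of 32 activation vectors
y = bn(x)                              # per-feature mean ~0, std ~1

# Roughly what the layer does during training (gamma starts at 1, beta at 0):
mean = x.mean(dim=0)
var = x.var(dim=0, unbiased=False)
y_manual = (x - mean) / torch.sqrt(var + bn.eps)
```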

Pooling

Pooling layers reduce dimensionality. They downsample feature maps by summarizing the presence of features in patches of the feature map. The two most common variants are listed below (both are sketched after the list):

Max pooling
Average pooling
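A minimal sketch of both variants, assuming PyTorch's built-in 2D pooling layers (the feature-map shape is arbitrary):

```python
import torch
import torch.nn as nn

max_pool = nn.MaxPool2d(kernel_size=2)   # keep the strongest activation in each 2x2 patch
avg_pool = nn.AvgPool2d(kernel_size=2)   # average the activations in each 2x2 patch

feature_map = torch.randn(1, 16, 28, 28)  # (batch, channels, height, width)
print(max_pool(feature_map).shape)        # torch.Size([1, 16, 14, 14])
print(avg_pool(feature_map).shape)        # torch.Size([1, 16, 14, 14])
```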

Dropout

A dropout layer takes the output of the previous layer’s activations and randomly sets a certain fraction (dropout rate) of the activations to 0, canceling or ‘dropping’ them out. It is a common regularization technique used to prevent overfitting in Neural Networks.
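A minimal sketch, assuming PyTorch's nn.Dropout (the dropout rate is illustrative):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)   # dropout rate: fraction of activations set to 0

x = torch.ones(1, 8)
drop.train()               # dropout is only active in training mode
print(drop(x))             # ~half the values are 0; survivors are scaled by 1/(1-p)

drop.eval()                # at inference time dropout is a no-op
print(drop(x))             # all ones
```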

Convolutional Layer (CNN)

Convolution is the simple application of a filter to an input that results in an activation. Repeated application of the same filter across the input produces a map of activations called a feature map, indicating the locations and strength of a detected feature in the input, such as an image.
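A minimal sketch, assuming PyTorch's nn.Conv2d (filter count and image size are arbitrary), of a single convolutional layer producing one feature map per filter:

```python
import torch
import torch.nn as nn

# 16 filters of size 3x3 slide over a 3-channel image,
# producing 16 feature maps (one per filter).
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

image = torch.randn(1, 3, 64, 64)   # (batch, channels, height, width)
feature_maps = conv(image)          # shape: (1, 16, 64, 64) with padding=1
```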
