Photo by Universal Eye on Unsplash

Top Deep Learning Algorithms

Aditya Kumawat · Published in Nerd For Tech
3 min read · Oct 14, 2021

There are a lot of deep learning algorithms in use, but which are the best? Let's take a look!!

I have already discussed the most basic of neural networks, the perceptron, here. If you want to know how a neural network functions in its barest form, please check that out.

How does deep learning work?

Deep learning algorithms use ANNs (short for Artificial Neural Networks) to imitate the functioning of our brain. Why, though? Not trying to be philosophical or historical, but our brain is the most complicated piece of machinery we know. Because of this, even partially replicating that 'brain-power' opens up an unfathomable number of possibilities.

How an algorithm processes data and computes its output of course differs from one algorithm to another. For example, in one type of algorithm, MLPs, the data is processed by stacking the basic form of neural network (also called a perceptron) on top of one another at different levels. The data passes through these levels, is processed, and one or more outputs are generated.
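To make that stacking idea concrete, here is a minimal sketch in plain NumPy of two perceptron layers applied one after the other. The layer sizes and the sigmoid activation are assumptions chosen for illustration, not a fixed recipe.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)

    # Assumed toy sizes: 4 input features, 8 hidden units, 1 output.
    W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    x = rng.normal(size=(1, 4))       # one input sample

    h = sigmoid(x @ W1 + b1)          # first level of stacked perceptrons
    y = sigmoid(h @ W2 + b2)          # second level produces the output
    print(y)

Each level takes the previous level's output as its input, which is exactly the "stacking" described above.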

Types

  • Convolutional Neural Networks (CNNs)
  • Recurrent Neural Networks (RNNs)
  • Long Short-Term Memory Networks (LSTMs)
  • Multilayer Perceptrons (MLPs)
  • Radial Basis Function Networks (RBFNs)
  • Generative Adversarial Networks (GANs)
  • Deep Belief Networks (DBNs)
  • Self-Organizing Maps (SOMs)
  • Autoencoders
  • Restricted Boltzmann Machines (RBMs)

Here is a basic description of how each of these algorithms works:

Convolutional Neural Networks

Working: They have multiple layers that process the features from the data.

These networks are mostly used for detecting objects and processing images.

Important layers that produce the output:

  • Convolutional Layer: This layer applies multiple filters to the input and carries out the convolution operations that extract features.
  • Rectified Linear Unit: Also called ReLU for short. This unit applies an element-wise rectification to the feature values and outputs a rectified feature map.
  • Pooling Layer: The output of ReLU goes into this pooling layer. Pooling is done to reduce the dimensionality of the feature map.
    The pooled 2-D feature maps are then flattened into a single long vector.

If we are using a CNN for image classification, for example, then at the end, after pooling is done, the flattened vector goes into a feed-forward network (a fully connected layer), which classifies the images.
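Here is a rough sketch of that pipeline (convolution → ReLU → pooling → flatten → fully connected) in PyTorch. The channel counts, kernel size, and the assumption of 28×28 grayscale inputs with 10 classes are illustrative choices, not part of the algorithm itself.

    import torch
    import torch.nn as nn

    # Assumed input: 1-channel 28x28 images, 10 output classes.
    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1),  # convolutional layer: 16 filters
        nn.ReLU(),                                   # element-wise rectification
        nn.MaxPool2d(2),                             # pooling: 28x28 -> 14x14
        nn.Flatten(),                                # flatten to a single long vector
        nn.Linear(16 * 14 * 14, 10),                 # fully connected classifier
    )

    x = torch.randn(8, 1, 28, 28)   # a batch of 8 dummy images
    logits = model(x)               # shape: (8, 10)
    print(logits.shape)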

Recurrent Neural Networks

RNNs feed their outputs back in as inputs. Because of this property, RNNs can remember previous inputs, which lets them perform quite powerful operations on sequential data.

Because of this feedback property, an RNN can be unfolded over time, meaning that the output of the RNN at time t1 will differ from the output at t2.
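A minimal sketch of that unfolding, using PyTorch's built-in nn.RNN; the feature and hidden sizes here are arbitrary assumptions, and the hidden state is what carries the memory of earlier time steps.

    import torch
    import torch.nn as nn

    # Assumed toy sizes: 5 input features per step, hidden state of size 8.
    rnn = nn.RNN(input_size=5, hidden_size=8, batch_first=True)

    x = torch.randn(1, 10, 5)        # one sequence of 10 time steps
    output, h_n = rnn(x)             # output[:, t, :] depends on steps 0..t

    # The output at t1 differs from the output at t2 because each step
    # also sees the hidden state produced by the previous steps.
    print(output[:, 1, :])
    print(output[:, 2, :])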

Used for:

  • Image Captioning
  • NLP (Natural Language Processing)
  • Time Series Analysis
  • Language Translation

RNNs were one of the biggest breakthroughs in Natural Language Processing.

Long Short-Term Memory Networks (LSTMs)

These are a type of RNN whose default behaviour is to recall past information (sounds human enough!) over quite long periods.

The use cases of LSTMs are very similar to RNNs but more specifically, they can be used for music composition, speech recognition, etc.

The working of an LSTM is carried out in three steps (sketched in code after the list):

  • Forget the unnecessary information from the past state.
  • Update the current cell state values.
  • Output selected parts of the cell state.
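These three steps correspond to the forget, input, and output gates of an LSTM cell. Here is a minimal sketch using PyTorch's nn.LSTM; the input, hidden, and sequence sizes are toy assumptions.

    import torch
    import torch.nn as nn

    # Assumed toy sizes: 5 input features per step, hidden/cell state of size 8.
    lstm = nn.LSTM(input_size=5, hidden_size=8, batch_first=True)

    x = torch.randn(1, 20, 5)            # one sequence of 20 time steps
    output, (h_n, c_n) = lstm(x)

    # c_n is the cell state: internally, the forget gate drops old information,
    # the input gate writes updates to it, and the output gate decides which
    # part of it becomes the hidden state h_n.
    print(h_n.shape, c_n.shape)          # both (1, 1, 8)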

Multilayer Perceptrons (MLPs)

If you know about the perceptron, then you have probably guessed how this algorithm works just from reading the title.

They contain multiple perceptrons with activation functions, stacked vertically into layers, with each layer connected horizontally to the next.

MLPs are used for tasks such as speech recognition and translation.
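A minimal PyTorch sketch of an MLP: stacked linear (perceptron) layers with activations in between. The layer widths and the 10-class output are assumptions made for illustration.

    import torch
    import torch.nn as nn

    # Assumed sizes: 20 input features, two hidden layers, 10 output classes.
    mlp = nn.Sequential(
        nn.Linear(20, 64),
        nn.ReLU(),
        nn.Linear(64, 64),
        nn.ReLU(),
        nn.Linear(64, 10),
    )

    x = torch.randn(4, 20)     # a batch of 4 samples
    print(mlp(x).shape)        # (4, 10)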

So, those were the most popular neural network algorithms. If you want to know about the networks described above, and the others, in more detail, then here are the links:

Thanks so much for reading!!! Follow me on Twitter, and connect with me on LinkedIn to help me expand my network!
