TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.


DEEP LEARNING

Neural Networks with Memory

Understanding RNNs and LSTMs in under 5 minutes

4 min read · Sep 24, 2020


We have always heard that neural networks (NNs) are inspired by biological neural networks, and the analogy maps onto the biology remarkably well.

Figure 1 shows the anatomy of a single neuron. The central part is called the cell body, where the nucleus resides. Several fibers, called dendrites, pass stimuli to the cell body, and a fiber called the axon sends the output on to other neurons. The thickness of a dendrite reflects the weight, or strength, of the stimulus. Many such neurons stacked together form a biological neural network.

Figure 1: Anatomy of a Single Neuron (Source, edited by author)

This same structure is implemented in artificial neural networks. The input is passed along weighted edges into an activation function, and the resulting output can in turn feed another activation function. Stacking many of these stages gives us layers, and each layer can contain multiple neurons.
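As a rough sketch of the idea, the weighted-edges-plus-activation structure can be written in a few lines of NumPy. The layer sizes, weights, and the choice of sigmoid activation here are illustrative assumptions, not anything from a specific library:

```python
import numpy as np

def sigmoid(x):
    # Activation function: squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def dense_layer(inputs, weights, bias):
    # A "layer": weighted edges (matrix multiply) plus bias, then activation
    return sigmoid(inputs @ weights + bias)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))            # one sample with 4 input features

# Two stacked layers: 4 inputs -> 3 neurons -> 1 neuron
w1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
w2, b2 = rng.normal(size=(3, 1)), np.zeros(1)

hidden = dense_layer(x, w1, b1)        # output of the first layer...
output = dense_layer(hidden, w2, b2)   # ...passed into the next one
print(output.shape)
```

Each call to `dense_layer` is one "layer" in the sense above: every neuron in it receives all inputs through its own column of weights, and its activation output becomes the input to the next layer.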



Written by Ramya Vidiyala

Interested in computers and machine learning. Likes to write about it | https://www.linkedin.com/in/ramya-vidiyala/
