# Overview of Neural Networks

If you’ve heard about Artificial Intelligence, Machine Learning, or Deep Learning recently, then you might have heard of a Neural Network.

Neural Networks are a key piece of some of the most successful machine learning algorithms. The development of neural networks has been key to teaching computers to perceive and understand the world in ways that resemble how humans do. Essentially, a neural network loosely emulates the human brain: brain cells, or neurons, are connected via synapses. This is abstracted as a graph of nodes (neurons) connected by weighted edges (synapses).

So let’s dive in. What is a neural network? The human brain consists of roughly 86 billion cells called neurons, connected together by synapses. If enough of a neuron’s synaptic inputs fire, that neuron also fires. We call this process “thinking”. We can model this process by creating a neural network on a computer. A neural network has input and output neurons, connected by weighted synapses. The weights scale the signals that pass through the network during forward propagation. The weights are then adjusted during backward propagation, and this is where the neural network actually learns. Forward propagation and backward propagation are performed iteratively on every example in a training data set. The larger and more varied the data set, the more the neural network learns, and the better it becomes at predicting outputs.
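The loop described above can be sketched for a single neuron. This is a minimal, illustrative example, not a reference implementation: the data set (learning logical AND), the sigmoid activation, the squared-error gradient, and the learning rate are all my own choices.

```python
import math

# Training data: inputs and target outputs (logical AND of two inputs).
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights = [0.0, 0.0]
bias = 0.0
lr = 0.5  # learning rate (illustrative choice)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for epoch in range(5000):
    for inputs, target in data:
        # Forward propagation: weighted sum of inputs through the synapses.
        z = sum(w * x for w, x in zip(weights, inputs)) + bias
        output = sigmoid(z)
        # Backward propagation: nudge each weight to reduce the error.
        error = output - target
        grad = error * output * (1 - output)
        weights = [w - lr * grad * x for w, x in zip(weights, inputs)]
        bias -= lr * grad

predictions = [round(sigmoid(sum(w * x for w, x in zip(weights, i)) + bias))
               for i, _ in data]
print(predictions)
```

After enough iterations over the data set, the neuron’s weighted outputs round to the correct AND values, which is exactly the forward/backward cycle the paragraph describes, just at the smallest possible scale.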

Simply put, a neural network is a connected graph with input neurons, output neurons, and weighted edges. Let’s go into detail about some of these components:

1) **Neurons**. A neural network is a graph of neurons. A neuron has inputs and outputs, and so does the network as a whole: the network’s inputs and outputs are represented by input neurons and output neurons. Input neurons have no predecessor neurons but do produce an output; similarly, an output neuron has no successor neurons but does receive inputs.

2) **Connections and Weights**. A neural network consists of connections, each connection transferring the output of a neuron to the input of another neuron. Each connection is assigned a weight.

3) **Propagation Function**. The propagation function computes the input of a neuron from the outputs of predecessor neurons. The propagation function is leveraged during the forward propagation stage of training.

4) **Learning Rule**. The learning rule is a function that modifies the weights of the connections, so that the network produces a favored output for a given input. The learning rule is leveraged during the backward propagation stage of training.
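Components 3 and 4 can be written down as two small functions. These are sketches under my own assumptions: the function names, the sigmoid activation, and the squared-error gradient are not defined in the article.

```python
import math

def propagate(weights, inputs, bias):
    """Propagation function (component 3): combine predecessor outputs
    into this neuron's input via a weighted sum, then apply an
    activation (sigmoid here, as an illustrative choice)."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def learning_rule(weights, inputs, output, target, lr=0.1):
    """Learning rule (component 4): shift each weight against the error
    gradient so the output moves toward the favored (target) output."""
    grad = (output - target) * output * (1.0 - output)
    return [w - lr * grad * x for w, x in zip(weights, inputs)]
```

One step of learning is then: propagate forward, compare the output to the target, and apply the learning rule to get updated weights that bring the output closer to the target.

```python
out = propagate([0.5, -0.5], [1.0, 1.0], 0.0)          # forward pass
new_w = learning_rule([0.5, -0.5], [1.0, 1.0], out, 1.0)  # one update
```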

# Deep Neural Networks

So now that we know what a Neural Network is, what is a Deep Neural Network? A Deep Neural Network is simply a Neural Network with more layers. A shallow Neural Network might have 1–3 layers of neurons, while a Deep Neural Network (DNN) has more than a few: perhaps 20, or even 1,000, layers of neurons.
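“Depth” is easy to see in code: the network is just a list of layers, and the forward pass feeds each layer’s output into the next. The layer sizes, the tanh activation, and the random initialization below are illustrative choices.

```python
import math
import random

def make_layer(n_in, n_out):
    # One weight row per output neuron, plus a bias per output neuron.
    weights = [[random.uniform(-1, 1) for _ in range(n_in)]
               for _ in range(n_out)]
    biases = [0.0] * n_out
    return weights, biases

def forward(layers, inputs):
    # Each layer's output becomes the next layer's input.
    for weights, biases in layers:
        inputs = [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
                  for row, b in zip(weights, biases)]
    return inputs

# A "deep" network: 20 hidden layers of 8 neurons between a 4-neuron
# input and a 2-neuron output.
random.seed(0)
sizes = [4] + [8] * 20 + [2]
layers = [make_layer(a, b) for a, b in zip(sizes, sizes[1:])]
output = forward(layers, [0.1, 0.2, 0.3, 0.4])
```

Making the network deeper is just a matter of appending more entries to `sizes`; the forward pass itself does not change.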

# Conclusion

That’s basically it for a Neural Network. A neural network is just a core architecture, and there are different specialized types built on it. For example, Convolutional Neural Networks have been very effective for Computer Vision applications, and Recurrent Neural Networks are widely used for sequential data such as text and speech. If anything here was unclear or you need a broader overview, then feel free to read this Overview of Artificial Intelligence, Machine Learning, and Deep Learning.