One Hidden Layer (Shallow) Neural Network Architecture
Neural Networks and Deep Learning Course: Part 2
In Part 1 of our Neural Networks and Deep Learning Course, introduced here, we discussed the concept of artificial neurons (perceptrons) in neural networks and worked through some calculations inside a perceptron.
However, a single perceptron cannot model complex non-linear relationships in the input data, or even some seemingly simple ones (the XOR function is a classic example). Therefore, multiple perceptrons are stacked together to form a larger structure (a network, or architecture) called an Artificial Neural Network (ANN). According to the Universal Approximation Theorem, which will be discussed in a separate post, such a network can approximate any non-linear relationship.
Key definitions
We’ve just defined what an ANN is. An ANN is also known as a Multi-Layer Perceptron (MLP), because a collection of perceptrons is stacked in multiple layers, with connections between the perceptrons running from one layer to the next. Moreover, an MLP is a Fully (Densely) Connected Neural Network (FCNN), in which every node in each layer is connected to every node in the next layer, as in the following diagram. However, nodes within a single layer do not share any connections.
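To make the fully connected structure concrete, here is a minimal sketch of a one-hidden-layer network in NumPy. The layer sizes (3 inputs, 4 hidden units, 1 output) and the sigmoid activation are illustrative assumptions, not values from the course; the key point is that each weight matrix connects every node in one layer to every node in the next.

```python
import numpy as np

# Illustrative layer sizes (assumed for this sketch).
n_input, n_hidden, n_output = 3, 4, 1

rng = np.random.default_rng(0)

# W1 connects every input node to every hidden node;
# W2 connects every hidden node to the output node.
# There are no connections within a layer.
W1 = rng.standard_normal((n_hidden, n_input))
b1 = np.zeros((n_hidden, 1))
W2 = rng.standard_normal((n_output, n_hidden))
b2 = np.zeros((n_output, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Forward pass: each layer applies a linear map, then a non-linearity."""
    a1 = sigmoid(W1 @ x + b1)   # hidden activations, shape (n_hidden, 1)
    a2 = sigmoid(W2 @ a1 + b2)  # output activation, shape (n_output, 1)
    return a2

x = np.array([[0.5], [-1.2], [0.3]])  # one example with 3 features
print(forward(x).shape)  # (1, 1)
```

Note that `W1 @ x` computes all four hidden pre-activations at once: stacking perceptrons into a layer turns the per-neuron dot products of Part 1 into a single matrix-vector product.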