One Hidden Layer (Shallow) Neural Network Architecture

Neural Networks and Deep Learning Course: Part 2

Rukshan Pramoditha
Data Science 365

--

Image by author, made with draw.io

In Part 1 of our Neural Networks and Deep Learning Course, we discussed the concept of artificial neurons (perceptrons) in neural networks and worked through some calculations inside a perceptron.

However, a single perceptron cannot model complex non-linear relationships in the input data, or even some fairly simple ones. Therefore, multiple perceptrons are stacked together to form a larger structure (a network, or architecture) called an Artificial Neural Network (ANN). According to the Universal Approximation Theorem, which will be discussed in a separate post, such a network can approximate any continuous non-linear relationship.
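As a sketch of this stacking idea, here is a minimal forward pass through a one-hidden-layer network in NumPy. The layer sizes (3 inputs, 4 hidden perceptrons, 1 output) and the sigmoid activation are illustrative assumptions, not values from the post:

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic activation."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Illustrative sizes: 3 input features, 4 hidden perceptrons, 1 output node.
n_x, n_h, n_y = 3, 4, 1

# Weights and biases for the hidden layer and the output layer.
W1 = rng.standard_normal((n_h, n_x))
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h))
b2 = np.zeros((n_y, 1))

x = rng.standard_normal((n_x, 1))  # one input example as a column vector

# Forward pass: each hidden unit behaves like a single perceptron,
# and the output node combines all hidden activations.
a1 = sigmoid(W1 @ x + b1)   # hidden activations, shape (4, 1)
a2 = sigmoid(W2 @ a1 + b2)  # network output, shape (1, 1)
```

Each row of `W1` holds the weights of one hidden perceptron, so stacking perceptrons is literally stacking rows into a weight matrix.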

Key definitions

We’ve just defined what an ANN is. An ANN is also known as a Multi-Layer Perceptron (MLP), because a collection of perceptrons is stacked in multiple layers, with the connections between perceptrons running from one layer to the next. Moreover, an MLP is a Fully (Densely) Connected Neural Network (FCNN), in which every node in each layer is connected to every node in the next layer, as in the following diagram. However, nodes within a single layer do not share any connections.
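Because every node connects to every node in the next layer, the connection count of a dense layer is just the product of the two layer sizes. A quick sanity check, using the same illustrative sizes as above (3 inputs, 4 hidden nodes, 1 output, which are assumptions for this example):

```python
# Illustrative layer sizes: input, hidden, output.
n_x, n_h, n_y = 3, 4, 1

# In a fully connected network, each pair of adjacent layers contributes
# one weight per (node, next-layer node) pair.
connections_input_to_hidden = n_h * n_x   # 4 * 3 = 12 weights
connections_hidden_to_output = n_y * n_h  # 1 * 4 = 4 weights

# Each non-input node also carries one bias term.
total_params = (connections_input_to_hidden + n_h
                + connections_hidden_to_output + n_y)
print(total_params)  # 21 trainable parameters
```

Nodes within the same layer contribute nothing to this count, since they are not connected to each other.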
