Aside from Kylie Jenner, TikTok videos, and Tesla, people are searching for the new terms shown in the graph and want to learn more about them. You may even have landed here because you searched for one of these terms.
“AI is the new electricity.” — Andrew Ng
Artificial Neural Networks (ANNs) are inspired by the human brain and are built to simulate the interconnected processes behind what humans do: thinking.
Basically, an ANN consists of the following components:
- An input layer
- A hidden layer with an activation function
- An output layer
- Weights between the layers
The best way to understand neural networks is to build one from scratch, without using any deep learning library. In this article, that is exactly what we will do, with Python.
“Talk is cheap. Show me the code.” — Linus Torvalds
So, we have the table above. As humans, we immediately see that the output is always equal to the first value of the input row.
We will create and train an artificial neural network, and it will predict the output for the new input given in the last row.
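Since the table itself is not reproduced here, a hypothetical training set following the stated rule (the output equals the first value of each input row) might look like this. The exact rows and values are illustrative assumptions, not the article's original data:

```python
import numpy as np

# Hypothetical training data: four input columns per example,
# and the label is simply the first column of each row.
training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 0, 1],
                            [0, 1, 1, 0]])

# Labels as a column vector (one output per training example)
training_outputs = np.array([[0, 1, 1, 0]]).T
```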
Let's create our neural network class. Yes, we said we would not use any deep learning library, but we will use NumPy for the calculations. You can install it with:
pip install numpy
We have created the class and initialized it with randomly generated synaptic weights.
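The class itself is not shown in this excerpt, so here is a minimal sketch of what such an initializer could look like. The class name, the fixed seed, and the 4x1 weight shape are assumptions chosen to match the four-value inputs and single output used later:

```python
import numpy as np

class NeuralNetwork:
    def __init__(self):
        # Fixed seed so the "random" starting weights are reproducible
        np.random.seed(1)
        # One weight per input column, drawn uniformly from [-1, 1)
        self.synaptic_weights = 2 * np.random.random((4, 1)) - 1
```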
The activation function used in our model will be the sigmoid. This function maps any value to a value between 0 and 1, and we will use it to normalize the weighted sum of the inputs.
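As a sketch, the sigmoid and its derivative (which will be needed during training) can be written as plain functions. Note that the derivative here is expressed in terms of the sigmoid's *output*, a common shortcut in this kind of tutorial:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real value into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Gradient of the sigmoid, written in terms of its output value;
    # used later to scale the weight adjustments
    return x * (1 - x)
```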
The think() function will take an array as input and return the dot product of the input and the weight matrix, normalized through the sigmoid.
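A minimal standalone sketch of such a think() function, assuming the sigmoid defined above and a column vector of weights:

```python
import numpy as np

def sigmoid(x):
    # Maps the weighted sum into the range (0, 1)
    return 1 / (1 + np.exp(-x))

def think(inputs, synaptic_weights):
    # Weighted sum of the inputs, passed through the sigmoid
    return sigmoid(np.dot(inputs.astype(float), synaptic_weights))
```

With all-zero weights, every weighted sum is 0, so think() returns 0.5 for any input, which is a handy sanity check.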
Now we are ready to train our artificial neural network to make accurate predictions. Every input has a weight, which may be positive or negative, and which we assigned randomly at the beginning. These weights need to change; they cannot stay static, otherwise there would be no "learning". So how will we change these values?
The answer is back-propagation.
We will predict from the training inputs, calculate the error (since we know the expected results), adjust the weights accordingly, and repeat these three steps over and over (about 10,000 times) to fine-tune the weights.
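The three steps above can be sketched as a single training loop. The data, seed, and learning setup below are illustrative assumptions consistent with the "output equals the first input value" rule, not the article's exact code:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

# Hypothetical data following the "output = first input value" rule
training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 0, 1],
                            [0, 1, 1, 0]], dtype=float)
training_outputs = np.array([[0, 1, 1, 0]], dtype=float).T

np.random.seed(1)
synaptic_weights = 2 * np.random.random((4, 1)) - 1

for _ in range(10000):
    # 1. Predict with the current weights
    output = sigmoid(np.dot(training_inputs, synaptic_weights))
    # 2. Calculate the error against the known answers
    error = training_outputs - output
    # 3. Adjust the weights: corrections are larger where the error
    #    and the sigmoid's slope are larger
    adjustments = np.dot(training_inputs.T,
                         error * sigmoid_derivative(output))
    synaptic_weights += adjustments

# Predict for a new, unseen input
prediction = sigmoid(np.dot(np.array([1, 0, 1, 0]), synaptic_weights))
```

After training, the first weight should dominate, so the prediction for an input starting with 1 should be close to 1.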
Now we are ready to sail towards the unknown!
This is the output of two runs of the code:
> python main.py
Weights in the Beginning:
[[ 0.85619597]
[ 0.44821289]
[-0.89561481]
[ 0.39085635]]
Weights After Training:
[[ 8.96880175]
[-2.94541369]
[-2.30839168]
[-4.3796263 ]]
The answer for [1 0 1 0] is: 1
> python main.py
Weights in the Beginning:
[[ 0.13915639]
[-0.50739333]
[ 0.13890921]
[ 0.80830971]]
Weights After Training:
[[ 8.94700038]
[-3.61970704]
[-1.65339981]
[-4.36980354]]
The answer for [0 0 1 0] is: 0
Here you can find the entire code for this simple artificial neural network.
Thanks for reading!