# Build Your First Neural Network From Scratch

## Let’s teach AI to do something simple

I have wanted to play with neural networks for a very long time, and I finally found a window of opportunity to mess around with them. This is pretty far from Skynet, and I don’t claim to fully grasp the math behind it, but let’s start by teaching an AI to do something simple.

# Theory Behind the Code

Neural networks are not a new concept. They were first introduced by Warren McCulloch and Walter Pitts in 1943.

We are going to build a single-layer neural net with no hidden layers, also known as a perceptron. It will consist of an input layer holding the training examples, synapses (weights), a neuron, and an output layer with the correct answers. This is a graphical representation of the neural net:

We also need to understand some math concepts, such as the sigmoid function and derivatives, to see how the neurons learn. What a neuron does is simple: it takes each input value, multiplies it by the corresponding synapse weight, sums all of these products, and passes the sum through the sigmoid function to get an output in the range between 0 and 1.

Neuron representation: *output = σ(w₁x₁ + w₂x₂ + … + wₙxₙ)*

Sigmoid function: *σ(x) = 1 / (1 + e^(−x))*

# Problem Definition

The input layer holds sequences of numbers. We want the neural network to return 1 if the first number of an input sample is 1, and 0 if the first number is 0. The correct answers make up the output layer. This is how the problem set looks:

# Prerequisites

With some conceptual understanding in place, we can start coding.

We are going to use Python and the NumPy library.

NumPY installation:

`pip install numpy`

If it installs successfully, we can proceed to the coding part. The first thing we need to do is import NumPy into the Python file:

`import numpy as np`

Now we are ready to describe and train our neural network.

# Training

First, let’s create a sigmoid function:
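The article’s listing was a screenshot; a minimal implementation of the sigmoid looks like this:

```python
import numpy as np

def sigmoid(x):
    # Squash any real number into the range (0, 1).
    return 1 / (1 + np.exp(-x))
```

Using `np.exp` (rather than `math.exp`) lets the function work element-wise on whole NumPy arrays, which we will rely on later.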

Next, we will define the training examples: an input matrix (one sample per row) and the expected outputs:
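The original data set isn’t shown, so here is an assumed one that fits the problem definition: five samples of four numbers each, where the correct answer for every sample is simply its first number. Note that the four-by-one weight vector used later requires each sample row to have four numbers.

```python
import numpy as np

# Assumed sample data: five training examples, four numbers each.
training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 1, 1],
                            [0, 1, 1, 0],
                            [0, 1, 0, 1]])

# Expected outputs as a five-by-one column vector:
# the first number of each sample.
training_outputs = training_inputs[:, :1]
```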

Next, we need to initialize the synaptic weights with random values, shaped as a four-by-one matrix:
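One common way to do this (the seed value is my choice, added so runs are reproducible) is to map `np.random.random`, which yields values in [0, 1), into the range (−1, 1):

```python
import numpy as np

np.random.seed(1)  # seed the generator so every run starts the same

# Random starting weights in the range (-1, 1), shaped four-by-one.
synaptic_weights = 2 * np.random.random((4, 1)) - 1
```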

Next, let’s start building a training model. We will use a `for` loop, and all of the training will happen inside it. We will invoke the sigmoid function and pass it the sum of all inputs multiplied by the synaptic weights. `np.dot` will do the matrix multiplication for us. Here is what we have:
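A self-contained sketch of this first forward pass, using the sigmoid, data, and weights defined above (the concrete numbers are my assumed sample data):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 1, 1],
                            [0, 1, 1, 0],
                            [0, 1, 0, 1]])

np.random.seed(1)
synaptic_weights = 2 * np.random.random((4, 1)) - 1

for iteration in range(1):
    input_layer = training_inputs
    # np.dot multiplies the 5x4 input matrix by the 4x1 weight vector,
    # giving one weighted sum per sample; sigmoid squashes each sum
    # into the range (0, 1).
    outputs = sigmoid(np.dot(input_layer, synaptic_weights))

print(outputs)
```

With the weights still random, these outputs bear no relation to the correct answers yet.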

And what we have as output:

Now we are ready to do the actual training. We do it by calculating the difference between the expected output and the output the sigmoid function actually produced. Then we can adjust the weights according to the severity of the error. We repeat this many times, 10,000 times, for example.

Let’s define the sigmoid derivative:
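A convenient trick: the derivative of the sigmoid can be written in terms of the sigmoid’s own output *s = σ(t)*, namely *σ′(t) = s(1 − s)*. So the function below expects a value that has already been through the sigmoid:

```python
# Derivative of sigmoid, expressed via the sigmoid's own output:
# if x = sigmoid(t), then sigmoid'(t) = x * (1 - x).
def sigmoid_derivative(x):
    return x * (1 - x)
```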

And this is how we calculate and adjust weights:
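Putting all the pieces together, a complete training loop might look as follows (again using my assumed sample data):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 1, 1],
                            [0, 1, 1, 0],
                            [0, 1, 0, 1]])
training_outputs = training_inputs[:, :1]

np.random.seed(1)
synaptic_weights = 2 * np.random.random((4, 1)) - 1

for iteration in range(10000):
    outputs = sigmoid(np.dot(training_inputs, synaptic_weights))
    # Error: how far each prediction is from the correct answer.
    error = training_outputs - outputs
    # Scale each adjustment by the sigmoid's slope, so confident
    # outputs (near 0 or 1) get nudged less than uncertain ones.
    adjustments = error * sigmoid_derivative(outputs)
    synaptic_weights += np.dot(training_inputs.T, adjustments)

print(outputs)
```

After training, the outputs should sit close to 0 or 1 according to the first number of each sample.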

Let’s start training and see how the results are affected by the length of the learning process. We will start with 100 iterations:
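The result listings in the original article were screenshots. To reproduce the experiment, the training above can be wrapped in a small helper (the `train` name is mine) and run with different iteration counts:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    return x * (1 - x)

training_inputs = np.array([[0, 0, 1, 1],
                            [1, 1, 1, 0],
                            [1, 0, 1, 1],
                            [0, 1, 1, 0],
                            [0, 1, 0, 1]])
training_outputs = training_inputs[:, :1]

def train(iterations):
    np.random.seed(1)  # identical starting weights for every experiment
    weights = 2 * np.random.random((4, 1)) - 1
    for _ in range(iterations):
        outputs = sigmoid(np.dot(training_inputs, weights))
        error = training_outputs - outputs
        weights += np.dot(training_inputs.T,
                          error * sigmoid_derivative(outputs))
    return outputs

for n in (100, 1000, 10000, 100000):
    print(n, train(n).ravel())
```

Each extra order of magnitude of iterations pushes the predictions closer to the exact 0s and 1s of the output layer.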

Not bad for a start: our AI learned to recognize the pattern, but the error is still pretty high. Now let’s do 1,000 iterations:

Much better, but let’s keep going and do 10,000 iterations:

And finally 100,000 iterations:

We could keep adding learning iterations, but we will never reach 100% accuracy, because that would require an infinite amount of computation. Even in the worst case, though, we have an accuracy of 99.77%, which is pretty good.

For the final code, I’ve cleaned it up and separated it into functions. I also added a very sophisticated method of storing the weights in a text file. This lets us run the learning process once; when we need to use our AI, we can simply import the weights and call the sigmoid function.
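The storage code itself isn’t shown; NumPy’s `savetxt`/`loadtxt` pair is one simple way to do it. The file name, the placeholder weight values, and the new sample below are mine, purely for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Placeholder values standing in for the trained four-by-one weights.
trained_weights = np.array([[9.6], [-2.1], [-3.9], [-2.0]])
np.savetxt("weights.txt", trained_weights)

# Later, when we want to use the AI: load the weights back and
# run a new sample through the sigmoid.
loaded_weights = np.loadtxt("weights.txt").reshape(-1, 1)
new_sample = np.array([1, 0, 0, 1])
prediction = sigmoid(np.dot(new_sample, loaded_weights))
print(prediction)
```

The `reshape(-1, 1)` matters: `loadtxt` returns a flat array, and we want the weights back in their four-by-one shape.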

# Conclusion

Our first artificial intelligence is ready for production. It can only recognize very simple patterns in a very small data set, but now we can expand it and, for example, try to teach the AI to recognize something in pictures. I will cover further development in the next article.

Keep learning, keep growing!