The Idea Behind a ‘Neural Network’

Sriya Roy
Warwick Artificial Intelligence
3 min read · Dec 2, 2021


When you make a decision, what goes through your head? A bunch of electrical pulses is certainly one answer. A quick Google search defines ‘decision’ as

A conclusion or resolution reached after consideration.

Artificial Neural Networks (ANNs) are useful for making decisions based on what we already know. Right now (I hope) you are deciding what each word you are reading is, based on its constituent letters and what they are most likely to be given your visual acuity. You are very clever! Okay, so how do you do this? How does your brain decide? Let’s start with a simpler example.

If you were asked to decide whether to read this article or not, you would probably weigh up a number of factors in your head, such as interest in the topic (a1), having read it before (a2), and being tired (a3).

Making this decision may happen with or without any input from your conscious brain. Examples include a sniper carefully considering distance, wind speed, heart rate, and so on before taking a shot; deciding when to blink; or a complex (mainly) unconscious decision still baffling scientists to this day: who to fall in love with. One thing these decisions all have in common is what we will call the decision-making parameters.

Assigning each (a) a value closer to 1 if it pushes me towards reading this article, and closer to −1 if the opposite is true, gives us a good starting point for defining the decision.

  • I am very interested in this topic (a1) = 0.9
  • I have read it many times before (a2) = −0.6
  • I am fairly alert (a3) = 0.3

Adding my (a)s gives a total of 0.6. A simple function called a perceptron is used to make sense of this number: it decides 1 (yes) for values greater than 0, and 0 (no) otherwise. Since 0.6 > 0, this reveals I would read this article.
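
If it helps to see this as code, here is a minimal Python sketch of that rule, using the illustrative values above (the name perceptron is just my label for the step function):

```python
def perceptron(total):
    """Output 1 (yes) if the total is greater than 0, otherwise 0 (no)."""
    return 1 if total > 0 else 0

a = [0.9, -0.6, 0.3]      # interest, read it before, alertness
total = sum(a)            # 0.9 - 0.6 + 0.3 = 0.6
print(perceptron(total))  # prints 1, i.e. "read the article"
```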

Naturally, the priority of each (a) depends entirely on whom you ask. Two people who are both interested in this topic, have both never read it before, and have both been awake for sixteen hours may still make different decisions! We clearly need more detail.

Assigning a weight (w) to each parameter gives us the more personal touch we require. As coffee remedies most of my tiredness, I might assign the weight (w3) of alertness a low value of 0.5, set (w2) to 5, and (w1) to 3. We now multiply these weights by their respective decision-making parameters, giving

(a1)(w1) + (a2)(w2) + (a3)(w3) = (0.9)(3) + (−0.6)(5) + (0.3)(0.5) = −0.15,

changing the decision to a no!
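
As a quick sketch of the same calculation in Python (the numbers are just the example values above):

```python
def perceptron(total):
    return 1 if total > 0 else 0

a = [0.9, -0.6, 0.3]   # interest, read it before, alertness
w = [3, 5, 0.5]        # my personal weights (w1, w2, w3)

weighted_total = sum(ai * wi for ai, wi in zip(a, w))
print(weighted_total)               # -0.15 (up to floating-point error)
print(perceptron(weighted_total))   # 0, i.e. "don't read the article"
```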

The output not matching my actual decision suggests the model needs a bias. To improve the model, we could lower the minimum value at which the perceptron outputs a 1. Seeing as this is my article, I am more likely to read it, so we set the new minimum value at which I would still read the article to −3. For convenience, we simply move this threshold to the other side of the weighted sum and call it the bias, (b) = 3; adding (b) to the weighted sum keeps the perceptron ’jump’ at zero, which is useful for manipulation later down the line. The addition of the bias clearly changes my decision once again to a yes. Here’s a diagram depicting the decision going into the perceptron.
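
In code, the bias is simply added to the weighted sum before the perceptron’s jump; here is a minimal sketch, again assuming the example numbers above:

```python
def perceptron(weighted_total, b):
    return 1 if weighted_total + b > 0 else 0

a = [0.9, -0.6, 0.3]
w = [3, 5, 0.5]
b = 3   # the threshold of -3 moved to the other side of the inequality

weighted_total = sum(ai * wi for ai, wi in zip(a, w))
print(perceptron(weighted_total, b))   # 1: -0.15 + 3 = 2.85 > 0, so "read it"
```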

ANNs can use this idea to decide multiple things simultaneously from the same parameters simply by using different weights (w). They can then use those decisions to make even higher-level decisions: if I read this article, do I have time to do my homework? This gives us the classic, cool-looking diagram of a neural network: layers of perceptrons whose outputs feed forward as inputs to the next layer.
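
To make the layering concrete, here is a small sketch of two perceptrons sharing the same inputs; the weights and bias for the homework question are purely hypothetical:

```python
def perceptron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

inputs = [0.9, -0.6, 0.3]   # the same decision-making parameters as before

# The article's weights and bias for "read the article", plus made-up
# weights and bias for a second question answered from the same inputs.
read_article = perceptron(inputs, [3, 5, 0.5], 3)
do_homework  = perceptron(inputs, [-1, 2, 4], 0)

# A perceptron in the next layer could take (read_article, do_homework) as
# its own inputs, which is exactly the layered structure of a neural network.
print(read_article, do_homework)
```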

I decided my parameters by hand, but in practice they are learned using calculus. That is for another article.

Using your own decision-making function, would you read this article again?

This article was written by Connor Mattinson, a former President of Warwick AI, on 20th December 2020.
