Artificial Neural Network

Learn AI
5 min read · Jan 15, 2023


An artificial neural network (ANN) is a type of machine learning model inspired by the structure and operation of the human brain. It is built from layers of interconnected "neurons" that process and pass on information. The edges connecting the neurons carry the information flow, and the strength of each connection determines how much influence one neuron has on the next.

Why ANN?

Traditional Computers

  • Follow fixed algorithms: the key steps and solutions must be known in advance
  • Limited computational power
  • This restricts their problem-solving capability

ANN-based Computers

  • Work in a way loosely modelled on the human brain
  • Learn from examples
  • Can solve unseen, real-life problems

Artificial Neuron/Perceptron vs Biological Neuron

Artificial neurons are mathematical models that simulate the behaviour of biological neurons. They serve as the building blocks of artificial neural networks (ANNs) and are used to process and transmit data. A biological neuron, in contrast, is an actual cell found in the human brain.

Here are some key differences between artificial neurons and biological neurons:

  • Structure: Biological neurons are actual cells with a cell body, dendrites, and an axon, whereas artificial neurons are mathematical abstractions that typically combine several weighted inputs into a single output.
  • Signalling: While artificial neurons communicate through mathematical calculations, biological neurons do so through electrical and chemical impulses.
  • Learning: Synaptic plasticity is the ability of biological neurons to change their connections with other neurons, which enables the brain to learn from experience. The training of artificial neurons, on the other hand, often involves a dataset and a method called backpropagation.
  • Adaptability: Biological neurons can adapt and change in response to experience, while artificial neurons have a fixed architecture and require human intervention to change.
  • Complexity: Biological neurons are highly complex and dynamic systems, while artificial neurons are relatively simple mathematical models.

Axons: transmit signals to other cells

Cell Body: sums all the incoming inputs

Dendrites: receive signals from other neurons

Nucleus: processes the cell's tasks

Mathematical Terms in ANN

Synaptic Weights:

  • Give more (or less) importance to specific inputs.

Summing Junction:

  • Sums all the signals coming from different neurons.

Activation Function:

  • Takes the final decision to perform (or not) a specific task.
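Putting these three terms together, here is a minimal sketch of a single artificial neuron, assuming a simple step activation; the inputs, weights, and bias below are purely illustrative and not taken from the article:

```python
# A minimal sketch of a single artificial neuron (perceptron) with a step
# activation; the inputs, weights, and bias are illustrative only.

def artificial_neuron(inputs, weights, bias):
    # Synaptic weights + summing junction: weighted sum of all incoming signals
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    # Activation function: final decision to fire (1) or not (0)
    return 1 if s >= 0 else 0

# The larger weight on the second input gives that signal more importance.
print(artificial_neuron([0.5, 0.8], weights=[0.1, 2.0], bias=-1.0))   # -> 1
```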

ANN: Working Algorithm

Finding the Hyperplane Equation (Training)

Let W0(0) = b = 50, W1(0) = -30, W2(0) = 300
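With these initial weights, and assuming the usual perceptron form W0 + W1*X1 + W2*X2 = 0 (the article's own derivation appears only in figures), the hyperplane is 50 - 30*X1 + 300*X2 = 0, and a point is classified as +ve when the weighted sum is greater than or equal to zero.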

Checking Unknown Values (Testing)

Let W0(0) = b = 50, W1(0) = -30, W2(0) = 300,

X1 = 140, X2 = 17.9, Class = +ve
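Plugging this test point into the weighted sum under the decision rule assumed above: 50 + (-30)(140) + (300)(17.9) = 50 - 4200 + 5370 = 1220. The sum is positive, so the perceptron predicts +ve, which matches the given class.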

Backpropagation in Single-Layer Perceptron

Let W0(0) = b = -1230, W1(0) = -30, W2(0) = 300,

X1 = 114, X2 = 15.2, Class = +ve
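Here the weighted sum is -1230 + (-30)(114) + (300)(15.2) = -1230 - 3420 + 4560 = -90. The sum is negative, so the perceptron predicts -ve even though the true class is +ve, and the weights must be corrected. Below is a minimal sketch of that correction using the standard perceptron learning rule; the learning rate of 1 and the +1/-1 class encoding are assumptions, since the article's own worked update is shown only in figures:

```python
# Perceptron learning rule: w_i <- w_i + eta * (target - prediction) * x_i
# The weights and the test point come from the text; eta = 1 and the
# {+1, -1} encoding are assumptions.

def predict(w, x):
    # x[0] is a constant 1 so that w[0] plays the role of the bias b
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= 0 else -1

w = [-1230.0, -30.0, 300.0]   # [b, W1, W2]
x = [1.0, 114.0, 15.2]        # [1, X1, X2]
target = 1                    # Class = +ve

y = predict(w, x)             # -> -1, because the weighted sum is -90
if y != target:
    eta = 1.0                 # assumed learning rate
    w = [wi + eta * (target - y) * xi for wi, xi in zip(w, x)]

print(w)                      # [-1228.0, 198.0, 330.4]
print(predict(w, x))          # -> 1, the point is now classified as +ve
```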

Multilayer Perceptron Model

  • Solves complex problems.
  • It consists of an input layer, one or more hidden layers, and an output layer.
  • Training is done using the error back-propagation algorithm.

Backpropagation in Multilayer Perceptron Model
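Since the worked backpropagation example exists only as figures in the original post, here is a rough, self-contained sketch of how error back-propagation updates a tiny multilayer perceptron; the layer sizes, sigmoid activations, squared-error loss, learning rate, and NumPy usage are all assumptions made for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A tiny multilayer perceptron: 2 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)   # input layer  -> hidden layer
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)   # hidden layer -> output layer

def forward(x):
    h = sigmoid(W1 @ x + b1)   # hidden-layer activations
    y = sigmoid(W2 @ h + b2)   # network output
    return h, y

def backprop_step(x, target, lr=0.1):
    """One error back-propagation update with a squared-error loss."""
    global W1, b1, W2, b2
    h, y = forward(x)
    err = y - target                               # derivative of 0.5 * (y - target)^2
    delta_out = err * y * (1 - y)                  # sigmoid derivative at the output
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # error pushed back to the hidden layer
    W2 = W2 - lr * np.outer(delta_out, h)
    b2 = b2 - lr * delta_out
    W1 = W1 - lr * np.outer(delta_hid, x)
    b1 = b1 - lr * delta_hid
    return y

# Illustrative input and target (not from the article): nudge the output toward 1.
x, t = np.array([0.5, 0.8]), np.array([1.0])
for _ in range(200):
    backprop_step(x, t)
print(forward(x)[1])   # the output has moved toward the target 1.0
```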

ANN Advantages and Disadvantages

Some of the main advantages include:

  • High accuracy: ANNs can achieve high levels of accuracy on a wide variety of tasks, such as image and speech recognition, natural language processing, and decision-making.
  • Handling large and complex datasets: ANNs can process large amounts of data and can identify patterns and features in complex datasets.
  • Handling non-linear relationships: ANNs can handle non-linear relationships between inputs and outputs, which allows them to model complex systems.
  • Generalization: ANNs can generalize well from the examples they were trained on to new, unseen examples.
  • Adaptability: ANNs can be adapted to new tasks or new data by adjusting their internal parameters.

However, there are also some disadvantages to ANNs:

  • Computational cost: Training ANNs can require a lot of computational resources and can be time-consuming.
  • Overfitting: ANNs can overfit the training data, which means they perform well on the training data but poorly on new, unseen data.
  • Lack of interpretability: ANNs can be opaque, meaning that it is difficult to understand how they arrived at a particular decision or prediction.
  • Data requirements: ANNs require large amounts of labelled data to train, and the quality of the data can affect their performance.
  • Lack of domain knowledge: structuring an ANN properly requires domain knowledge; without it, the network may not perform well.

Overall, ANNs are a powerful tool for solving many complex problems, but they are not a silver bullet and have their limitations. They are widely used across fields and industries, have driven significant performance improvements on many tasks, and have been applied to problems once considered intractable for traditional algorithms.
