Are Artificial Neural Networks the Holy Grail?

Cami Rosso
Mar 29, 2016


What separates man from machine today is cognition and learning. Scientists and researchers are studying biological cognition in an effort to adapt it artificially for economic and commercial gain. It is a painstaking quest: the biomechanics of cognition and learning, as they relate to the brain's neurons and synapses, are still not fully understood. Pioneering researchers are attempting to decode neuronal activity via fMRI imaging, DNA tagging, and other innovative scientific methods.

Despite this incomplete understanding, exciting progress is being made in artificial neural networks (ANNs). Google's AlphaGo program recently demonstrated that a learning algorithm for the game of Go can rival the best human players. This ANN breakthrough is what distinguishes Google's AlphaGo from IBM's chess-playing Deep Blue, which defeated world champion Garry Kasparov in 1997. Deep Blue deployed an evaluation function, heuristic search, and parallel processing, but it was not an ANN.

Human cognition is fuzzy and fast. It’s non-linear, adaptive and parallel. Take visual processing as an example. Our brains are able to quickly generalize and identify a cat without having to know every species of cat beforehand.

Like the human brain, ANNs are parallel and deploy a network of nodes (neurons). Learning occurs when the adaptive weights of the connections between nodes are updated based on experience. A mathematical activation function determines each neuron's output, and the more layers an ANN has, the more complex the patterns it can process. ANN training may be self-organizing or supervised via back-propagation, as the sketch below illustrates.
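To make that concrete, here is a minimal sketch (not from the original article) of a tiny feed-forward network trained by back-propagation on the XOR problem. The layer sizes, learning rate, and sigmoid activation are illustrative assumptions, not a prescription:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic problem a single linear unit cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Adaptive connection weights (plus biases): input -> hidden -> output.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

lr = 0.5
for _ in range(10000):
    # Forward pass: an activation function determines each neuron's output.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Back-propagation: push the output error backward through the layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Learning: the adaptive weights are updated based on experience (the data).
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # converges toward [[0], [1], [1], [0]]
```

After a few thousand updates the four predictions approach the XOR targets, which is exactly the "weights updated based on experience" loop described above.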

ANN technology is well suited to complex problems where the data is incomplete. Many speech, facial, object, and handwriting recognition technologies use ANN architectures. Android speech recognition, for example, uses Google Brain, a cloud computing network of 16,000 computers and 100 billion connections, to return answers to users' spoken queries on mobile devices. Apple pioneered the commercialization of handwriting recognition with the neural network-based recognition system used in second-generation Apple Newton OS devices.

ANNs can accelerate the rise of the Internet of Things (IoT). Today's IoT consists largely of sensor-based devices, the first generation of smart devices. As ANN technology matures, future generations of IoT devices may couple recognition software with learning programs, all hosted and processed via cloud computing. ANNs could effectively transform smart devices into intelligent learning networks.
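As a rough illustration of that cloud-hosted architecture, here is a hypothetical sketch of a smart device sending a sensor reading to a cloud model and acting on the prediction. The endpoint URL, payload fields, and response shape are all assumptions for illustration, not a real API:

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical cloud endpoint hosting a trained recognition model.
INFERENCE_URL = "https://example.com/iot/v1/classify"

def classify_reading(device_id: str, reading: list) -> str:
    """Send one sensor reading to the cloud model; return its predicted label."""
    resp = requests.post(
        INFERENCE_URL,
        json={"device_id": device_id, "features": reading},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["label"]  # assumed response shape: {"label": "..."}

if __name__ == "__main__":
    # A smart thermostat asking the cloud whether a room is occupied.
    label = classify_reading("thermostat-42", [21.5, 0.43, 812.0])
    print(f"cloud model says: {label}")
```

The appeal of this split is that the device stays cheap and simple while the recognition model and its learning improve server-side, across every device in the fleet.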

Today, in the fields of artificial intelligence (AI), statistics, and cognitive psychology, ANNs are still in their budding stages; there is a long way to go before they truly mimic biological neural networks. As neuroscience decodes the mysteries of how the brain works, advances in artificial neural networks will blur the separation of man from machine.

Copyright © 2016 Cami Rosso All rights reserved.

Originally published at https://www.linkedin.com on March 29, 2016.
