Neuromorphic Computing

QuAIL Technologies · Jan 27, 2023
Image source: https://www.buzzwrd.me/index.php/2021/02/24/neuromorphic-computing-a-promising-branch-of-computing/

History of Neuromorphic Computing

Neuromorphic computing is a form of artificial intelligence that seeks to replicate the workings of the human brain. It has been an area of research for decades, with foundational work dating to 1936, when mathematician and computer scientist Alan Turing proved that a universal machine could carry out any computation that can be expressed as an algorithm. In 1948, Turing wrote “Intelligent Machinery,” which described a cognitive modeling machine based on human neurons. This was followed by Canadian psychologist Donald Hebb’s breakthrough in neuroscience in 1949, when he theorized a correlation between synaptic plasticity and learning.

In 1950, Turing proposed the Turing Test, which is still widely regarded as a benchmark for Artificial General Intelligence (AGI). In 1958, psychologist Frank Rosenblatt, funded by the U.S. Office of Naval Research, built the perceptron for image recognition; given how little was known about the brain at the time, it failed to deliver its intended functionality. Nevertheless, it is considered one of the predecessors of neuromorphic computing.

The development of neuromorphic computing as we know it today began in the 1980s, when Caltech professor Carver Mead created analog silicon retina and cochlea devices inspired by neural paradigms. Since then, there have been numerous advances, such as Henry Markram’s Human Brain Project, launched in 2013 with some 500 scientists from 140 universities across Europe. IBM’s TrueNorth chip followed in 2014, and Intel’s Loihi chip in 2018, which has since been applied to robotics as well as gesture and smell recognition. These developments show just how far neuromorphic computing has come in the last 80 years.

What is Neuromorphic Computing?

Neuromorphic computing is a field of computer engineering that seeks to replicate the structures and processes of the human brain and nervous system in order to create more adaptive and energy-efficient computing systems. It draws on several disciplines, including computer science, biology, mathematics, electronic engineering, and physics, to design both hardware and software elements. Neurons and synapses are typically modeled after their biological counterparts, since they are considered the fundamental units of the brain.

Neuromorphic computing has yet to see widespread real-world deployment, but various groups, including universities, military organizations, and technology companies, have research underway. Potential uses include deep learning applications, next-generation semiconductors, transistor accelerators, autonomous systems such as robotics and self-driving cars, and AGI. Additionally, it is expected that neuromorphic processors could provide a way around the limitations of Moore’s Law.

The most common form of neuromorphic hardware is the spiking neural network (SNN), which consists of spiking neurons that, like biological neurons, both process and hold data, connected by artificial synaptic devices. These synaptic devices use analog circuitry to transfer electrical signals that mimic brain signals. This architecture differs fundamentally from the von Neumann architecture found in most modern computers, which separates processing and memory into distinct units that exchange binary values; constantly shuttling data between the two creates speed and energy bottlenecks. Neuromorphic chips, by contrast, collocate processing and memory on each neuron, allowing high performance and low energy consumption simultaneously.
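To make the spiking-neuron idea more concrete, below is a minimal sketch in Python of a leaky integrate-and-fire (LIF) neuron, one of the simplest spiking-neuron models. This is an illustrative toy, not the model used by any particular neuromorphic chip; the time constant, threshold, and synaptic weight are arbitrary assumptions.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# integrates weighted input spikes, leaks back toward rest over time,
# and emits an output spike (then resets) when it crosses a threshold.
# All parameter values below are arbitrary, for illustration only.
def lif_neuron(input_spikes, dt=1.0, tau=20.0, v_rest=0.0,
               v_thresh=1.0, weight=0.3):
    v = v_rest
    output = []
    for s in input_spikes:
        v += (dt / tau) * (v_rest - v)  # passive leak toward rest
        v += weight * s                 # integrate the incoming spike
        if v >= v_thresh:               # threshold crossing -> fire
            output.append(1)
            v = v_rest                  # reset after firing
        else:
            output.append(0)
    return output

# Toy input: a random binary spike train (1 = a spike in that time step)
rng = np.random.default_rng(0)
spikes = (rng.random(50) < 0.4).astype(int)
print(lif_neuron(spikes))
```

Note that the neuron does meaningful work only when a spike arrives; this sparse, event-driven behavior is what neuromorphic hardware exploits for its energy efficiency.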

Neuromorphic computers are characterized by the following:

  • Collocated processing and memory on each neuron
  • Massively parallel operation, with some chips running on the order of a million neurons simultaneously
  • Scalability achieved by adding more neuromorphic chips
  • Event-driven computation, in which only the neurons actively processing spikes consume energy (see the sketch after this list)
  • High adaptability and plasticity, allowing them to learn quickly
  • Fault tolerance, since information is held in multiple places
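As a rough illustration of event-driven computation, the following Python sketch propagates spikes through a tiny toy network using an event queue: computation happens only when a spike event exists, so idle neurons cost nothing. The wiring, weights, and threshold are made-up values for the example.

```python
from collections import defaultdict, deque

# Event-driven propagation sketch: work is done only when a spike event
# is in the queue, so neurons that receive no input consume no compute.
synapses = {                 # presynaptic neuron -> [(postsynaptic, weight)]
    0: [(1, 0.6), (2, 0.7)],
    1: [(2, 0.5)],
    2: [],
}
potential = defaultdict(float)
THRESHOLD = 1.0

events = deque([0, 0])       # two input spikes arrive at neuron 0
while events:
    pre = events.popleft()
    for post, w in synapses[pre]:        # touch only connected neurons
        potential[post] += w
        if potential[post] >= THRESHOLD: # crossing threshold fires a spike
            potential[post] = 0.0        # reset after firing
            events.append(post)          # enqueue the new spike event

print(dict(potential))
```

Contrast this with a conventional dense neural-network layer, which multiplies every weight on every clock cycle whether or not the corresponding input carries any information.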

These characteristics make neuromorphic computers highly efficient at tasks such as running machine learning algorithms or solving novel problems, without suffering from the von Neumann bottleneck. They are also well suited to low-power applications such as robotics and autonomous vehicles, since they require much less energy than traditional computers.

Fig. 1: Comparison of the von Neumann architecture with the neuromorphic architecture.

Overall, neuromorphic computing is a promising technology that offers many advantages over traditional computers, including greater performance, scalability, efficiency, flexibility, and fault tolerance. This makes it an attractive option for AI applications where speed is essential or where low power consumption is necessary, such as mobile devices and IoT applications.

Challenges

Neuromorphic computing has the potential to drastically increase the capabilities of computer science applications such as Artificial Intelligence, as well as reveal insights into cognition. However, it is still in the early stages of development and faces several challenges. One such challenge is accuracy. While neuromorphic computers are more energy efficient than graphics processing units (GPUs) or tensor processing units (TPUs) for deep learning and machine learning tasks, they have yet to prove more accurate, leading many to prefer traditional hardware and software with their higher accuracy and more familiar architecture. Another challenge is limited software and algorithms: most neuromorphic research is still conducted with standard deep-learning software and algorithms developed for von Neumann hardware, which constrains results to the very methods neuromorphic computing is trying to move beyond.

In addition, neuromorphic computers are not widely available due to a lack of application programming interfaces, programming models, and languages for non-experts. There are no clearly defined benchmarks for performance or common challenge problems, making it difficult to assess the performance of neuromorphic computers or prove efficacy. Finally, neuromorphic computers are limited by what we know about human cognition, which may be incomplete if quantum computation plays a role in cognition, as has been theorized by several notable experts.

“If cognition requires quantum computation as opposed to standard computation, neuromorphic computers would be incomplete approximations of the human brain and might need to incorporate technologies from fields like probabilistic and quantum computing.”

While this technology shows great promise, much work remains before it can reach its full potential. If these challenges can be overcome, we could see an unprecedented level of AI capability that also helps us better understand our own cognitive processes.

Conclusion

The future of neuromorphic computing looks very promising. Recent progress in neuromorphic research is attributable to the widespread use of AI, machine learning, neural networks, and deep neural network architectures in consumer and enterprise technology, as well as to the perceived end of Moore’s Law among many IT experts. As a result, chip manufacturers are looking for new ways to improve efficiency, with some turning their attention toward neuromorphic computing systems. Furthermore, the increased energy efficiency associated with neuromorphic computing presents an attractive alternative to training large AI models on classical computer architectures, which can carry a substantial carbon footprint (see AI’s Carbon Footprint).

The potential applications for this technology are vast, and it is clear that it has a bright future. Neuromorphic computing promises to revolutionize our lives with its efficient processing power while helping us explore space, diagnose diseases earlier than ever, create more intelligent machines, and potentially much more.

For additional resources, visit www.quantumai.dev/resources

We encourage you to do your own research.

The information provided is intended solely for educational use and should not be considered professional advice. While we have taken every precaution to ensure that this article’s content is current and accurate, errors can occur.

The information in this article represents the views and opinions of the authors and does not necessarily represent the views or opinions of QuAIL Technologies Inc. If you have any questions or concerns, please visit quantumai.dev/contact.


QuAIL Technologies

QuAIL Technologies researches and develops Quantum Computing and Artificial Intelligence software for the world’s most challenging problems.