Advances, Applications and Future of Spiking Neural Networks

NeuroCortex.AI
10 min read · Dec 19, 2023


This is the fourth part of the five-part series on spiking neural networks.

Part 1 here: What is Spiking Neural Network? | by NeuroCortex.AI | Oct, 2023 | Medium

Part 2 here: What makes Spiking neural network tick? | by NeuroCortex.AI | Nov, 2023 | Medium

Part 3 here: Spiking Neural Network Architectures | by NeuroCortex.AI | Dec, 2023 | Medium

Two spiking neurons as imagined by DALL·E 3 by OpenAI

Spiking Neural Networks (SNNs) represent a class of artificial neural networks inspired by the way biological neurons communicate through spikes or action potentials. They differ from traditional artificial neural networks, such as feed-forward and recurrent neural networks, in their temporal processing and event-driven nature.
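
To make this spike-based, event-driven behaviour concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron in plain NumPy. All constants (time step, membrane time constant, threshold, input current) are illustrative assumptions rather than values from any particular chip or library:

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron driven by an input current.
# All constants below are illustrative assumptions.
dt, T = 1e-3, 0.2                      # time step (s) and total duration (s)
tau_m, v_rest, v_th, v_reset = 20e-3, 0.0, 1.0, 0.0
steps = int(T / dt)

i_in = np.zeros(steps)
i_in[50:150] = 1.5                     # a current pulse between 50 ms and 150 ms

v = v_rest
spike_times = []
for t in range(steps):
    # Euler step of  dv/dt = (-(v - v_rest) + i_in) / tau_m
    v += dt * (-(v - v_rest) + i_in[t]) / tau_m
    if v >= v_th:                      # threshold crossing: emit a spike, then reset
        spike_times.append(t * dt)
        v = v_reset

print(f"{len(spike_times)} spikes, first at t = {spike_times[0]:.3f} s")
```

The neuron only produces output at the moments its membrane potential crosses the threshold; between spikes there is nothing to communicate, which is exactly the event-driven property that neuromorphic hardware exploits.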

Spiking Neural Networks: Real-Life Applications and Use Cases

The specific network topology of Spiking Neural Networks, which opens up a wide range of possibilities in robotics and computer vision, has sparked a lot of interest in the AI community. A key benefit is that SNNs map naturally onto neuromorphic hardware that performs in-memory computing.

Spiking Neural Networks can be applied in many different industries, including:

  • Prosthetics: Visual and auditory neuroprostheses already exist; they use spike trains to send signals to the corresponding sensory cortex and restore patients’ ability to orient in space. Scientists are also working on mechanical motor prostheses that use the same approach. Moreover, spike trains delivered to the brain through implanted electrodes can alleviate the symptoms of Parkinson’s disease, dystonia, chronic pain, and schizophrenia.
  • Robotics: Brain Corporation, based in San Diego, develops robots using SNNs, while the SyNAPSE program develops neuromorphic systems and processors.
  • Computer Vision: Computer vision is a field that can benefit strongly from SNNs for automatic video analysis. The IBM TrueNorth digital neurochip can help here: it includes one million programmable neurons and 256 million programmable synapses to simulate the functioning of neurons in the visual cortex. This neurochip is often considered the first hardware tool designed specifically to work with SNNs.
  • Telecommunications: Qualcomm is actively researching the possibility of integrating SNNs into telecommunication devices.
An artificial brain may be all too realistic in the near future
  1. Neuromorphic Computing: SNNs are crucial for the development of neuromorphic computing systems that mimic the brain’s structure and functionality. These systems have applications in robotics, sensory processing, and cognitive computing.

IBM TrueNorth: Developed by IBM, TrueNorth is a neuromorphic chip designed to emulate the parallel processing capabilities of the human brain. It consists of 4,096 neurosynaptic cores, each containing 256 programmable neurons and a 256×256 array of programmable synapses.

TrueNorth: Design and Tool Flow of a 65 mW 1 Million Neuron Programmable Neurosynaptic Chip for IEEE TCADIS | IBM Research

TrueNorth: A Deep Dive into IBM’s Neuromorphic Chip Design — Open Neuromorphic (open-neuromorphic.org)

SpiNNaker (Spiking Neural Network Architecture): The SpiNNaker project, led by the University of Manchester, aims to create a massively parallel computer system that simulates spiking neural networks. The SpiNNaker chip, developed for this purpose, can simulate large-scale neural networks in real-time.

Research Groups: APT — Advanced Processor Technologies (School of Computer Science — The University of Manchester)

BrainScaleS Project: The BrainScaleS project, part of the European Human Brain Project, focuses on developing neuromorphic hardware. The BrainScaleS system consists of physical models of neurons and synapses implemented on specialized hardware to simulate brain-like processing.

BrainScaleS today (2020–2023) (uni-heidelberg.de)

Frontiers | The BrainScaleS-2 Accelerated Neuromorphic System With Hybrid Plasticity (frontiersin.org)

Neurogrid: Neurogrid is a neuromorphic computing platform developed at Stanford University. It is designed to simulate the behavior of a million neurons in real time and has been used for neuroscience research and simulations of neural activity.

NeuroGrid: recording action potentials from the surface of the brain | Nature Neuroscience

web.stanford.edu/group/brainsinsilicon/documents/BenjaminEtAlNeurogrid2014.pdf

Loihi: Loihi is a neuromorphic chip developed by Intel Labs. It features a large number of artificial neurons and synapses and is designed for real-time learning applications. The chip supports online learning and can adapt to changing environments.

Neuromorphic Computing and Engineering with AI | Intel®

Loihi: A Neuromorphic Manycore Processor with On-Chip Learning (berkeley.edu)

BrainChip Akida: BrainChip’s Akida Neuromorphic System-on-Chip (NSoC) is designed for edge computing applications. It is capable of learning from incoming data and making decisions in real-time. The chip is aimed at applications such as vision sensors, audio analysis, and other sensor-based tasks.

Products — Akida Neural Processor SoC — BrainChip

Akida Generations — BrainChip

2. Spiking Neural Network Hardware: Application-specific hardware implementations of SNNs are being used in edge computing devices for tasks like image and speech recognition, where low power consumption and real-time processing are essential.

3. Brain-Machine Interfaces: SNNs are employed in brain-machine interfaces for decoding neural signals, allowing users to control external devices through direct brain communication.

4. Cognitive Robotics: SNNs play a role in developing more biologically inspired robots that can adapt and learn in real-time, making them suitable for complex and dynamic environments.

  5. Pattern Recognition: SNNs are used for pattern recognition tasks, especially in scenarios where the temporal dynamics of the data are crucial, such as speech and gesture recognition; a minimal spike-encoding sketch for such temporal data is shown below.
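
To illustrate how such temporal structure can be represented, here is a minimal sketch of latency (time-to-first-spike) encoding, in which stronger inputs spike earlier. The function name and the encoding window are illustrative assumptions, not part of any standard library:

```python
import numpy as np

def latency_encode(x, t_max=0.1, eps=1e-6):
    """Time-to-first-spike encoding: larger values spike earlier.

    x is assumed to be normalized to [0, 1]; t_max is the encoding window in seconds.
    """
    x = np.clip(np.asarray(x, dtype=float), eps, 1.0)
    return t_max * (1.0 - x)            # x = 1 spikes at t = 0; x near 0 spikes late

# Example: encode three pixel (or audio-frame) intensities into spike times
print(latency_encode([0.9, 0.5, 0.1]))  # -> [0.01 0.05 0.09] (seconds)
```

A downstream SNN then sees the stimulus as a pattern of spike times rather than a vector of static values, which is what makes it well suited to speech and gesture data.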

Future of Spiking Neural Networks

There are two opinions on SNNs among Data Scientists: a skeptical and an optimistic one.

Optimists think that SNNs are the future, because:

  • They are the logical next step in the evolution of NNs;
  • In theory, they are more powerful than traditional ANNs;
  • There are already SNN implementations that demonstrate their potential.

Looking ahead, several directions are expected to shape the development of SNNs:
  1. Enhanced Learning Rules: Continued research into more sophisticated learning rules for SNNs to improve their ability to learn from complex and diverse datasets.
  2. Scalability: Addressing scalability issues to enable the training of larger and more complex spiking neural networks for handling real-world applications.
  3. Interfacing with Traditional AI: Further integration of SNNs with conventional artificial neural networks to create more powerful and versatile hybrid models.
  4. Applications in Healthcare: Expanding the use of SNNs in healthcare for tasks such as medical image analysis, disease diagnosis, and personalized medicine.
  5. Cognitive Computing: Advancements in SNNs to support the development of more sophisticated cognitive computing systems that can simulate higher-order brain functions.
  6. Understanding Biological Systems: Using SNNs as tools for understanding and simulating complex biological systems, leading to insights into brain function and potential therapeutic interventions.

On the other hand, skeptics feel that SNNs are overrated for several reasons:

  • There is no learning method designed specifically for SNNs;
  • Working effectively with SNNs requires specialized hardware;
  • They are not commonly used across industries, remaining either a niche solution or a fancy idea;
  • SNNs are less interpretable than ANNs;
  • There are more theoretical articles on SNNs than practical ones;
  • Despite having been around for a while, SNNs have yet to produce a massive breakthrough.

Thus, the future of SNNs is unclear. From my perspective, SNNs simply need a bit more time and research before they become relevant.

Artistic rendering of neuronal connection

Advantages and disadvantages of Spiking Neural Networks

Spiking Neural Networks have several clear advantages over traditional NNs. Their advantages and disadvantages are summarized in the lists below:

  1. SNNs are dynamic, so they excel at processing dynamic signals such as speech and moving images;
  2. An SNN can continue to learn while it is already deployed and running;
  3. Often, only the output neurons need to be trained;
  4. SNNs usually need fewer neurons than traditional ANNs;
  5. SNNs can be very fast, since neurons emit discrete impulses rather than continuous values;
  6. SNNs offer higher information-processing throughput and better noise immunity because they encode information in the timing of spikes.

Unfortunately, SNNs also have two major disadvantages:

  1. SNNs are hard to train. As of today, there is no learning method designed specifically for this task;
  2. It is impractical to build a small SNN.

Challenges of Spiking Neural Networks

In theory, SNNs are more powerful than the current generation of NNs. Still, there are two serious challenges that need to be solved before SNNs can be widely used:

  1. The first challenge is the lack of a learning method developed specifically for training SNNs. Because spikes are discrete, non-differentiable events, traditional methods such as gradient descent cannot be applied directly. Unsupervised, biologically inspired learning rules can be used instead, but training with them is slow, and a traditional ANN will often learn both faster and better. A common workaround is to approximate the spiking non-linearity with a smooth surrogate during the backward pass, as sketched after this list;
  2. The second challenge is hardware. Working with SNNs is computationally expensive because it requires solving many differential equations, so it is hard to work efficiently without specialized neuromorphic hardware.
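
As a rough illustration of the surrogate-gradient workaround mentioned in the first point, the sketch below trains a single spiking unit in plain NumPy: the forward pass uses the non-differentiable step function, while the backward pass substitutes a smooth surrogate derivative. The surrogate shape, its sharpness beta, and the learning rate are illustrative assumptions:

```python
import numpy as np

def spike_forward(v, v_th=1.0):
    return float(v >= v_th)                   # Heaviside step: 0 or 1 spike

def surrogate_grad(v, v_th=1.0, beta=10.0):
    # Smooth stand-in for the step's derivative: 1 / (1 + beta*|v - v_th|)^2
    return 1.0 / (1.0 + beta * abs(v - v_th)) ** 2

rng = np.random.default_rng(0)
w = rng.normal(size=3)                        # weights of a tiny spiking unit
x = np.array([0.4, 0.8, 0.2])                 # one input sample
target = 1.0                                  # we want the unit to spike

for step in range(200):
    v = w @ x                                 # membrane potential (no leak, single step)
    s = spike_forward(v)                      # non-differentiable forward pass
    # Backward pass: use the surrogate derivative in place of the step's derivative
    grad_w = (s - target) * surrogate_grad(v) * x
    w -= 0.5 * grad_w                         # plain gradient-descent update

print("final potential:", round(w @ x, 3), "spike:", spike_forward(w @ x))
```

Libraries such as snnTorch, Norse, and SpikingJelly build this trick into automatic differentiation, so multi-layer spiking networks can be trained with standard deep-learning tooling.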

Advances in Spiking Neural Networks

  1. Biological Inspiration: Advances in neuroscientific research have contributed to a better understanding of the brain’s functioning, leading to improved models of spiking neurons and their interactions.
  2. Learning Algorithms: Development of learning algorithms specifically tailored to SNNs, such as Spike-Timing-Dependent Plasticity (STDP), which accounts for the precise timing of spikes in the learning process (a minimal STDP weight-update rule is sketched after this list).
  3. Hardware Implementations: Specialized hardware accelerators and neuromorphic chips designed to efficiently simulate and process spiking neural networks, improving energy efficiency and real-time performance.
  4. Deep Spiking Neural Networks: The extension of SNNs into deeper architectures, enabling the development of deep learning models that can capitalize on the advantages of spiking neurons.
  5. Hybrid Models: Integration of SNNs with traditional artificial neural networks to create hybrid models that combine the strengths of both approaches, allowing for more flexible and powerful learning.
  6. Event-Driven Processing: Utilization of the event-driven nature of SNNs for efficient processing of temporal data, making them suitable for applications involving time-series data and event-based sensing.
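
To make item 2 concrete, here is a minimal sketch of the classic pair-based STDP rule, in which the sign and size of the weight change depend on the relative timing of a presynaptic and a postsynaptic spike. The amplitudes and time constants are illustrative assumptions:

```python
import numpy as np

A_plus, A_minus = 0.01, 0.012        # potentiation / depression amplitudes (illustrative)
tau_plus, tau_minus = 20e-3, 20e-3   # STDP time constants in seconds (illustrative)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt >= 0:   # pre fires before post -> potentiation (LTP)
        return A_plus * np.exp(-dt / tau_plus)
    else:         # post fires before pre -> depression (LTD)
        return -A_minus * np.exp(dt / tau_minus)

print(stdp_dw(0.010, 0.015))   # pre 5 ms before post -> ~ +0.0078
print(stdp_dw(0.020, 0.015))   # pre 5 ms after post  -> ~ -0.0093
```

Because the rule depends only on locally available spike times, it can run online and maps well onto neuromorphic hardware.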

Resources for Learning About Spiking Neural Networks

Spiking Neural Networks use a more biologically inspired approach, modeling neurons as entities that communicate through spikes or pulses, and the field is continually evolving as researchers contribute new insights. Here are some research papers and books that cover various aspects of Spiking Neural Networks:

Research Papers:

  1. Title: “Spiking Neuron Models: Single Neurons, Populations, Plasticity” Authors: Wulfram Gerstner and Werner M. Kistler Link: Spiking Neuron Models
  2. Title: “Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition” Authors: Wulfram Gerstner, Werner M. Kistler, Richard Naud, and Liam Paninski Link: Neuronal Dynamics
  3. Title: “Biological learning curves outperform existing ones in neural networks” Authors: Rui Xu, Bo Li, Chunpeng Wu, Yansong Chua, and Bertram E. Shi Link: Biological learning curves
  4. Title: “Spiking Neural Networks for Computer Vision” Authors: Sander M. Bohte, Jorg Conradt, and Eugene M. Izhikevich Link: Spiking Neural Networks for Computer Vision
  5. Title: “Spiking neural networks: principles and challenges” Authors: Izhikevich, E. M. Link: Spiking neural networks: principles and challenges
  6. Title: “Learning to Communicate with Spiking Neurons” Authors: S. Diehl, D. Neil, J. Binas, M. Cook, S.-C. Liu, and M. Pfeiffer Link: Learning to Communicate with Spiking Neurons
  7. Title: “Deep Spiking Neural Networks With LIF Neurons” Authors: S. K. Esser, R. Appuswamy, P. Merolla, J. V. Arthur, and D. S. Modha Link: Deep Spiking Neural Networks With LIF Neurons
  8. Title: “Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Wisdom of Hodgkin–Huxley into Deep Learning” Authors: Friedemann Zenke, Surya Ganguli Link: Surrogate Gradient Learning in Spiking Neural Networks
  9. Title: “Training deep spiking neural networks using backpropagation” Authors: Guillaume Bellec, Darjan Salaj, Anand Subramoney, Robert Legenstein, Wolfgang Maass Link: Training deep spiking neural networks using backpropagation

Books:

  1. Title: “Principles of Neural Coding” Authors: Rodrigo Quian Quiroga Link: Principles of Neural Coding
  2. Title: “Spiking Neuron Models” Authors: Wulfram Gerstner and Werner M. Kistler Link: Spiking Neuron Models
  3. Title: “Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems” Authors: Chris Eliasmith and Charles H. Anderson Link: Neural Engineering
  4. Title: “Neural Networks and Learning Machines” Authors: Simon O. Haykin Link: Neural Networks and Learning Machines
  5. Title: “Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition” Authors: Wulfram Gerstner, Werner M. Kistler, Richard Naud, and Liam Paninski Link: Neuronal Dynamics
  6. Title: “Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems” Authors: Peter Dayan and Laurence F. Abbott Link: Theoretical Neuroscience
  7. Title: “Spiking Neural Networks: Theory and Applications” Authors: Eugenio U. Hernandez, Aurel A. Lazar Link: Spiking Neural Networks: Theory and Applications
  8. Title: “Spiking Neural Networks for Artificial Intelligence” Authors: K. I. Diamantaras, S. I. Fotopoulos Link: Spiking Neural Networks for Artificial Intelligence
  9. Title: “Spiking Neuron Models: An Introduction” Authors: Thorpe, S., & Gautrais, J. Link: Spiking Neuron Models: An Introduction
  10. Title: “Spiking Neural Networks: From Concept to Implementation” Authors: Bernd Porr, Florentin Wörgötter Link: Spiking Neural Networks: From Concept to Implementation

The annual SNUFA online workshop brings together researchers in spiking neural networks to present their work and discuss translating these findings into a better understanding of neural circuits and novel brain-inspired computing approaches. Topics of interest include artificial and biologically plausible learning algorithms and the dissection of trained spiking circuits toward understanding neural processing.

Link : SNUFA | Spiking Neural networks as Universal Function Approximators

The resources above cover a range of topics, including the theory, applications, and training methodologies of spiking neural networks. Some links may change over time, so do your own search as well. Keep in mind that the field is rapidly advancing, so it is worth exploring the latest conference proceedings and journals for cutting-edge research in spiking neural networks.

Conclusion

Spiking Neural Networks unquestionably represent a milestone for the AI industry, although skeptics believe they are overrated. In this article, we covered the real-life applications and use cases of Spiking Neural Networks, their advances and likely future, their advantages and disadvantages, and the challenges that still need to be solved. In the next part, we will focus on development with SNN libraries for real-time use cases. Stay tuned.

If you enjoyed this post, a great next step would be to start exploring the field, trying to learn as much as possible about Spiking Neural Networks.
