The Spiking Neural Network: The Future of Neural Computing That’s Held Back by Today’s Hardware

Dive into the frontier of Spiking Neural Networks, where the promise to revolutionize AI meets the challenge of today’s hardware limitations.

Joshua Alfred Jayapal
NYU Data Science Review
8 min read · Dec 12, 2023


The beginning of Spiking Neural Networks (SNNs) was all about making computer models that more closely mimic how our brains actually work. In the early 1990s, researchers began taking a closer look at how our brain cells, or neurons, communicate with each other through rapid electrical impulses [1]. They were especially interested in the tiny electrical signals, or “spikes,” that neurons use to talk to each other.

Researchers like Eugene Izhikevich and Henry Markram led the way. Their reasoning was simple: if we want to make a computer brain, it should send signals the way a human brain does. So they focused on these spikes, trying to understand their patterns and rhythms, and dug deep into how spikes could help computer brains process information more like human brains. They didn’t just want to make another computer model; they wanted to create something that truly captures the dynamic nature of the brain’s activity. Their hard work and ideas helped shape SNNs, bringing us a bit closer to computer brains that think and learn like we do [2].

Moreover, SNNs are famously called the “third generation of neural networks” because of their unique ability to represent information in the precise timing of spikes and the intervals between them. Second-generation networks (such as CNNs and RNNs) operate on continuous activation values, which can be read as average firing rates, rather than on the timing of individual signals.

The first generation of neural networks is characterized by the perceptron, introduced by Rosenblatt in 1957: a primitive model of a biological neuron, capable of solving linearly separable classification problems.

The human brain runs on roughly 20 watts, an energy efficiency estimated to exceed that of current deep learning networks by as much as 100,000 times. Image by DALL-E.

A Symphony of Spikes: The SNN Advantage

In the fascinating landscape of neural networks, Spiking Neural Networks (SNNs) emerge as curious explorers, attuned to the rhythmic timing of neural spikes. They’re not merely passive observers; they’re dynamic participants in the orchestra of neural communication, capturing the timing and nuanced pulses of the brain’s electrical chatter.

The key strength of SNNs is their exceptional capacity to capture and exploit the precise timing between electrical impulses, or ‘spikes’, in neural activity. Information is encoded both in when individual spikes occur and in the patterns they form. SNNs are believed to be far more energy-efficient, potentially capable of more complex and adaptive behaviors, and better suited to unsupervised and reinforcement learning. They seek to understand, emulate, and leverage these natural rhythms to build more biologically plausible computational models.

Neural Precision: Mathematically Modeling the Brain’s Circuitry

Spiking Neural Networks (SNNs) are built upon models like the Leaky Integrate-and-Fire (LIF), which mirror the behavior of biological neurons. In this model, each neuron functions much like a neuron in the brain: it integrates incoming current, raising its membrane potential toward a threshold. When the threshold is reached, the neuron emits a spike, comparable to a signal sent by a neuron in the brain. Following this spike, the neuron resets its potential, ready to integrate and fire again [3].
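To make this concrete, here is a minimal Python sketch of an LIF neuron. The update rule is the standard leaky integrate-and-fire dynamics; the specific parameter values (time constant, threshold, reset) are illustrative choices, not values from any particular paper.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0,
                 v_rest=0.0, v_threshold=1.0, v_reset=0.0):
    """Simulate one LIF neuron; return membrane trace and spike times."""
    v = v_rest
    voltages, spike_times = [], []
    for t, i_in in enumerate(input_current):
        # Leak toward the resting potential, plus integration of input.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_threshold:          # threshold crossing -> emit a spike
            spike_times.append(t * dt)
            v = v_reset               # reset and start accumulating again
        voltages.append(v)
    return np.array(voltages), spike_times

# A constant drive above threshold produces regular, rhythmic spiking.
v_trace, spikes = simulate_lif(np.full(200, 1.5))
print(f"{len(spikes)} spikes, first few at t = {spikes[:5]}")
```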

The interpretation of these spikes in SNNs can vary. Some models focus on the frequency of these spikes, relating it to how often a neuron fires in response to a stimulus (rate coding). Others may consider the timing of the first spike following a stimulus or the intervals between consecutive spikes. These different approaches provide varied insights into neural activity, reflecting the diverse ways neural firing can encode information in the brain.
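These encoding schemes are easy to sketch in code. Below, rate coding maps stimulus intensity to spike probability per time step, while time-to-first-spike (latency) coding makes a stronger stimulus fire earlier; both mappings are toy choices for illustration, not canonical formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(intensity, n_steps=100):
    """Rate coding: stronger stimulus -> more spikes in the window."""
    return (rng.random(n_steps) < intensity).astype(int)

def latency_encode(intensity, n_steps=100):
    """Time-to-first-spike: stronger stimulus -> earlier single spike."""
    train = np.zeros(n_steps, dtype=int)
    train[int((1.0 - intensity) * (n_steps - 1))] = 1
    return train

weak, strong = 0.2, 0.9
print("spike counts (weak, strong):",
      rate_encode(weak).sum(), rate_encode(strong).sum())
print("first-spike times (weak, strong):",
      latency_encode(weak).argmax(), latency_encode(strong).argmax())
```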

SNNs’ Learning Dynamics: diagram (a) compares Artificial Neural Networks (ANNs) with ReLU units to Spiking Neural Networks (SNNs) with LIF neurons, highlighting the complexity of backpropagation in SNNs. Diagram (b) shows a neuron’s membrane potential over time, detailing the process from presynaptic spikes to postsynaptic firing, and introduces surrogate gradients for SNNs’ non-differentiable spike events [4].
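Because a spike is a hard threshold whose true gradient is zero almost everywhere, training SNNs with backpropagation typically relies on the surrogate gradients mentioned in the figure. Here is a minimal PyTorch sketch of the general idea; the fast-sigmoid surrogate and its slope of 10 are common illustrative choices, not the specific formulation used in [4].

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Forward: non-differentiable Heaviside step (spike / no spike).
    Backward: substitute a smooth fast-sigmoid derivative."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()   # binary spike output

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Stand-in derivative: sharp near threshold, flat far from it.
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2
        return grad_output * surrogate

v = torch.randn(5, requires_grad=True)
SurrogateSpike.apply(v).sum().backward()
print(v.grad)   # nonzero gradients flow despite the hard threshold
```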

Application Horizons of SNNs

SNNs offer a refined approach to speech recognition. Their sensitivity to temporal patterns enables a more authentic and accurate interpretation of vocal inputs, ensuring that the subtleties of speech, such as intonation and pacing, are not overlooked but rather incorporated into the recognition process.

In the field of robotics, SNNs introduce a level of dynamism that is closely aligned with real-world variability and unpredictability. They empower robotic systems to interact and adapt to their environment with enhanced responsiveness and precision, facilitating a more nuanced and effective engagement with external stimuli.

In healthcare and medical diagnostics, SNNs’ knack for identifying temporal patterns holds the promise of greatly enhancing the way medical data, like the intricate signals from electroencephalograms (EEG), is analyzed. SNNs have the fine-tuned ability to pick up on minor fluctuations in physiological data, details that might slip past conventional neural networks. This could pave the way for quicker, more precise diagnosis of conditions such as epilepsy or irregular heartbeats, offering the potential to save lives by catching critical health events before they unfold.

Moreover, in financial forecasting, the application of SNNs could be transformative. Their capacity to process and interpret temporally complex data allows for a more insightful analysis of market trends and economic patterns, providing a robust foundation for prediction and strategy formulation.

Confronting the Hardware Dilemma: Challenges in SNN Implementation

Spiking Neural Networks (SNNs), with their intricate structures and event-driven nature, demand a symphony of computational resources for effective implementation and performance. However, traditional computing architectures often struggle to accommodate their unique requirements, creating a pressing need for innovation and adaptation in the hardware landscape. While SNNs currently do not match the performance of Artificial Neural Networks (ANNs), the gap is gradually narrowing. Several key factors, however, continue to restrain the full potential of SNNs [5].

Bridging Architectural Gaps

Modern computing systems have been engineered to operate with a consistent, predictable rhythm, much like a metronome that keeps a steady beat for a musician. This structured, clock-driven approach suits the traditional Artificial Neural Networks (ANNs) well, which process information in a continuous flow.

However, Spiking Neural Networks (SNNs) introduce a different kind of rhythm, one that’s punctuated by bursts of activity and periods of silence — a staccato pattern in our musical analogy. These networks clash with the structured, clock-driven operation of conventional hardware. The result is a mismatch, an awkward dance where neither partner moves in harmony, leading to inefficiencies and lost potential in the realm of neural computing.

This challenge stems from the design of today’s CPUs and GPUs, which is at odds with the way SNNs communicate. Traditional CPUs and GPUs are based on the von Neumann architecture and execute instructions in a predefined sequence of clocked steps. The mismatch between SNNs’ asynchronous, event-driven behavior and this clock-driven model causes inefficiencies: simulating the unique timing of each neuron’s spikes means managing many separate timers, and cycles are wasted stepping through neurons that aren’t actively firing.
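The contrast is easy to sketch: an event-driven simulator keeps a priority queue of pending spikes and only does work when one arrives, instead of ticking a global clock over every neuron. The tiny network, weights, and delay below are made-up values for illustration.

```python
import heapq

events = [(0.3, 0), (1.1, 2), (2.4, 1)]           # (spike_time, neuron_id)
synapses = {0: [(1, 0.5)], 1: [(2, 0.8)], 2: []}  # id -> [(target, weight)]
potential = [0.0, 0.0, 0.0]
THRESHOLD, DELAY = 1.0, 0.1

queue = list(events)
heapq.heapify(queue)
while queue:
    t, pre = heapq.heappop(queue)      # jump straight to the next event
    print(f"t={t:.2f}: neuron {pre} spiked")
    for post, w in synapses[pre]:      # deliver the spike to its targets
        potential[post] += w
        if potential[post] >= THRESHOLD:
            potential[post] = 0.0      # reset, then fire after a delay
            heapq.heappush(queue, (t + DELAY, post))
```

A clock-driven loop on conventional hardware would instead update all three neurons at every tick, whether or not anything happened.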

Neuromorphic Pioneering: A New Dawn

Amidst this conflict, neuromorphic engineering shines as a beacon of hope, illuminating pathways to hardware that echoes the biological finesse and temporal sophistication of SNNs. Neuromorphic computing takes inspiration from the neural structure of the human brain to create a new class of hardware that is fundamentally different from traditional computing machines. Here’s a list of some notable neuromorphic chips and their respective creators:

  • Intel’s Loihi: The Loihi neuromorphic chip is designed to natively support the parallel, asynchronous processing of spiking neurons, offering a more efficient platform for SNNs. Its architecture allows each neuron to operate independently and consume power only when it is processing spikes, leading to significant energy savings for tasks suited to SNNs [6].
  • IBM’s TrueNorth: As part of the SyNAPSE program, IBM developed TrueNorth, a chip that emulates the brain’s neurons and synapses in a network of silicon cores and can run deep learning tasks with high energy efficiency [7].
  • Neurogrid: Developed by Stanford University, Neurogrid simulates the human cortex’s neural networks and can model one million neurons in real-time [8].

The widespread adoption of neuromorphic technology is limited by factors including production costs, the need for specialized programming models, and limited support from the broader ecosystem of tools and libraries that developers rely on. Despite starting from a relatively modest base of USD 0.08 billion in 2023, the market for neuromorphic chips is on an upward trajectory, with projections estimating it will grow to USD 2.85 billion by 2028. Anticipated to catalyze a transformation across a broad spectrum of scientific and commercial applications, neuromorphic computing is poised to significantly impact various sectors within the next decade [9].

Conclusion

Spiking Neural Networks (SNNs) represent a significant leap towards replicating the intricate workings of the human brain in silicon, offering the potential to revolutionize artificial intelligence. They capture the precise timing of neural activity, promising more energy-efficient computation and complex behaviors suited to advanced learning paradigms. Despite this potential, the mismatch between SNNs and current hardware is akin to playing a jazz improvisation on an instrument tuned for classical symphonies: possible, but far from ideal. As the market for neuromorphic hardware matures, it promises to unlock new possibilities across fields from robotics to healthcare by harnessing the true power of neural computing. Understanding and engaging with this emerging field is not just beneficial; it’s crucial for anyone looking to be at the forefront of innovation in artificial intelligence.

References:

[1] Ghaderi, A. H. (2015). Brain activity and special relativity: Estimation and a novel hypothesis to explain time perception. Am. J. Psychol. Cogn. Sci., 1, 66–74. http://doi.org/10.13140/RG.2.2.28409.98402

[2] National Geographic (2013). Will We Ever… Simulate the Brain?

[3] Gerstner, W., & Kistler, W. M. (2002). Spiking Neuron Models: Single Neurons, Populations, Plasticity. Cambridge University Press.

[4] Kim, Y., & Panda, P. (2021). Visual explanations from spiking neural networks using inter-spike intervals. Scientific Reports, 11, 19037. https://doi.org/10.1038/s41598-021-98448-0

[5] Tavanaei, A., Ghodrati, M., Kheradpisheh, S. R., Masquelier, T., & Maida, A. (2019). Deep learning in spiking neural networks. Neural Networks, 111, 47–63. https://doi.org/10.1016/j.neunet.2018.12.002

[6] Davies, M., et al. (2018). Loihi: A neuromorphic manycore processor with on-chip learning. IEEE Micro, 38(1), 82–99.

[7] Merolla, P. A., Arthur, J. V., Alvarez-Icaza, R., et al. (2014). A million spiking-neuron integrated circuit with a scalable communication network and interface. Science, 345(6197), 668–673. https://doi.org/10.1126/science.1254642

[8] Benjamin, B. V., et al. (2014). Neurogrid: A mixed-analog-digital multichip system for large-scale neural simulations. Proceedings of the IEEE, 102(5), 699–716.

[9] Mordor Intelligence (2023). Neuromorphic Chip Market Size.
