What is a Spiking Neural Network?

NeuroCortex.AI
10 min read · Oct 29, 2023


This article is the first in a series of five outlining spiking neural networks and applying them to a real-world problem statement.

Let me regale thee with a tale of Spiking Neural Networks, a wondrous creation of our age. Unlike their brethren, the artificial neural networks, these marvels strive to mimic the very essence of nature’s neurons with greater fidelity and truth.

Side view of the human brain’s left hemisphere, including the cerebrum and cerebellum (image: Human Connectome Project)

The human brain is probably one of the most intricate and complex entities we have ever encountered, besides the vast universe looming all around us. It contains roughly 86 billion neurons and on the order of 100 trillion synapses, making it one of the most complicated structures to study. Its architecture amazes us, with thousands of distinct neuron types and the ability to function at vast scale on remarkably little power (~20 watts). There is no grand unified theory of how the brain works; we can only approximate its functions with knowledge obtained from years of ongoing scientific research. One of the brain’s peculiar talents is recognizing patterns: making sense of large amounts of data and drawing inferences without being overwhelmed by the complexity involved. It does a fascinating job of abstracting away non-essential features at various levels, encoding neural signals through spike generation and learning.

The brain is a densely packed organ made up of specialized cells called neurons, which act as information carriers and decision-makers in association with other neurons

Neurons and Neural Networks: The brain consists of billions of nerve cells called neurons, which are the basic building blocks of the nervous system. Neurons are interconnected to form intricate networks, allowing them to communicate with each other.

Neural Communication: Communication between neurons occurs through electrical and chemical signals. When a neuron receives a signal from other neurons, it processes this information in its cell body and, if the accumulated input is strong enough, generates an electrical impulse called an action potential.

Synapses and Neurotransmitters: Neurons are not physically connected but communicate at specialized junctions called synapses. When an action potential reaches the end of a neuron’s axon (the pre-synaptic neuron), it triggers the release of chemical messengers called neurotransmitters into the synapse. These neurotransmitters then bind to receptors on the receiving neuron (the post-synaptic neuron), leading to the transmission of the signal.

Neural Plasticity and Learning: One of the remarkable aspects of the brain is its ability to adapt and change in response to experiences. This property is known as neural plasticity. Through synaptic strengthening or weakening, the connections between neurons can be modified, and new connections can be formed. This process underlies learning and memory formation.

Brain Regions and Functions: The brain is divided into various regions, each responsible for specific functions. For example:

  • The cerebral cortex, the outer layer of the brain, is involved in higher cognitive functions such as thinking, perception, language, and decision-making
  • The cerebellum is crucial for motor control, balance, and coordination
  • The limbic system plays a role in emotions, motivation, and memory

The mechanism of how a neuron works involves a complex process of electrical and chemical signalling. Neurons are the basic units of the nervous system, responsible for receiving, processing, and transmitting information throughout the brain and the rest of the body.

A typical neuron structure with all its components

Neuron Structure: Neurons have a unique structure that allows them to carry out their functions. The main components of a neuron include:

  • Cell Body (Soma): The central part of the neuron containing the nucleus and most of the cellular machinery.
  • Dendrites: Branch-like extensions that receive incoming signals from other neurons or sensory receptors.
  • Axon: A long, slender projection that transmits electrical impulses away from the cell body.
  • Axon Terminal: The end of the axon where synaptic connections with other neurons are made.

At rest, the neuron maintains a negative electrical charge inside its cell membrane compared to the outside. This resting potential is essential for the neuron’s readiness to process and transmit information. When the neuron receives incoming signals through its dendrites, which are the tree-like structures extending from the cell body, these signals can either excite or inhibit the neuron. If the incoming signals are strong enough and collectively reach a certain threshold, they trigger a sudden and brief electrical impulse called an “action potential.”
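
In computational terms, this integrate-then-fire behaviour is often abstracted by the leaky integrate-and-fire equation, which this article formalizes later. A standard textbook form, with the conventional symbols (membrane time constant τm, resting potential V_rest, membrane resistance R, input current I, threshold V_th), is:

$$\tau_m \frac{dV}{dt} = -(V - V_{\mathrm{rest}}) + R\,I(t), \qquad V(t) \ge V_{\mathrm{th}} \;\Rightarrow\; \text{spike, then } V \leftarrow V_{\mathrm{reset}}$$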

When the neuron’s membrane potential exceeds the threshold, a spike is generated; the potential then falls back toward the resting level and begins accumulating toward the threshold again

This action potential propagates down the neuron’s long, slender projection called the “axon.” The axon is like a communication cable, transmitting the electrical signal away from the cell body and toward the axon terminals.

Action potentials (spikes) flow along axons to the dendrites of downstream neurons, helping process information

Upon reaching the axon terminals, the action potential stimulates the release of chemical messengers called “neurotransmitters” into the synapse, a tiny gap between the axon terminal and the dendrites of neighboring neurons. The neurotransmitters then diffuse across the synapse and bind to receptors on the dendrites of the neighboring neuron. Depending on the type of neurotransmitter and the specific receptors, this interaction can lead to either depolarization (excitation) or hyperpolarization (inhibition) of the postsynaptic neuron. If the collective excitatory signals outweigh the inhibitory ones, the postsynaptic neuron generates its own action potential, and the process continues in the connected neurons, creating a chain reaction of electrical impulses throughout the neural network.

An action potential causes neurotransmitters to be released across the synaptic cleft, which in turn evokes an electrical signal in the postsynaptic neuron

This sequence of events allows neurons to communicate with each other and form intricate networks that underlie various cognitive and physiological functions of the brain and nervous system.

In an attempt to advance artificial intelligence, we designed artificial neural networks (ANNs), which imitate the brain’s workings to a certain extent and classify labelled data reasonably well. They still fail to replicate the brain’s ability to simplify the complexity of the world we observe and receive input from. Tackling this is the real challenge in developing strong AI that could make our systems smarter, despite the mysteries still surrounding the brain’s working mechanism. It is pivotal for reducing complexity and forming meaningful abstractions over time. Machines that can find patterns without supervised datasets and make reasonable abstractions, combined with their high computing power, would perform many times better; this is the next avenue where major AI research is headed. Thus we are looking for different architectures and learning approaches that mimic brain function with better approximations and could accelerate research away from the pitfalls of current ANN architectures.

Spiking neural networks are inspired by the biological neurons found in the human brain. They mimic the behavior of real neurons by using spikes or discrete events to communicate information. This makes SNNs more biologically plausible than other neural network models.

In the realm of biology, neurons communicate through potent electrical impulses, christened as “spikes.” And so, in this realm of artifice, the Spiking Neural Networks follow suit. They eschew the continuous streams of their predecessors and instead employ discrete events, these mystical “spikes,” to convey and process knowledge.

Just as the delicate balance of a neuron’s membrane determines the birth of a spike, so too do the SNNs embrace this essence. When the membrane potential surpasseth a sacred threshold, a spike cometh forth, traversing the network, and with it, affecting fellow neurons in its path.

Verily, the Spiking Neural Networks excel in apprehending the temporal rhythms of neural ensembles. Their prowess lies in capturing the delicate dance of spikes, seizing upon the mercurial ebb and flow of time. Thus, they accomplish tasks of discerning transient signals, processing events, and orchestrating synchrony.

a. Biological neuron; b. Artificial neuron; c. Biological synapse; d. Artificial neural network with connections

Yet, diverse are the strands that form the tapestry of SNNs. The core is the spiking neuron, fashioned with input currents, membrane potentials, and mechanisms for the generation of spikes. The training of these networks is achieved through sundry algorithms, be they unsupervised, supervised, or with the guiding hand of reinforcement, adjusting synaptic weights to manifest desired behaviors or glean patterns from the fertile fields of data.

  1. Neuron Model: Each neuron in an SNN is represented by a mathematical model that simulates the behavior of a biological neuron. The most commonly used model is the leaky integrate-and-fire (LIF) neuron model. It consists of a membrane potential that accumulates incoming signals and a threshold that determines when the neuron fires a spike (a runnable sketch of items 1–3 and 5 appears after this list).
  2. Membrane Potential Integration: The membrane potential of a neuron is updated over time by integrating the incoming signals from its input connections. These signals can be represented as spikes or continuous values. The membrane potential increases as positive input signals arrive and decays over time due to a leak term.
  3. Spike Generation: Once the membrane potential reaches or exceeds a certain threshold, the neuron generates a spike, or action potential. This spike is a discrete event that represents the neuron’s output signal. After firing, the membrane potential is reset to a resting potential, and a refractory period may be enforced during which the neuron is temporarily unresponsive to incoming signals.
  4. Spike Propagation: Spikes generated by neurons are transmitted to other neurons through weighted connections. The weight of a connection determines the influence of the spike on the receiving neuron’s membrane potential. Typically, the weight is updated based on learning rules, such as spike-timing-dependent plasticity (STDP), which strengthens or weakens connections based on the relative timing of pre- and postsynaptic spikes (see the STDP sketch after this list).
  5. Temporal Coding and Decoding: In SNNs, information is encoded in the timing and pattern of spikes. The precise timing of spikes carries important temporal information. Through learning and training, SNNs can learn to decode this temporal code and extract relevant features or perform specific tasks, such as pattern recognition or sequence processing.
  6. Training and Learning: Training SNNs involves adjusting the weights of connections to optimize network performance. Various learning rules and algorithms have been developed for SNNs, including supervised learning, unsupervised learning, and reinforcement learning. Training SNNs can be more challenging than training traditional neural networks due to the discrete nature of spikes and the temporal dynamics involved.
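
To make items 1–3 and 5 concrete, here is a minimal, self-contained simulation of a single LIF neuron driven by a Poisson-encoded input spike train. All parameter values (time constants, threshold, input rate, current amplitudes) are illustrative assumptions, not values from any particular reference:

```python
import numpy as np

# --- Illustrative parameters (assumed values, not from a reference) ---
dt = 1.0            # time step (ms)
steps = 200         # simulate 200 ms
tau_m = 20.0        # membrane time constant (ms)
v_rest = -70.0      # resting potential (mV)
v_thresh = -55.0    # firing threshold (mV)
v_reset = -75.0     # post-spike reset potential (mV)
r_m = 10.0          # membrane resistance (MOhm)
refrac = 5          # refractory period (time steps)

rng = np.random.default_rng(0)

# Item 5: encode a 300 Hz input rate as a Poisson spike train
p_spike = 300.0 * dt / 1000.0
in_spikes = rng.random(steps) < p_spike
i_syn = np.where(in_spikes, 8.0, 0.0)     # current pulse (nA) per input spike

v = np.full(steps, v_rest)
out_times, ref = [], 0
for t in range(1, steps):
    if ref > 0:                           # item 3: refractory period
        ref -= 1
        v[t] = v_reset
        continue
    # Item 2: leaky integration toward rest, plus input drive
    v[t] = v[t-1] + (-(v[t-1] - v_rest) + r_m * i_syn[t]) * dt / tau_m
    if v[t] >= v_thresh:                  # item 3: threshold crossing
        out_times.append(t * dt)
        v[t] = v_reset                    # reset after the spike
        ref = refrac

print(f"{len(out_times)} spikes, first few at {out_times[:5]} ms")
```

Raising the input rate or the per-spike current makes the neuron fire more often, while the refractory counter caps its maximum firing rate.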

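Item 4’s STDP rule can be sketched just as briefly. In the standard pair-based form, a presynaptic spike that precedes a postsynaptic spike strengthens the synapse (potentiation), while the reverse ordering weakens it (depression), with an exponential dependence on the timing gap. The learning rates and time constants below are assumed, illustrative values:

```python
import numpy as np

A_PLUS, A_MINUS = 0.01, 0.012      # potentiation / depression learning rates
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # STDP time constants (ms)

def stdp_dw(delta_t: float) -> float:
    """Weight change for one spike pair; delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:   # pre fired before post -> potentiate (LTP)
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)
    else:             # post fired before pre -> depress (LTD)
        return -A_MINUS * np.exp(delta_t / TAU_MINUS)

w = 0.5                           # initial synaptic weight
w += stdp_dw(+5.0)                # pre 5 ms before post: w increases
w += stdp_dw(-5.0)                # post 5 ms before pre: w decreases
w = float(np.clip(w, 0.0, 1.0))   # weights are typically kept bounded
print(round(w, 4))
```
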
While SNNs have their own advantages and are actively researched, they are currently considered to lag behind deep learning in several aspects. Here are some reasons for this lag:

  1. Training Complexity: Deep learning models, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), have well-established training algorithms, such as backpropagation, which are efficient and effective. On the other hand, training SNNs is more challenging due to the nature of spiking neurons and the need to capture the temporal dynamics of the network. Developing efficient training algorithms for SNNs is an active area of research.
  2. Limited Hardware Support: Deep learning has benefited greatly from specialized hardware, such as GPUs and TPUs, which provide significant acceleration for training and inference. SNNs have not received the same level of hardware optimization. Most existing hardware platforms are designed for traditional artificial neural networks, making it harder to leverage hardware acceleration tailored specifically to SNNs. Neuromorphic platforms such as SpiNNaker and Intel’s Loihi chip do exist, but a lot of work remains to be done.
  3. Lack of Standardization: Deep learning has benefited from the availability of mature frameworks and libraries (e.g., TensorFlow, PyTorch) that provide a rich ecosystem of tools, pre-trained models, and community support. In contrast, SNNs lack a standardized framework for developing, training, and deploying SNN-based models; research simulators exist but are fragmented (a short example with one of them appears after this list). This lack of standardization hinders the wider adoption and development of SNNs.
  4. Scalability: Deep learning models have demonstrated exceptional scalability, allowing them to handle large-scale datasets and complex tasks. However, SNNs face scalability challenges due to the increased complexity of modelling spiking neurons and the associated temporal dynamics. Scaling up SNNs to handle large-scale datasets and complex tasks while maintaining efficiency and performance is an ongoing research challenge.
  5. Limited Benchmarking and Applications: Deep learning has been extensively benchmarked on various tasks and datasets, leading to a deeper understanding of its capabilities and limitations. In contrast, SNNs have not been benchmarked as extensively across a wide range of tasks and datasets. Additionally, deep learning has seen widespread application across industries, powering innovations in computer vision, natural language processing, and speech recognition, among others. SNNs are still in the early stages of application development and have yet to demonstrate comparable performance in a broad range of practical applications.

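To ground point 3 above: research simulators such as Brian2, Nengo, and snnTorch do exist, but none enjoys a PyTorch-level ecosystem. As a taste of what such a simulator looks like, here is a minimal single-neuron example assuming Brian 2.x, in the style of its introductory tutorials (the dimensionless dynamics and the 0.8 threshold are illustrative choices):

```python
from brian2 import NeuronGroup, StateMonitor, ms, run, start_scope

start_scope()
tau = 10 * ms
# One neuron whose (dimensionless) potential decays toward 1
# and emits a spike whenever it crosses 0.8, resetting to 0.
G = NeuronGroup(1, 'dv/dt = (1 - v) / tau : 1',
                threshold='v > 0.8', reset='v = 0', method='exact')
M = StateMonitor(G, 'v', record=True)
run(50 * ms)
print(M.v[0][:5])  # first few membrane-potential samples
```
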
It’s worth noting that while SNNs currently lag behind deep learning in certain aspects, they have unique properties that make them promising for certain types of tasks. Spiking neural networks can better model the temporal dynamics of neural information processing and are being explored for applications such as event-based vision and neuromorphic computing. Ongoing research aims to address the current limitations of SNNs and unlock their full potential in the future.

The applications of SNNs span realms of knowledge. From the hallowed halls of neuroscience research to the industrious fields of robotics, sensor networks, and cognitive realms, they find their purpose. Their virtues resound in the realm of energy efficiency, sparse information processing, and mastery of temporal data. Yet, challenges do accompany this marvel, including complexity, training algorithms, and the boundless realm of scalability, which scholars, in their tireless pursuit, seek to surmount.

Thus, Spiking Neural Networks stand as a testament to our quest for understanding the intricate workings of nature’s own creation. Let us celebrate this ode to their grace, as we venture forth in unraveling the mysteries that lie beyond the veil of their existence.
