Spiking Neural Network
Spiking neural networks (SNNs) are artificial neural networks (ANNs) that mimic natural neural networks. These models leverage the timing of discrete spikes as the main information carrier. In addition to neuronal and synaptic state, SNNs incorporate the concept of time into their operating model. The idea is that neurons in the SNN do not transmit information at each propagation cycle (as happens with typical multi-layer perceptron networks), but rather transmit information only when a membrane potential, an intrinsic quality of the neuron related to its membrane electrical charge, reaches a specific value, called the threshold. When the membrane potential reaches the threshold, the neuron fires, generating a signal that travels to other neurons, which in turn increase or decrease their potentials in response to this signal. A neuron model that fires at the moment of threshold crossing is also called a spiking neuron model.

While spike rates can be considered the analogue of the variable output of a traditional ANN, neurobiology research indicates that high-speed processing cannot be performed solely through a rate-based scheme. For example, humans can perform an image-recognition task requiring no more than 10 ms of processing time per neuron through the successive layers (going from the retina to the temporal lobe). This time window is too short for rate-based encoding. The precise spike timings in a small set of spiking neurons also have a higher information-coding capacity than a rate-based approach.

The most prominent spiking neuron model is the leaky integrate-and-fire model. In that model, the momentary activation level (modeled as a differential equation) is normally considered to be the neuron's state, with incoming spikes pushing this value higher or lower until the state eventually either decays or, if the firing threshold is reached, the neuron fires. After firing, the state variable is reset to a lower value. Various decoding methods exist for interpreting the outgoing spike train as a real-valued number, relying on either the frequency of spikes (rate code), the time-to-first-spike after stimulation, or the interval between spikes.
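The leaky integrate-and-fire dynamics and the rate-decoding idea described above can be sketched in a few lines of Python. All parameter values here (time constant, threshold, reset value, input level) are illustrative assumptions, not values from the text.

```python
# Minimal leaky integrate-and-fire (LIF) neuron; parameter values are
# illustrative assumptions.
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate one LIF neuron; return the time steps at which it spiked."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential decays toward rest while integrating input.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_threshold:      # threshold crossing -> fire
            spikes.append(t)
            v = v_reset           # reset after the spike
    return spikes

# Constant drive makes the neuron fire periodically; counting spikes is a
# rate decoding, while the spike times themselves form a temporal code.
spike_train = simulate_lif([0.08] * 100)
rate = len(spike_train) / 100.0
```

With a constant input the neuron charges toward its steady-state potential and fires at regular intervals, which is the simplest illustration of why spike frequency and inter-spike interval both carry decodable information.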


History

Many multi-layer artificial neural networks are fully connected, receiving input from every neuron in the previous layer and signalling every neuron in the subsequent layer. Although these networks have achieved breakthroughs, they do not match biological networks and do not mimic neurons. The biology-inspired Hodgkin–Huxley model of a spiking neuron was proposed in 1952. This model describes how action potentials are initiated and propagated. Communication between neurons, which requires the exchange of chemical neurotransmitters in the synaptic gap, is described in models such as the integrate-and-fire model, the FitzHugh–Nagumo model (1961–1962), and the Hindmarsh–Rose model (1984). The leaky integrate-and-fire model (or a derivative) is commonly used because it is easier to compute than Hodgkin–Huxley. While the notion of an artificial spiking neural network became popular only in the twenty-first century, studies between 1980 and 1995 supported the concept, and the spiking neural network as a mathematical model was first worked on in the early 1970s. The first models of this type appeared as attempts to simulate non-algorithmic intelligent information-processing systems. As of 2019, SNNs lagged behind ANNs in accuracy, but the gap is decreasing and has vanished on some tasks.


Underpinnings

Information in the brain is represented as action potentials (neuron spikes), which may group into spike trains or coordinated waves. A fundamental question of neuroscience is to determine whether neurons communicate by a rate or temporal code. Temporal coding implies that a single spiking neuron can replace hundreds of hidden units on a conventional neural net.

SNNs define a neuron's current state as its potential (possibly modeled as a differential equation). An input pulse causes the potential to rise and then gradually decline. Encoding schemes can interpret these pulse sequences as a number, considering pulse frequency and pulse interval. Using the precise time of pulse occurrence, a neural network can exploit more information and offer better computing properties. SNNs compute in the continuous domain: neurons test for activation only when their potentials reach a certain value. When a neuron is activated, it produces a signal that is passed to connected neurons, raising or lowering their potentials accordingly. The SNN approach produces a continuous output instead of the binary output of traditional ANNs.

Pulse trains are not easily interpretable, hence the need for encoding schemes. However, a pulse-train representation may be better suited for processing spatiotemporal data, such as real-world sensory data classification. SNNs can connect neurons only to nearby neurons so that they process input blocks separately (similar to a CNN using filters). They consider time by encoding information as pulse trains, so as not to lose information; this avoids the complexity of a recurrent neural network (RNN). Impulse neurons are more powerful computational units than traditional artificial neurons.

SNNs are theoretically more powerful than so-called "second-generation networks", defined as ANNs "based on computational units that apply activation function with a continuous set of possible output values to a weighted sum (or polynomial) of the inputs"; however, SNN training issues and hardware requirements limit their use. Although unsupervised biologically inspired learning methods such as Hebbian learning and spike-timing-dependent plasticity (STDP) are available, no effective supervised training method for SNNs provides better performance than second-generation networks. Spike-based activation is not differentiable, so gradient-descent-based backpropagation (BP) is not directly available. SNNs also have much larger computational costs for simulating realistic neural models than traditional ANNs. Pulse-coupled neural networks (PCNN) are often confused with SNNs; a PCNN can be seen as a kind of SNN.

Researchers are actively working on two main issues. The first is differentiability: the expressions for both the forward- and backward-learning methods contain the derivative of the neural activation function, which does not exist because a neuron's output is 1 when it spikes and 0 otherwise. This all-or-nothing behavior disrupts gradients and makes spiking neurons unsuitable for gradient-based optimization. Approaches to resolving it include:
* resorting to entirely biologically inspired local learning rules for the hidden units
* translating conventionally trained "rate-based" NNs to SNNs
* smoothing the network model so that it is continuously differentiable
* defining a surrogate gradient (SG) as a continuous relaxation of the real gradients

The second issue is the optimization algorithm. Standard BP can be expensive in terms of computation, memory, and communication, and may be poorly suited to the hardware that implements it (e.g., a computer, brain, or neuromorphic device).

Incorporating additional neuron dynamics such as spike-frequency adaptation (SFA) is a notable advance, enhancing efficiency and computational power. These neurons sit between biological complexity and computational complexity. Originating from biological insights, SFA offers significant computational benefits by reducing power usage, especially in cases of repetitive or intense stimuli. This adaptation improves signal-to-noise clarity and introduces an elementary short-term memory at the neuron level, which in turn improves accuracy and efficiency. SFA was mostly achieved using compartmental neuron models; simpler versions, neuron models with adaptive thresholds, are an indirect way of achieving it. SFA equips SNNs with improved learning capabilities, even with constrained synaptic plasticity, and elevates computational efficiency. It lessens the demand on network layers by decreasing the need for spike processing, thus lowering computational load and memory-access time, essential aspects of neural computation. Moreover, SNNs using SFA-capable neurons achieve accuracy rivaling conventional ANNs while requiring fewer neurons for comparable tasks, conserving space and energy. High-performance deep spiking neural networks can operate with as few as 0.3 spikes per neuron.
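The surrogate-gradient approach mentioned above can be illustrated without any framework: the forward pass keeps the hard, non-differentiable spike (a Heaviside step on the membrane potential), while the backward pass substitutes the derivative of a steep sigmoid centered on the threshold. The function names and the steepness parameter `beta` are illustrative assumptions.

```python
import math

def spike_forward(v, threshold=1.0):
    """Hard spike: emit 1 if the potential crosses threshold, else 0.
    This step function has zero derivative almost everywhere."""
    return 1.0 if v >= threshold else 0.0

def spike_surrogate_grad(v, threshold=1.0, beta=5.0):
    """Surrogate gradient: derivative of a steep sigmoid around the
    threshold, used in place of the true (uninformative) gradient."""
    s = 1.0 / (1.0 + math.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

# The backward pass uses the smooth surrogate even though the forward
# pass emitted a hard 0/1 spike, so a learning signal can flow through
# sub-threshold neurons that are close to firing.
v = 0.9
out = spike_forward(v)          # below threshold: no spike
grad = spike_surrogate_grad(v)  # non-zero near the threshold
```

Far from the threshold the surrogate gradient vanishes, so learning signal concentrates on neurons whose potentials are close to firing, which is the continuous relaxation the list above refers to.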


Applications

SNNs can in principle be applied to the same applications as traditional ANNs. In addition, SNNs can model the central nervous system of biological organisms, such as an insect seeking food without prior knowledge of the environment. Due to their relative realism, they can be used to study biological neural circuits: starting with a hypothesis about the topology of a biological neuronal circuit and its function, recordings of this circuit can be compared to the output of a corresponding SNN to evaluate the plausibility of the hypothesis. However, SNNs lack effective training mechanisms, which can complicate some applications, including computer vision.

When using SNNs for image-based data, the images need to be converted into binary spike trains. Types of encoding include:
* Temporal coding: generating one spike per neuron, in which spike latency is inversely proportional to the pixel intensity.
* Rate coding: converting pixel intensity into a spike train, where the number of spikes is proportional to the pixel intensity.
* Direct coding: using a trainable layer to generate a floating-point value for each time step; the layer converts each pixel at a given time step into a floating-point value, and a threshold on that value picks either zero or one.
* Phase coding: encoding temporal information into spike patterns based on a global oscillator.
* Burst coding: transmitting spikes in bursts, increasing communication reliability.
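The rate- and temporal-coding schemes above can be sketched for a single pixel. The deterministic, evenly spaced rate code and the linear latency formula are simplifying assumptions; practical implementations often use stochastic (e.g., Poisson) spike generation instead.

```python
def rate_encode(intensity, n_steps=10):
    """Rate coding: the number of spikes over the window is proportional
    to pixel intensity (intensity assumed in [0, 1])."""
    n_spikes = round(intensity * n_steps)
    # Spread the spikes evenly across the window (deterministic variant).
    return [1 if (t * n_spikes) // n_steps < ((t + 1) * n_spikes) // n_steps
            else 0
            for t in range(n_steps)]

def latency_encode(intensity, n_steps=10):
    """Temporal coding: a single spike whose latency is inversely related
    to intensity, so brighter pixels fire earlier."""
    train = [0] * n_steps
    if intensity > 0:
        t = min(n_steps - 1, int((1.0 - intensity) * (n_steps - 1)))
        train[t] = 1
    return train

bright = rate_encode(0.9)    # many spikes in the window
dim = rate_encode(0.2)       # few spikes in the window
early = latency_encode(0.9)  # one spike, near the start of the window
```

The rate code trades spike count for robustness, while the latency code conveys the same intensity with a single well-timed spike, which is the efficiency argument made for temporal coding earlier in the article.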


Software

A diverse range of application software can simulate SNNs. This software can be classified according to its uses:


SNN simulation

These simulate complex neural models. Large networks usually require lengthy processing. Candidates include:
* Brian – developed by Romain Brette and Dan Goodman at the École Normale Supérieure;
* GENESIS (the GEneral NEural SImulation System) – developed in James Bower's laboratory at Caltech;
* NEST – developed by the NEST Initiative;
* NEURON – mainly developed by Michael Hines, John W. Moore and Ted Carnevale at Yale University and Duke University;
* RAVSim (Runtime Tool) – mainly developed by Sanaullah at Bielefeld University of Applied Sciences and Arts.


Hardware

Sutton and Barto proposed that future neuromorphic architectures will comprise billions of nanosynapses, which require a clear understanding of the accompanying physical mechanisms. Experimental systems based on ferroelectric tunnel junctions have been used to show that STDP can be harnessed from heterogeneous polarization switching. Through combined scanning-probe imaging, electrical transport, and atomic-scale molecular dynamics, conductance variations can be modelled by nucleation-dominated domain reversal. Simulations showed that arrays of ferroelectric nanosynapses can autonomously learn to recognize patterns in a predictable way, opening the path towards unsupervised learning.


Benchmarks

Classification capabilities of spiking networks trained with unsupervised learning methods have been tested on benchmark datasets such as Iris, Wisconsin Breast Cancer, and the Statlog Landsat dataset. Various approaches to information encoding and network design have been used, such as a 2-layer feedforward network for data clustering and classification. Based on Hopfield (1995), the authors implemented models of local receptive fields combining the properties of radial basis functions and spiking neurons to convert input signals with a floating-point representation into a spiking representation.
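The receptive-field conversion described above can be sketched as population coding with Gaussian radial basis functions: each neuron covers part of the input range, and a stronger RBF response yields an earlier spike. The center spacing, RBF width, and linear latency mapping here are illustrative assumptions.

```python
import math

def rbf_population_encode(x, n_neurons=5, x_min=0.0, x_max=1.0, t_max=10.0):
    """Convert a floating-point value into per-neuron spike latencies.
    Each neuron has a Gaussian receptive field over [x_min, x_max];
    a response near 1 maps to a latency near 0 (it fires early)."""
    centers = [x_min + i * (x_max - x_min) / (n_neurons - 1)
               for i in range(n_neurons)]
    width = (x_max - x_min) / (n_neurons - 1)  # assumed RBF width
    latencies = []
    for c in centers:
        response = math.exp(-((x - c) ** 2) / (2 * width ** 2))
        latencies.append((1.0 - response) * t_max)
    return latencies

lat = rbf_population_encode(0.5)
# The neuron whose receptive field is centered on the input responds
# maximally and therefore fires first.
earliest = min(range(len(lat)), key=lambda i: lat[i])
```

The analog value is thus represented by which neurons fire and in what order, which is what allows a floating-point input to enter a purely spike-based network.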


See also

* CoDi
* Cognitive architecture
* Cognitive map
* Cognitive computer
* Computational neuroscience
* Neural coding
* Neural correlate
* Neural decoding
* Neuroethology
* Neuroinformatics
* Models of neural computation
* Motion perception
* Systems neuroscience

