
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir. After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output. The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed. The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be used to reduce the effective computational cost.


History

The concept of reservoir computing stems from the use of recursive connections within neural networks to create a complex dynamical system (Schrauwen, Benjamin, David Verstraeten, and Jan Van Campenhout. "An overview of reservoir computing: theory, applications, and implementations." Proceedings of the European Symposium on Artificial Neural Networks ESANN 2007, pp. 471–482). It is a generalisation of earlier neural network architectures such as recurrent neural networks, liquid-state machines and echo-state networks. Reservoir computing also extends to physical systems that are not networks in the classical sense, but rather continuous systems in space and/or time: e.g. a literal "bucket of water" can serve as a reservoir that performs computations on inputs given as perturbations of the surface. The resultant complexity of such recurrent neural networks was found to be useful in solving a variety of problems including language processing and dynamic system modeling. However, training of recurrent neural networks is challenging and computationally expensive. Reservoir computing reduces those training-related challenges by fixing the dynamics of the reservoir and training only the linear output layer.

A large variety of nonlinear dynamical systems can serve as a reservoir that performs computations. In recent years, semiconductor lasers have attracted considerable interest, as computation can be fast and energy-efficient compared to electrical components. Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks. These hold promise in quantum information processing, which is challenging for classical networks, but can also find application in solving classical problems. In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid. However, these nuclear spin experiments did not demonstrate quantum reservoir computing per se, as they did not involve processing of sequential data. Rather, the data were vector inputs, which makes this more accurately a demonstration of a quantum implementation of a random kitchen sink algorithm (also known as extreme learning machines in some communities). In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices. In 2020, realization of reservoir computing on gate-based quantum computers was proposed and demonstrated on cloud-based IBM superconducting near-term quantum computers.

Reservoir computers have been used for time-series analysis. In particular, some of their applications involve chaotic time-series prediction, separation of chaotic signals, and link inference of networks from their dynamics.


Classical reservoir computing


Reservoir

The 'reservoir' in reservoir computing is the internal structure of the computer, and must have two properties: it must be made up of individual, non-linear units, and it must be capable of storing information. The non-linearity describes the response of each unit to input, which is what allows reservoir computers to solve complex problems. Reservoirs are able to store information by connecting the units in recurrent loops, where the previous input affects the next response. This dependence on past inputs allows the computers to be trained to complete specific tasks.

Reservoirs can be virtual or physical. Virtual reservoirs are typically randomly generated and are designed like neural networks. Virtual reservoirs can be designed to have non-linearity and recurrent loops, but, unlike neural networks, the connections between units are randomized and remain unchanged throughout computation. Physical reservoirs are possible because of the inherent non-linearity of certain natural systems. For example, the interaction between ripples on the surface of water contains the nonlinear dynamics required for reservoir creation, and a pattern-recognition reservoir computer was developed by first inputting ripples with electric motors and then recording and analyzing the ripples in the readout.
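The following Python sketch shows how a virtual reservoir of this kind can be built. It is illustrative only: the sizes, weight ranges, and the spectral-radius rescaling are arbitrary choices, not values prescribed by the text. The weights are drawn at random once and never trained; each unit responds non-linearly, and the recurrent term carries memory of past inputs.

```python
import numpy as np

# Minimal sketch of a virtual (echo-state-style) reservoir.
# The weights are random and stay fixed; only the readout (next section) is trained.
rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 200          # illustrative sizes

W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))    # fixed input weights
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))    # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))                       # keep the spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        # Non-linear unit response; the recurrent term W @ x is what gives
        # the reservoir memory of previous inputs.
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)              # shape: (timesteps, n_reservoir)
```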


Readout

The readout is a neural network layer that performs a linear transformation on the output of the reservoir. The weights of the readout layer are trained by analyzing the spatiotemporal patterns of the reservoir after excitation by known inputs, and by utilizing a training method such as linear regression or ridge regression. As its implementation depends on spatiotemporal reservoir patterns, the details of readout methods are tailored to each type of reservoir. For example, the readout for a reservoir computer using a container of liquid as its reservoir might entail observing spatiotemporal patterns on the surface of the liquid.
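As a hedged illustration of such a readout, the sketch below continues the virtual-reservoir example above. The closed-form ridge solution and the toy sine-wave prediction task are illustrative assumptions, not details taken from the text.

```python
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Fit readout weights by closed-form ridge regression.

    states:  (timesteps, n_reservoir) matrix of reservoir states
    targets: (timesteps, n_outputs) desired outputs
    ridge:   regularization strength (set to 0 for plain linear regression)
    """
    X, Y = states, targets
    # Solve (X^T X + ridge * I) W_out = X^T Y without forming an explicit inverse.
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

def readout(states, W_out):
    """The readout itself is just a linear map applied to each reservoir state."""
    return states @ W_out

# Toy usage: one-step-ahead prediction of a sine wave with the reservoir above.
u = np.sin(0.1 * np.arange(1000))
states = run_reservoir(u[:-1])                 # from the previous sketch
W_out = train_readout(states, u[1:, None])
prediction = readout(states, W_out)            # shape: (999, 1)
```

Only `W_out` is learned here; the reservoir weights from the previous sketch are never touched, which is the defining property of the framework.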


Types


Context reverberation network

An early example of reservoir computing was the context reverberation network. In this architecture, an input layer feeds into a high-dimensional dynamical system which is read out by a trainable single-layer perceptron. Two kinds of dynamical system were described: a recurrent neural network with fixed random weights, and a continuous reaction–diffusion system inspired by Alan Turing's model of morphogenesis. At the trainable layer, the perceptron associates current inputs with the signals that reverberate in the dynamical system; the latter were said to provide a dynamic "context" for the inputs. In the language of later work, the reaction–diffusion system served as the reservoir.


Echo state network

An echo state network (ESN) uses a sparsely and randomly connected recurrent layer of non-linear units whose connection weights are fixed; only the linear readout is trained, typically by linear or ridge regression. The Tree Echo State Network (TreeESN) model represents a generalization of the reservoir computing framework to tree-structured data.
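The sketch below is a rough, hedged illustration of the TreeESN idea; the sizes, the random weights, and the simple sum over children are assumptions made for illustration rather than details drawn from the text. A node's reservoir state is computed bottom-up from its label and the states of its children, reusing the same fixed random weights at every node.

```python
import numpy as np

# Hedged sketch of a TreeESN-style state transition (illustrative sizes and weights).
rng = np.random.default_rng(1)
n_label, n_reservoir = 4, 100
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_label))
W_hat = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W_hat *= 0.9 / max(abs(np.linalg.eigvals(W_hat)))

def tree_state(label, children_states):
    """Reservoir state of a tree node, computed from its label and its children's states."""
    child_sum = sum(children_states, np.zeros(n_reservoir))
    return np.tanh(W_in @ label + W_hat @ child_sum)

# Example: a two-leaf tree; the root state (or an average over node states)
# can then be fed to a trained linear readout, as in the sequential case.
leaf_a = tree_state(rng.random(n_label), [])
leaf_b = tree_state(rng.random(n_label), [])
root = tree_state(rng.random(n_label), [leaf_a, leaf_b])
```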


Liquid-state machine

The liquid (i.e. reservoir) of a Chaotic Liquid State Machine (CLSM), or chaotic reservoir, is made from chaotic spiking neurons which, however, stabilize their activity by settling to a single hypothesis that describes the trained inputs of the machine. This is in contrast to general types of reservoirs that don't stabilize. The liquid stabilization occurs via synaptic plasticity and chaos control that govern neural connections inside the liquid. CLSMs have shown promising results in learning sensitive time-series data.


Nonlinear transient computation

This type of information processing is most relevant when time-dependent input signals depart from the mechanism's internal dynamics. These departures cause transients, or temporary alterations, which are represented in the device's output.


Deep reservoir computing

The extension of the reservoir computing framework towards deep learning, with the introduction of deep reservoir computing and of the Deep Echo State Network (DeepESN) model, makes it possible to develop efficiently trained models for hierarchical processing of temporal data, while also enabling the investigation of the inherent role of layered composition in recurrent neural networks.
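A rough sketch of the stacked-reservoir idea is given below. The layer sizes, the scalings, and the choice to drive each layer with the previous layer's state are illustrative assumptions, not the exact DeepESN formulation; the point is only that each layer is a fixed random reservoir fed by the one below, with the readout able to see the concatenation of all layers' states.

```python
import numpy as np

# Illustrative sketch of a deep (stacked) reservoir.
rng = np.random.default_rng(2)

def make_layer(n_in, n_res, rho=0.9):
    """Create one fixed random reservoir layer (input weights, recurrent weights)."""
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_deep_reservoir(inputs, layers):
    """Drive a stack of reservoirs; return the concatenated states per timestep."""
    xs = [np.zeros(W.shape[0]) for _, W in layers]
    collected = []
    for u in inputs:
        drive = np.atleast_1d(u)
        for i, (W_in, W) in enumerate(layers):
            xs[i] = np.tanh(W_in @ drive + W @ xs[i])
            drive = xs[i]                      # the next layer is driven by this one
        collected.append(np.concatenate(xs))
    return np.array(collected)

# Three layers: the first sees the external input, the others see reservoir states.
layers = [make_layer(1, 100), make_layer(100, 100), make_layer(100, 100)]
deep_states = run_deep_reservoir(np.sin(0.1 * np.arange(500)), layers)   # (500, 300)
```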


Quantum reservoir computing

Quantum reservoir computing may use the nonlinear nature of quantum mechanical interactions or processes to form the characteristic nonlinear reservoirs but may also be done with linear reservoirs when the injection of the input to the reservoir creates the nonlinearity. The marriage of machine learning and quantum devices is leading to the emergence of quantum neuromorphic computing as a new research area.


Types


Gaussian states of interacting quantum harmonic oscillators

Gaussian states are a paradigmatic class of states of continuous-variable quantum systems. Although they can nowadays be created and manipulated in, e.g., state-of-the-art optical platforms, and are naturally robust to decoherence, it is well known that they are not sufficient for, e.g., universal quantum computing, because transformations that preserve the Gaussian nature of a state are linear. Normally, linear dynamics would not be sufficient for nontrivial reservoir computing either. It is nevertheless possible to harness such dynamics for reservoir computing purposes by considering a network of interacting quantum harmonic oscillators and injecting the input by periodic state resets of a subset of the oscillators. With a suitable choice of how the states of this subset of oscillators depend on the input, the observables of the rest of the oscillators can become nonlinear functions of the input suitable for reservoir computing; indeed, thanks to the properties of these functions, even universal reservoir computing becomes possible by combining the observables with a polynomial readout function. In principle, such reservoir computers could be implemented with controlled multimode optical parametric processes; however, efficient extraction of the output from the system is challenging, especially in the quantum regime where measurement back-action must be taken into account.


2-D quantum dot lattices

In this architecture, randomized coupling between lattice sites grants the reservoir the "black box" property inherent to reservoir processors. The reservoir is then excited by an incident optical field, which acts as the input. Readout occurs in the form of occupation numbers of lattice sites, which are naturally nonlinear functions of the input.


Nuclear spins in a molecular solid

In this architecture, quantum mechanical coupling between spins of neighboring atoms within the molecular solid provides the non-linearity required to create the higher-dimensional computational space. The reservoir is then excited by radiofrequency electromagnetic radiation tuned to the resonance frequencies of relevant nuclear spins. Readout occurs by measuring the nuclear spin states.


Reservoir computing on gate-based near-term superconducting quantum computers

The most prevalent model of quantum computing is the gate-based model, where quantum computation is performed by sequential applications of unitary quantum gates on the qubits of a quantum computer. A theory for the implementation of reservoir computing on a gate-based quantum computer, with proof-of-principle demonstrations on a number of IBM superconducting noisy intermediate-scale quantum (NISQ) computers (John Preskill. "Quantum Computing in the NISQ era and beyond." Quantum 2, 79 (2018)), has been reported.


See also

* Deep learning
* Extreme learning machines
* Unconventional computing



