A Hopfield network (or associative memory) is a form of recurrent neural network, or a spin glass system, that can serve as a content-addressable memory. The Hopfield network, named for John Hopfield, consists of a single layer of neurons, where each neuron is connected to every other neuron except itself. These connections are bidirectional and symmetric, meaning the weight of the connection from neuron ''i'' to neuron ''j'' is the same as the weight from neuron ''j'' to neuron ''i''. Patterns are associatively recalled by fixing certain inputs and dynamically evolving the network to minimize an energy function, towards local energy minimum states that correspond to stored patterns. Patterns are associatively learned (or "stored") by a Hebbian learning algorithm.
One of the key features of Hopfield networks is their ability to recover complete patterns from partial or noisy inputs, making them robust in the face of incomplete or corrupted data. Their connection to statistical mechanics, recurrent networks, and human cognitive psychology has led to their application in various fields, including physics, psychology, neuroscience, and machine learning theory and practice.
History
One origin of associative memory is human cognitive psychology, specifically the study of associative memory.
Frank Rosenblatt studied "close-loop cross-coupled perceptrons", which are 3-layered perceptron networks whose middle layer contains recurrent connections that change by a Hebbian learning rule.
Another model of associative memory is one in which the output does not loop back to the input. W. K. Taylor proposed such a model trained by Hebbian learning in 1956. Karl Steinbuch, who wanted to understand learning and was inspired by watching his children learn, published the Lernmatrix in 1961; it was translated to English in 1963. Similar research was done with the ''correlogram'' of D. J. Willshaw et al. in 1969.
Teuvo Kohonen trained an associative memory by gradient descent in 1974.

Another origin of associative memory was statistical mechanics. The Ising model was published in the 1920s as a model of magnetism; however, it studied only the thermal equilibrium state, which does not change with time.
In 1963, Roy J. Glauber studied the Ising model evolving in time, as a process towards thermal equilibrium (Glauber dynamics), thereby adding the component of time.
The second component to be added was adaptation to stimulus. Kaoru Nakano in 1971 and Shun'ichi Amari in 1972 independently proposed to modify the weights of an Ising model by a Hebbian learning rule as a model of associative memory. The same idea was published by William A. Little in 1974, who was acknowledged by Hopfield in his 1982 paper.
See Carpenter (1989) and Cowan (1990) for a technical description of some of these early works in associative memory.
The Sherrington–Kirkpatrick model of spin glass, published in 1975, is the Hopfield network with random initialization. Sherrington and Kirkpatrick found that it is highly likely for the energy function of the SK model to have many local minima. In his 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions.
In a 1984 paper he extended this to continuous activation functions.
It became a standard model for the study of neural networks through statistical mechanics.
A major advance in memory storage capacity was developed by Dmitry Krotov and Hopfield in 2016 through a change in network dynamics and energy function. This idea was further extended by Demircigil and collaborators in 2017. The continuous dynamics of large memory capacity models was developed in a series of papers between 2016 and 2020. Large memory storage capacity Hopfield networks are now called dense associative memories or modern Hopfield networks.
In 2024, John J. Hopfield and Geoffrey E. Hinton were awarded the Nobel Prize in Physics for their foundational contributions to machine learning, such as the Hopfield network.
Structure
The units in Hopfield nets are binary threshold units, i.e. the units only take on two different values for their states, and the value is determined by whether or not the unit's input exceeds its threshold U_i. Discrete Hopfield nets describe relationships between binary (firing or not-firing) neurons 1, 2, \ldots, i, j, \ldots, N. At a certain time, the state of the neural net is described by a vector V, which records which neurons are firing in a binary word of N bits.
The interactions w_{ij} between neurons have units that usually take on values of 1 or −1, and this convention will be used throughout this article. However, other literature might use units that take values of 0 and 1. These interactions are "learned" via Hebb's law of association, such that, for a certain state V^s and distinct nodes i, j,

w_{ij} = V_i^s V_j^s,

but w_{ii} = 0.

(Note that the Hebbian learning rule takes the form w_{ij} = (2V_i^s - 1)(2V_j^s - 1) when the units assume values in \{0, 1\}.)
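As a concrete illustration, the following is a minimal Python sketch of this Hebbian storage rule under the ±1 convention, summed over several stored patterns (the usual multi-pattern generalization); the helper name `store_patterns` is illustrative rather than taken from any particular library:

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian storage: w_ij = sum over stored states s of V_i^s V_j^s,
    with no self-connections (w_ii = 0). `patterns` is a (P, N) array
    of +/-1 vectors; the result is a symmetric (N, N) weight matrix."""
    _, n = patterns.shape
    w = np.zeros((n, n))
    for v in patterns:
        w += np.outer(v, v)   # outer product implements V_i^s V_j^s
    np.fill_diagonal(w, 0)    # enforce w_ii = 0
    return w
```

Because each outer product is symmetric, the resulting weight matrix automatically satisfies the symmetry restriction listed below.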
Once the network is trained, the weights w_{ij} no longer evolve. If a new state of neurons V^{s'} is introduced to the neural network, the net acts on neurons such that
* V_i^{s'} \to 1 if \sum_j w_{ij} V_j^{s'} \ge U_i
* V_i^{s'} \to -1 if \sum_j w_{ij} V_j^{s'} < U_i
where U_i is the threshold value of the i-th neuron (often taken to be 0). In this way, Hopfield networks have the ability to "remember" states stored in the interaction matrix, because if a new state V^{s'} is subjected to the interaction matrix, each neuron will change until it matches the original state V^s (see the Updating section below).
The connections in a Hopfield net typically have the following restrictions:
* w_{ii} = 0, \forall i (no unit has a connection with itself)
* w_{ij} = w_{ji}, \forall i, j (connections are symmetric)
The constraint that weights are symmetric guarantees that the energy function decreases monotonically while following the activation rules.
A network with asymmetric weights may exhibit some periodic or chaotic behaviour; however, Hopfield found that this behavior is confined to relatively small parts of the phase space and does not impair the network's ability to act as a content-addressable associative memory system.
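The energy function mentioned here can be made concrete. The sketch below assumes the textbook quadratic form E = -\tfrac{1}{2}\sum_{i,j} w_{ij} s_i s_j + \sum_i U_i s_i; this exact form is not spelled out in this section, so it is stated here as the standard convention rather than a quotation from the source:

```python
import numpy as np

def energy(w, s, theta=None):
    """Textbook Hopfield energy: E = -0.5 * sum_ij w_ij s_i s_j + sum_i theta_i s_i.
    With symmetric w and zero diagonal, each single-unit update under the
    threshold rule can only lower (or preserve) this value, so the dynamics
    settle into a local minimum."""
    theta = np.zeros(len(s)) if theta is None else theta
    return -0.5 * s @ w @ s + theta @ s
```

Evaluating this energy before and after each single-unit update is a simple way to observe the monotone decrease guaranteed by symmetric weights.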
Hopfield also modeled neural nets for continuous values, in which the electric output of each neuron is not binary but some value between 0 and 1.
He found that this type of network was also able to store and reproduce memorized states.
Notice that every pair of units ''i'' and ''j'' in a Hopfield network has a connection that is described by the connectivity weight w_{ij}. In this sense, the Hopfield network can be formally described as a complete undirected graph G = \langle V, f \rangle, where V is a set of McCulloch–Pitts neurons and f : V^2 \to \mathbb{R} is a function that links pairs of units to a real value, the connectivity weight.
Updating
Updating one unit (node in the graph simulating the artificial neuron) in the Hopfield network is performed using the following rule:

s_i \leftarrow \begin{cases} +1 & \text{if } \sum_j w_{ij} s_j \ge U_i, \\ -1 & \text{otherwise,} \end{cases}

where s_j is the state of unit j and U_i is the threshold of unit i.

In the continuous, hierarchical (layered) generalization of the network, where layer A has states x^A, time constant \tau_A, and Lagrangian L_A whose gradients give the activation functions, the following holds. If the Lagrangian functions, or equivalently the activation functions, are chosen in such a way that the Hessians for each layer are positive semi-definite and the overall energy is bounded from below, this system is guaranteed to converge to a fixed point attractor state. The temporal derivative of this energy function is given by

\frac{dE}{dt} = -\sum_A \tau_A \sum_{i,j=1}^{N_A} \frac{\partial x_i^A}{\partial t} \, \frac{\partial^2 L_A}{\partial x_i^A \, \partial x_j^A} \, \frac{\partial x_j^A}{\partial t} \le 0.

Thus, the hierarchical layered network is indeed an attractor network with a global energy function. This network is described by a hierarchical set of synaptic weights that can be learned for each specific problem.
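A minimal Python sketch of the discrete asynchronous rule above, using the ±1 convention and a zero threshold by default; the function name `recall` and the random update order are illustrative choices, not from the source:

```python
import numpy as np

def recall(w, s, theta=None, max_sweeps=100, rng=None):
    """Asynchronously update units until no unit changes:
    s_i <- +1 if sum_j w_ij s_j >= theta_i, else -1."""
    rng = np.random.default_rng() if rng is None else rng
    s = s.copy()
    theta = np.zeros(len(s)) if theta is None else theta
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):   # random asynchronous order
            new = 1 if w[i] @ s >= theta[i] else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                      # fixed point reached
            break
    return s
```

Starting from a corrupted copy of a stored pattern (for example, one built with the `store_patterns` sketch earlier and then with a few bits flipped), repeated sweeps of this rule typically drive the state back to the nearest stored pattern, which is the content-addressable behaviour described above.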
See also
* Associative memory (disambiguation)
* Autoassociative memory
* Boltzmann machine – like a Hopfield net but uses annealed Gibbs sampling instead of gradient descent
* Dynamical systems model of cognition
* Ising model
* Hebbian theory
External links
* Hopfield Network Javascript – Hopfield Neural Network JAVA Applet
{{Stochastic processes}}
[[Category:Neural network architectures]]