A Hopfield network (or Ising model of a neural network or Ising–Lenz–Little model) is a form of recurrent artificial neural network and a type of spin glass system popularised by John Hopfield in 1982, as described earlier by Little in 1974, based on Ernst Ising's work with Wilhelm Lenz on the Ising model.
Hopfield networks serve as content-addressable ("associative") memory systems with binary threshold nodes, or with continuous variables. Hopfield networks also provide a model for understanding human memory.
Origins
The Ising model of a neural network as a memory model was first proposed by William A. Little in 1974, which was acknowledged by Hopfield in his 1982 paper. Networks with continuous dynamics were developed by Hopfield in his 1984 paper. A major advance in memory storage capacity was developed by Krotov and Hopfield in 2016 through a change in network dynamics and energy function. This idea was further extended by Demircigil and collaborators in 2017. The continuous dynamics of large-memory-capacity models was developed in a series of papers between 2016 and 2020. Hopfield networks with large memory storage capacity are now called dense associative memories or modern Hopfield network
Structure
The units in Hopfield nets are binary threshold units, i.e. the units only take on two different values for their states, and the value is determined by whether or not the unit's input exceeds its threshold <math>U_i</math>. Discrete Hopfield nets describe relationships between binary (firing or not-firing) neurons <math>V_1, V_2, \ldots, V_N</math>. At a certain time, the state of the neural net is described by a vector <math>V</math>, which records which neurons are firing in a binary word of <math>N</math> bits.

The interactions <math>w_{ij}</math> between neurons have units that usually take on values of 1 or −1, and this convention will be used throughout this article. However, other literature might use units that take values of 0 and 1. These interactions are "learned" via Hebb's law of association, such that, for a certain state <math>V^s</math> and distinct nodes <math>i, j</math>,

:<math>w_{ij} = V_i^s V_j^s</math>

but <math>w_{ii} = 0</math>.

(Note that the Hebbian learning rule takes the form <math>w_{ij} = (2V_i^s - 1)(2V_j^s - 1)</math> when the units assume values in <math>\{0, 1\}</math>.)
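As a minimal sketch (not part of the original article), the Hebbian rule above can be applied to a set of stored ±1 patterns by summing the outer products over stored states; the 1/n normalization used here is a common convention and does not affect the threshold dynamics:

```python
import numpy as np

def hebbian_weights(patterns):
    """Build a Hopfield weight matrix from +/-1 patterns via the Hebbian
    rule: w_ij = (1/n) * sum over stored states of e_i * e_j, with w_ii = 0."""
    patterns = np.asarray(patterns, dtype=float)
    n_patterns, n_units = patterns.shape
    w = patterns.T @ patterns / n_patterns   # sum of outer products, normalized
    np.fill_diagonal(w, 0.0)                 # no self-connections
    return w

w = hebbian_weights([[1, -1, 1, -1],
                     [1, 1, -1, -1]])
```

By construction the matrix is symmetric with a zero diagonal, which is exactly the restriction on connections discussed below.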
Once the network is trained, the <math>w_{ij}</math> no longer evolve. If a new state of neurons <math>V^{s'}</math> is introduced to the neural network, the net acts on neurons such that

* <math>V_i^{s'} \rightarrow 1</math> if <math>\sum_j w_{ij} V_j^{s'} \ge U_i</math>
* <math>V_i^{s'} \rightarrow -1</math> if <math>\sum_j w_{ij} V_j^{s'} < U_i</math>

where <math>U_i</math> is the threshold value of the ''i''th neuron (often taken to be 0). In this way, Hopfield networks have the ability to "remember" states stored in the interaction matrix, because if a new state <math>V^{s'}</math> is subjected to the interaction matrix, each neuron will change until it matches the original state <math>V^s</math> (see the Updates section below).
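A minimal recall loop can sketch this behaviour (illustrative code, not from the original article; thresholds are taken to be 0 and the weight matrix is assumed already learned):

```python
import numpy as np

def recall(w, state, threshold=0.0, max_steps=100):
    """Asynchronously update each +/-1 neuron until the state stops changing."""
    state = np.array(state, dtype=float)
    for _ in range(max_steps):
        changed = False
        for i in range(len(state)):
            new_value = 1.0 if w[i] @ state >= threshold else -1.0
            if new_value != state[i]:
                state[i] = new_value
                changed = True
        if not changed:      # fixed point reached
            break
    return state

# Store one pattern in the interaction matrix, then probe with a corrupted copy.
pattern = np.array([1, -1, 1, -1], dtype=float)
w = np.outer(pattern, pattern)
np.fill_diagonal(w, 0)
noisy = np.array([1, 1, 1, -1], dtype=float)   # one flipped bit
result = recall(w, noisy)                       # converges back to the stored pattern
```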
The connections in a Hopfield net typically have the following restrictions:
* <math>w_{ii} = 0, \forall i</math> (no unit has a connection with itself)
* <math>w_{ij} = w_{ji}, \forall i, j</math> (connections are symmetric)
The constraint that weights are symmetric guarantees that the energy function decreases monotonically while following the activation rules.
A network with asymmetric weights may exhibit some periodic or chaotic behaviour; however, Hopfield found that this behavior is confined to relatively small parts of the phase space and does not impair the network's ability to act as a content-addressable associative memory system.
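The monotonic decrease can be checked numerically. The following sketch (illustrative, with arbitrary random symmetric weights and thresholds of 0) tracks the energy <math>E = -\tfrac{1}{2} \sum_{i,j} w_{ij} s_i s_j</math> under random asynchronous threshold updates:

```python
import numpy as np

def energy(w, s):
    """Hopfield energy E = -1/2 * s^T w s (thresholds taken as 0)."""
    return -0.5 * s @ w @ s

rng = np.random.default_rng(0)
n = 8
a = rng.standard_normal((n, n))
w = (a + a.T) / 2            # symmetric weights ...
np.fill_diagonal(w, 0)       # ... with no self-connections
s = rng.choice([-1.0, 1.0], size=n)

# Record the energy across random asynchronous threshold updates;
# with symmetric weights it can never increase.
energies = [energy(w, s)]
for _ in range(50):
    i = int(rng.integers(n))
    s[i] = 1.0 if w[i] @ s >= 0 else -1.0
    energies.append(energy(w, s))
```

Flipping unit ''i'' changes the energy by an amount proportional to minus the sign-matched local field, so each update step is non-increasing; with asymmetric weights this guarantee is lost.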
Hopfield also modeled neural nets for continuous values, in which the electric output of each neuron is not binary but some value between 0 and 1.
He found that this type of network was also able to store and reproduce memorized states.
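A continuous-valued variant can be sketched as a leaky integrator with a sigmoid output so that each neuron's output lies in (0, 1). This is an illustrative formulation, not Hopfield's exact 1984 equations; the time constants and weights are arbitrary:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def continuous_hopfield(w, x0, tau=1.0, dt=0.01, steps=2000):
    """Euler-integrate tau * dx/dt = -x + w @ g(x), with g a sigmoid,
    so each neuron's output g(x_i) stays between 0 and 1."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += (dt / tau) * (-x + w @ sigmoid(x))
    return sigmoid(x)   # the neurons' continuous outputs

w = np.array([[0.0, 2.0], [2.0, 0.0]])     # symmetric, zero diagonal
out = continuous_hopfield(w, x0=[0.5, -0.5])
```

With the symmetric excitatory coupling above, both neurons settle to the same stable output value, a fixed point of the dynamics.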
Notice that every pair of units ''i'' and ''j'' in a Hopfield network has a connection that is described by the connectivity weight <math>w_{ij}</math>. In this sense, the Hopfield network can be formally described as a complete undirected graph <math>G = \langle V, f \rangle</math>, where <math>V</math> is a set of McCulloch–Pitts neurons and <math>f : V^2 \rightarrow \mathbb{R}</math> is a function that links pairs of units to a real value, the connectivity weight.
Updating
Updating one unit (node in the graph simulating the artificial neuron) in the Hopfield network is performed using the following rule:

:<math>s_i \leftarrow \begin{cases} +1 & \text{if } \sum_j w_{ij} s_j \ge \theta_i, \\ -1 & \text{otherwise,} \end{cases}</math>

where <math>w_{ij}</math> is the strength of the connection from unit ''j'' to unit ''i'', <math>s_j</math> is the state of unit ''j'', and <math>\theta_i</math> is the threshold of unit ''i''.

For hierarchical layered (modern) Hopfield networks, if the Lagrangian functions, or equivalently the activation functions, are chosen in such a way that the Hessians for each layer are positive semi-definite and the overall energy is bounded from below, the system is guaranteed to converge to a fixed point attractor state. The temporal derivative of this energy function is

:<math>\frac{dE}{dt} = -\sum_{A} \tau_A \sum_{i,j=1}^{N_A} \frac{dx_i^A}{dt} \, \frac{\partial^2 L_A}{\partial x_i^A \, \partial x_j^A} \, \frac{dx_j^A}{dt} \le 0.</math>

Thus, the hierarchical layered network is indeed an attractor network with a global energy function. This network is described by a hierarchical set of synaptic weights that can be learned for each specific problem.
See also
* Associative memory
* Autoassociative memory
* Boltzmann machine – like a Hopfield net but uses annealed Gibbs sampling instead of gradient descent
* Dynamical systems model of cognition
* Ising model
* Hebbian theory
External links
* Hopfield Network Javascript – Hopfield Neural Network JAVA Applet
* Neural Lab Graphical Interface – Hopfield Neural Network graphical interface (Python & GTK)