Origins
The Ising model of a neural network as a memory model was first proposed by William A. Little in 1974, and was acknowledged by Hopfield in his 1982 paper. Networks with continuous dynamics were developed by Hopfield in his 1984 paper. A major advance in memory storage capacity was made by Krotov and Hopfield in 2016 through a change in network dynamics and energy function. This idea was further extended by Demircigil and collaborators in 2017. The continuous dynamics of large-memory-capacity models was developed in a series of papers between 2016 and 2020. Hopfield networks with large memory storage capacity are now called Dense Associative Memories or modern Hopfield networks.

Structure
The units in Hopfield nets are binary threshold units, i.e. the units only take on two different values for their states, and the value is determined by whether or not the unit's input exceeds its threshold $U_i$. Discrete Hopfield nets describe relationships between binary (firing or not-firing) neurons $1, 2, \ldots, i, j, \ldots, N$. At a certain time, the state of the neural net is described by a vector $V$, which records which neurons are firing in a binary word of $N$ bits.

The interactions $w_{ij}$ between neurons have units that usually take on values of 1 or −1, and this convention will be used throughout this article. However, other literature might use units that take values of 0 and 1. These interactions are "learned" via Hebb's law of association, such that, for a certain state $V^s$ and distinct nodes $i, j$,

$w_{ij} = V_i^s V_j^s$

but $w_{ii} = 0$. (Note that the Hebbian learning rule takes the form $w_{ij} = (2V_i^s - 1)(2V_j^s - 1)$ when the units assume values in $\{0, 1\}$.)

Once the network is trained, $w_{ij}$ no longer evolve. If a new state of neurons $V^{s'}$ is introduced to the neural network, the net acts on neurons such that

* $V_i^{s'} \to 1$ if $\sum_j w_{ij} V_j^{s'} \ge U_i$
* $V_i^{s'} \to -1$ if $\sum_j w_{ij} V_j^{s'} < U_i$

where $U_i$ is the threshold value of the i'th neuron (often taken to be 0). In this way, Hopfield networks have the ability to "remember" states stored in the interaction matrix, because if a new state is subjected to the interaction matrix, each neuron will change until it matches the original state (see the Updating section below).

The connections in a Hopfield net typically have the following restrictions:

* $w_{ii} = 0, \; \forall i$ (no unit has a connection with itself)
* $w_{ij} = w_{ji}, \; \forall i, j$ (connections are symmetric)

The constraint that weights are symmetric guarantees that the energy function decreases monotonically while following the activation rules. A network with asymmetric weights may exhibit periodic or chaotic behaviour; however, Hopfield found that this behavior is confined to relatively small parts of the phase space and does not impair the network's ability to act as a content-addressable associative memory system.

Hopfield also modeled neural nets for continuous values, in which the electric output of each neuron is not binary but some value between 0 and 1. He found that this type of network was also able to store and reproduce memorized states.

Notice that every pair of units $i$ and $j$ in a Hopfield network has a connection that is described by the connectivity weight $w_{ij}$. In this sense, the Hopfield network can be formally described as a complete undirected graph $G = \langle V, f \rangle$, where $V$ is a set of McCulloch–Pitts neurons and $f: V^2 \to \mathbb{R}$ is a function that links pairs of units to a real value, the connectivity weight.
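The training and recall rules above translate directly into a few lines of code. The following is a minimal sketch in Python with NumPy, assuming ±1 units and zero thresholds; the names `train_hebbian` and `recall` are illustrative, not from any particular library.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebb's rule: w_ij = sum over stored states s of V_i^s * V_j^s,
    with the diagonal zeroed so that w_ii = 0."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for v in patterns:              # each row is one +/-1 pattern
        w += np.outer(v, v)
    np.fill_diagonal(w, 0)
    return w

def recall(w, state, thresholds=None, max_sweeps=100):
    """Drive a probe state toward a stored pattern: each unit flips to +1
    when its net input meets its threshold U_i, to -1 otherwise, until a
    full sweep produces no change (a fixed point)."""
    n = len(state)
    u = np.zeros(n) if thresholds is None else thresholds
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(n):   # asynchronous, random order
            new = 1 if w[i] @ s >= u[i] else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:                      # fixed point reached
            break
    return s

# Store one 8-bit pattern, corrupt two bits, and recover the original.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train_hebbian(pattern[None, :])
probe = pattern.copy()
probe[:2] *= -1
assert np.array_equal(recall(w, probe), pattern)
```

With a single stored pattern and a probe that differs in two of eight bits, every unit's net input already points toward the stored state, so the asynchronous sweep restores it exactly.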
Updating

Updating one unit (node in the graph simulating the artificial neuron) in the Hopfield network is performed using the following rule:

$s_i \leftarrow \begin{cases} +1 & \text{if } \sum_j w_{ij} s_j \ge \theta_i, \\ -1 & \text{otherwise,} \end{cases}$

where $w_{ij}$ is the strength of the connection weight from unit $j$ to unit $i$, $s_j$ is the state of unit $j$, and $\theta_i$ is the threshold of unit $i$.

In continuous modern Hopfield networks arranged in hierarchical layers, each layer $A$ has neuron activities $x_i^A$, a relaxation time constant $\tau_A$, and a Lagrangian function $L^A$ whose partial derivatives define the activation functions. If the Lagrangian functions, or equivalently the activation functions, are chosen in such a way that the Hessians for each layer are positive semi-definite and the overall energy is bounded from below, this system is guaranteed to converge to a fixed point attractor state. The temporal derivative of this energy function is given by

$\frac{dE}{dt} = -\sum\limits_{A=1}^{N_{layer}} \tau_A \sum\limits_{i,j=1}^{N_A} \frac{dx_i^A}{dt} \frac{\partial^2 L^A}{\partial x_i^A \, \partial x_j^A} \frac{dx_j^A}{dt} \leq 0$

Thus, the hierarchical layered network is indeed an attractor network with the global energy function. This network is described by a hierarchical set of synaptic weights that can be learned for each specific problem.
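The monotone decrease of the energy can be checked numerically for the classical binary case. The sketch below, again an illustrative NumPy example assuming symmetric Hebbian weights and zero thresholds, evaluates the standard energy $E = -\frac{1}{2}\sum_{i,j} w_{ij} s_i s_j$ after each single-unit update and asserts that it never increases.

```python
import numpy as np

def energy(w, s):
    """Classical Hopfield energy E = -1/2 * s^T w s (zero thresholds).
    With symmetric weights and zero diagonal, each single-unit update
    can only lower E or leave it unchanged."""
    return -0.5 * s @ w @ s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 16))    # two stored +/-1 patterns
w = np.zeros((16, 16))
for v in patterns:                              # Hebbian outer products
    w += np.outer(v, v)
np.fill_diagonal(w, 0)                          # enforce w_ii = 0

s = rng.choice([-1, 1], size=16)                # random initial state
energies = [energy(w, s)]
for i in rng.permutation(16):                   # one asynchronous sweep
    s[i] = 1 if w[i] @ s >= 0 else -1
    energies.append(energy(w, s))

# Energy is non-increasing along the update trajectory.
assert all(a >= b - 1e-9 for a, b in zip(energies, energies[1:]))
```

The zero diagonal and the symmetry of the weights are exactly the two restrictions listed in the Structure section; without them the argument that each update lowers the energy no longer goes through.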
See also

* Associative memory (disambiguation)
* Autoassociative memory
* Boltzmann machine – like a Hopfield net but uses annealed Gibbs sampling instead of gradient descent
* Dynamical systems model of cognition
* Ising model
* Hebbian theory