Bidirectional associative memory (BAM) is a type of recurrent neural network. BAM was introduced by Bart Kosko in 1988.
There are two types of associative memory, auto-associative and hetero-associative. BAM is hetero-associative, meaning that given a pattern it can return another pattern which is potentially of a different size. It is similar to the Hopfield network in that they are both forms of associative memory. However, Hopfield nets return patterns of the same size.
It is said to be bi-directional as it can respond to inputs from either the input or the output layer.
Topology
A BAM contains two layers of neurons, which we shall denote X and Y. Layers X and Y are fully connected to each other. Once the weights have been established, input into layer X presents the pattern in layer Y, and vice versa.
The layers can be connected in both directions (bidirectional): the weight matrix for signals sent from the X layer to the Y layer is W, and the weight matrix for signals sent from the Y layer to the X layer is W^T. Thus, the weight matrix is calculated in both directions.
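As a minimal sketch of the two directions of propagation (the 2x2 weight matrix and the function names here are illustrative assumptions, not part of the article's later worked example):

```python
# Threshold a vector of weighted sums to a bipolar pattern. Ties (sum == 0)
# are broken toward +1 here for simplicity; BAM conventionally keeps the
# neuron's previous state on a zero sum.
def threshold(sums):
    return [1 if s >= 0 else -1 for s in sums]

# Propagate a bipolar pattern x from the X layer to the Y layer using W.
def forward(x, W):
    return threshold([sum(x[i] * W[i][j] for i in range(len(x)))
                      for j in range(len(W[0]))])

# Propagate a pattern y back from the Y layer to the X layer using W^T
# (computed here by summing over columns of W rather than rows).
def backward(y, W):
    return threshold([sum(y[j] * W[i][j] for j in range(len(W[0])))
                      for i in range(len(W))])

# Illustrative 2x2 weight matrix associating the pattern (1, -1) with itself.
W = [[1, -1],
     [-1, 1]]
```

Presenting either layer's pattern recalls the other: `forward([1, -1], W)` and `backward([1, -1], W)` both recover `[1, -1]` for this toy matrix.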
Procedure
Learning
Imagine we wish to store two associations, A1:B1 and A2:B2.
* A1 = (1, 0, 1, 0, 1, 0), B1 = (1, 1, 0, 0)
* A2 = (1, 1, 1, 0, 0, 0), B2 = (1, 0, 1, 0)
These are then transformed into the bipolar forms:
* X1 = (1, -1, 1, -1, 1, -1), Y1 = (1, 1, -1, -1)
* X2 = (1, 1, 1, -1, -1, -1), Y2 = (1, -1, 1, -1)
From there, we calculate

M = X1^T Y1 + X2^T Y2

where ^T denotes the transpose. So,

M = [  2   0   0  -2
       0  -2   2   0
       2   0   0  -2
      -2   0   0   2
       0   2  -2   0
      -2   0   0   2 ]
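The sum of outer products M = X1^T Y1 + X2^T Y2 can be checked directly, and recall can be verified by propagating X1 through M and thresholding. A sketch in plain Python (the helper names are mine, not from the article; no zero sums arise in this example, so a simple sign threshold suffices):

```python
# Outer product x^T y of two bipolar row vectors x (length n) and y (length m),
# giving an n x m matrix with entries x[i] * y[j].
def outer(x, y):
    return [[xi * yj for yj in y] for xi in x]

# Element-wise sum of two equally sized matrices.
def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

X1, Y1 = [1, -1, 1, -1, 1, -1], [1, 1, -1, -1]
X2, Y2 = [1, 1, 1, -1, -1, -1], [1, -1, 1, -1]

# M = X1^T Y1 + X2^T Y2
M = mat_add(outer(X1, Y1), outer(X2, Y2))

# Recall in the X -> Y direction: multiply the pattern by M, then threshold.
def recall(x, W):
    sums = [sum(x[i] * W[i][j] for i in range(len(x))) for j in range(len(W[0]))]
    return [1 if s > 0 else -1 for s in sums]
```

With this M, `recall(X1, M)` returns Y1 and `recall(X2, M)` returns Y2, confirming that both stored associations are recovered.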