ANN – Bidirectional Associative Memory (BAM)


Bidirectional Associative Memory (BAM) is a supervised learning model in Artificial Neural Networks. It is a hetero-associative memory: given an input pattern, it returns another pattern that may be of a different size. This behaviour is very similar to the human brain. Human memory is essentially associative; it uses a chain of mental associations to recover a lost memory, such as associating faces with names or exam questions with answers.
To store such associations between one type of object and another, a Recurrent Neural Network (RNN) is needed that receives a pattern on one set of neurons as input and generates a related, but different, output pattern on another set of neurons.

Why is BAM required?
The main objective of introducing such a network model is to store hetero-associative pattern pairs.
These stored pairs can then be retrieved even when the presented pattern is noisy or incomplete.

BAM Architecture:
When BAM accepts an n-dimensional input vector X from set A, the model recalls the associated m-dimensional vector Y from set B. Similarly, when Y is presented as input, the BAM recalls the corresponding X.

Algorithm:

  1. Storage (Learning): In this step, the M pattern pairs (fundamental memories) are stored in the synaptic weights of the network by computing the weight matrix $W=\sum_{m=1}^{M} X_{m} Y_{m}^{T}$ (a NumPy sketch of all three steps follows this list).
  2. Testing: We have to check that the BAM recalls $Y_{m}$ perfectly for the corresponding $X_{m}$ and recalls $X_{m}$ for the corresponding $Y_{m}$, using

        \[Y_{m}=\operatorname{sign}\left(W^{T} X_{m}\right), \quad m=1,2, \ldots, M\]
        \[X_{m}=\operatorname{sign}\left(W Y_{m}\right), \quad m=1,2, \ldots, M\]



    All pairs should be recalled accordingly.

  3. Retrieval: Present an unknown vector X (a corrupted or incomplete version of a pattern from set A or B) to the BAM and retrieve the previously stored association:

        \[X \neq X_{m}, \quad m=1,2, \ldots, M\]

    • Initialize the BAM:

           \[X(0)=X, \quad p=0\]

    • Calculate the BAM output at iteration $p$:

           \[Y(p)=\operatorname{sign}\left[W^{T} X(p)\right]\]

    • Update the input vector $X(p)$:

           \[X(p+1)=\operatorname{sign}[W Y(p)]\]

    • Repeat the iteration until convergence, when input and output remain unchanged.
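
The following is a minimal NumPy sketch of the three steps above (storage, testing, and retrieval), assuming bipolar (+1/-1) patterns. The pattern pairs, dimensions, and helper names (train_bam, recall) are illustrative assumptions, not part of any standard API.

    # Minimal BAM sketch: storage, testing, and iterative retrieval.
    import numpy as np

    def sign(v):
        # Bipolar threshold: non-negative values map to +1, negative values to -1.
        return np.where(v >= 0, 1, -1)

    def train_bam(X_patterns, Y_patterns):
        # Storage: W is the sum of the outer products X_m Y_m^T over all pairs.
        return sum(np.outer(x, y) for x, y in zip(X_patterns, Y_patterns))

    def recall(W, X, max_iter=100):
        # Retrieval: iterate Y(p) = sign(W^T X(p)), X(p+1) = sign(W Y(p)) until X stops changing.
        for _ in range(max_iter):
            Y = sign(W.T @ X)
            X_next = sign(W @ Y)
            if np.array_equal(X_next, X):      # convergence: input and output unchanged
                break
            X = X_next
        return X, Y

    # Two fundamental memories: 6-dimensional X patterns paired with 4-dimensional Y patterns.
    X_patterns = [np.array([ 1, -1,  1, -1,  1, -1]),
                  np.array([ 1,  1, -1, -1,  1,  1])]
    Y_patterns = [np.array([ 1,  1, -1, -1]),
                  np.array([ 1, -1,  1, -1])]

    W = train_bam(X_patterns, Y_patterns)      # storage

    # Testing: every stored X_m should recall its Y_m, and every Y_m its X_m.
    for x, y in zip(X_patterns, Y_patterns):
        assert np.array_equal(sign(W.T @ x), y)
        assert np.array_equal(sign(W @ y), x)

    # Retrieval from a corrupted version of the first X pattern (one flipped bit).
    noisy = X_patterns[0].copy()
    noisy[2] = -noisy[2]
    X_rec, Y_rec = recall(W, noisy)
    print(X_rec, Y_rec)                        # converges back to the first stored pair here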

Limitations of BAM:

  • Storage capacity of the BAM: the number of associations stored in the BAM should not exceed the number of neurons in the smaller layer (a rough illustration of this limit follows this list).
  • Incorrect convergence: the BAM does not always converge to the closest stored association.
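
A rough way to see the capacity limit is to store more random pattern pairs than the smaller layer has neurons and count how many are still recalled correctly. The dimensions, seed, and random patterns below are illustrative assumptions; the exact count varies with the patterns chosen.

    # Rough illustration of the capacity limit with random bipolar patterns (illustrative only).
    import numpy as np
    rng = np.random.default_rng(0)

    n, m, pairs = 6, 4, 6                      # 6 pairs, but the smaller layer has only 4 neurons
    Xs = rng.choice([-1, 1], size=(pairs, n))
    Ys = rng.choice([-1, 1], size=(pairs, m))
    W = sum(np.outer(x, y) for x, y in zip(Xs, Ys))

    sign = lambda v: np.where(v >= 0, 1, -1)
    correct = sum(np.array_equal(sign(W.T @ x), y) for x, y in zip(Xs, Ys))
    print(f"{correct}/{pairs} Y patterns recalled correctly")   # typically fewer than all of them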