Bidirectional Associative Memory (BAM) Network


A Bidirectional Associative Memory (BAM) Network is a two-layer recurrent neural network that stores hetero-associative pattern pairs and recalls them bidirectionally, from either layer to the other.
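The standard discrete-time formulation (Kosko, 1988) stores bipolar pattern pairs in a single correlation matrix and recalls them by alternating threshold updates between the two layers until the pair of states stops changing. The following Python/NumPy sketch is a minimal illustration of that scheme; the class, function, and variable names are illustrative and not taken from any particular library.

import numpy as np


def _sgn(v, prev):
    # Bipolar threshold; a unit keeps its previous state when its net input is 0.
    return np.where(v > 0, 1.0, np.where(v < 0, -1.0, prev))


class DiscreteBAM:
    # Minimal Kosko-style discrete BAM: bipolar (+1/-1) pattern pairs are stored
    # in one correlation matrix W and recalled by alternating bidirectional updates.

    def __init__(self, n_x, n_y):
        self.W = np.zeros((n_x, n_y))

    def store(self, X, Y):
        # Hetero-associative outer-product encoding: W = sum_p x_p y_p^T.
        self.W += X.T @ Y

    def recall(self, x, max_iters=50):
        x = np.asarray(x, dtype=float)
        y = _sgn(x @ self.W, np.ones(self.W.shape[1]))
        for _ in range(max_iters):
            x_new = _sgn(y @ self.W.T, x)    # y-layer drives the x-layer
            y_new = _sgn(x_new @ self.W, y)  # x-layer drives the y-layer
            if np.array_equal(x_new, x) and np.array_equal(y_new, y):
                break                        # stable two-pattern reverberation
            x, y = x_new, y_new
        return x, y


# Illustrative usage: two orthogonal 6-bit patterns paired with two 4-bit patterns.
X = np.array([[1, -1, 1, -1, 1, -1],
              [1,  1, -1, -1, 1, 1]], dtype=float)
Y = np.array([[1, -1, 1, -1],
              [1,  1, -1, -1]], dtype=float)
bam = DiscreteBAM(6, 4)
bam.store(X, Y)
print(bam.recall(X[0]))  # recovers the pair (X[0], Y[0])

Because the backward pass uses the transpose of the same matrix, the two layers reciprocally reconstruct each other, which is what makes the memory bidirectional.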



References


2013

  • (Grossberg, 2013) ⇒ Stephen Grossberg (2013). "Adaptive Bidirectional Associative Memory". In: Scholarpedia, 8(2):1888. doi:10.4249/scholarpedia.1888
    • QUOTE: Kosko (1987, 1988) adapted the Cohen-Grossberg model and Liapunov function (Cohen and Grossberg, 1983), which proved global convergence of STM, to define a system that combines STM and LTM and which also globally converges to a limit. The main trick was to observe how the symmetric connections in the Cohen-Grossberg equation (32) could be used to define symmetric LTM traces interacting reciprocally between two processing levels. An Additive Model BAM system is, accordingly, defined by:

      [math]\displaystyle{ \frac{d}{dt} x_i = -x_i + \sum_k f(y_k) z_{ki} + I_i \quad (37) }[/math]

      and

      [math]\displaystyle{ \frac{d}{dt} y_j = -y_j + \sum_m f(x_m) z_{mj} + J_j \quad (38) }[/math].

      A Shunting Model BAM can also be analogously defined. One type of learning law to which BAM methods apply is the passive decay associative law that was introduced in Grossberg (1967, 1968b, 1968c); see Fig.1 and Fig.3:

      [math]\displaystyle{ \frac{d}{dt} z_{ij} = -z_{ij} + f(x_i) f(x_j) \quad (39) }[/math]

      Kosko calls the equation in (39) the signal Hebb law, although it does not obey the property of monotonely increasing learned weights that Hebb (1949) ascribed to his law. Kosko (1988) wrote that: "When the BAM neurons are activated, the network quickly evolves to a stable state of two-pattern reverberation, or resonance". Indeed, another inspiration for BAM was Adaptive Resonance Theory, or ART.
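
The Additive Model BAM dynamics in equations (37)-(38), together with the associative law (39), can be illustrated with a short numerical sketch. The Python/NumPy code below is a minimal illustration only: it assumes forward-Euler integration with a fixed step size, a logistic signal function f, and LTM traces learned between the two levels (f(x_i)f(y_j), the BAM/signal-Hebb reading), none of which is prescribed by the quoted equations; the function and variable names are likewise illustrative.

import numpy as np


def f(u):
    # Illustrative logistic signal function; the quoted equations leave f generic.
    return 1.0 / (1.0 + np.exp(-u))


def additive_bam_step(x, y, Z_yx, Z_xy, I, J, dt=0.01, learn=False):
    # One forward-Euler step of the Additive Model BAM (eqs. 37-38), optionally
    # with a passive-decay associative update of the LTM traces (cf. eq. 39).
    dx = -x + f(y) @ Z_yx + I      # eq. (37): y-level signals gated by traces z_ki
    dy = -y + f(x) @ Z_xy + J      # eq. (38): x-level signals gated by traces z_mj
    x_new = x + dt * dx
    y_new = y + dt * dy
    if learn:
        # Passive decay associative ("signal Hebb") law, applied here to
        # cross-level traces f(x_i) f(y_j) as an illustrative assumption.
        Z_xy = Z_xy + dt * (-Z_xy + np.outer(f(x), f(y)))
        Z_yx = Z_xy.T              # reciprocal, symmetric LTM traces
    return x_new, y_new, Z_yx, Z_xy


# Illustrative usage: relax the STM activities for a fixed input pair (I, J).
rng = np.random.default_rng(0)
n_x, n_y = 5, 3
x, y = np.zeros(n_x), np.zeros(n_y)
Z_xy = rng.normal(scale=0.1, size=(n_x, n_y))
Z_yx = Z_xy.T
I, J = rng.normal(size=n_x), rng.normal(size=n_y)
for _ in range(2000):
    x, y, Z_yx, Z_xy = additive_bam_step(x, y, Z_yx, Z_xy, I, J)
print(np.round(x, 3), np.round(y, 3))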

1988

  • (Kosko, 1988) ⇒ Bart Kosko (1988). "Bidirectional Associative Memories". In: IEEE Transactions on Systems, Man, and Cybernetics, 18(1).