Jordan Network


A Jordan Network is a Simple Recurrent Network in which the recurrent context is fed from the output layer rather than from a hidden layer.



References

2018

  • (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Recurrent_neural_network#Elman_networks_and_Jordan_networks Retrieved:2018-3-4.
    • Jordan networks are similar to Elman networks. The context units are fed from the output layer instead of the hidden layer. The context units in a Jordan network are also referred to as the state layer. They have a recurrent connection to themselves.

      Elman and Jordan networks are also known as "simple recurrent networks" (SRN).

      Elman network [1]:

      [math]\displaystyle{ \begin{align} h_t &= \sigma_h(W_{h} x_t + U_{h} h_{t-1} + b_h) \\ y_t &= \sigma_y(W_{y} h_t + b_y) \end{align} }[/math]

      Jordan network [2]:

      [math]\displaystyle{ \begin{align} h_t &= \sigma_h(W_{h} x_t + U_{h} y_{t-1} + b_h) \\ y_t &= \sigma_y(W_{y} h_t + b_y) \end{align} }[/math]

      Variables and functions

      • [math]\displaystyle{ x_t }[/math] : input vector
      • [math]\displaystyle{ h_t }[/math] : hidden layer vector
      • [math]\displaystyle{ y_t }[/math] : output vector
      • [math]\displaystyle{ W }[/math] , [math]\displaystyle{ U }[/math] and [math]\displaystyle{ b }[/math] : parameter matrices and vectors
      • [math]\displaystyle{ \sigma_h }[/math] and [math]\displaystyle{ \sigma_y }[/math] : Activation functions
  1. Elman, Jeffrey L. (1990). “Finding Structure in Time". Cognitive Science. 14 (2): 179–211. doi:10.1016/0364-0213(90)90002-E.
  2. Jordan, Michael I. (1997-01-01). “Serial Order: A Parallel Distributed Processing Approach". Advances in Psychology. Neural-Network Models of Cognition. 121: 471–495. doi:10.1016/s0166-4115(97)80111-2. ISBN 9780444819314.
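
Below is a minimal NumPy sketch contrasting the two recurrences quoted above: the Elman step feeds back the previous hidden state h_{t-1}, while the Jordan step feeds back the previous output y_{t-1}. The layer sizes, random initialization, and the tanh/identity activation choices are illustrative assumptions, not part of the cited formulation.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 5, 2   # assumed layer sizes for illustration

# Parameter matrices W, U and bias vectors b, as in the equations above.
W_h = rng.standard_normal((n_hidden, n_in))
U_h_elman = rng.standard_normal((n_hidden, n_hidden))   # feeds back h_{t-1}
U_h_jordan = rng.standard_normal((n_hidden, n_out))     # feeds back y_{t-1}
b_h = np.zeros(n_hidden)
W_y = rng.standard_normal((n_out, n_hidden))
b_y = np.zeros(n_out)

sigma_h = np.tanh            # assumed hidden activation
sigma_y = lambda z: z        # assumed identity output activation

def elman_step(x_t, h_prev):
    # h_t = sigma_h(W_h x_t + U_h h_{t-1} + b_h);  y_t = sigma_y(W_y h_t + b_y)
    h_t = sigma_h(W_h @ x_t + U_h_elman @ h_prev + b_h)
    y_t = sigma_y(W_y @ h_t + b_y)
    return h_t, y_t

def jordan_step(x_t, y_prev):
    # h_t = sigma_h(W_h x_t + U_h y_{t-1} + b_h);  y_t = sigma_y(W_y h_t + b_y)
    h_t = sigma_h(W_h @ x_t + U_h_jordan @ y_prev + b_h)
    y_t = sigma_y(W_y @ h_t + b_y)
    return h_t, y_t

# Run both over the same input sequence; only the fed-back state differs.
xs = rng.standard_normal((4, n_in))
h, y = np.zeros(n_hidden), np.zeros(n_out)
for x_t in xs:
    h, _ = elman_step(x_t, h)    # context = previous hidden state
for x_t in xs:
    _, y = jordan_step(x_t, y)   # context = previous output (the "state layer")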
