Gated Recurrent Neural Network


A Gated Recurrent Neural Network is a recurrent neural network that contains one or more hidden layers of gated recurrent units (GRUs).
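As a rough, self-contained sketch (not drawn from any particular library), the forward step of a single GRU can be written with the standard reset-gate/update-gate formulation; the names gru_step, W_r, U_r, b_r, etc. below are illustrative placeholders, not identifiers from the literature cited here.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, p):
    # One GRU time step: compute the new hidden state h_t from input x_t and previous state h_prev.
    # p is a dict of (hypothetical) weight matrices and biases: W_r, U_r, b_r, W_z, U_z, b_z, W_h, U_h, b_h.
    r = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h_prev + p["b_r"])               # reset gate
    z = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h_prev + p["b_z"])               # update gate
    h_tilde = np.tanh(p["W_h"] @ x_t + p["U_h"] @ (r * h_prev) + p["b_h"])   # candidate activation
    return (1.0 - z) * h_prev + z * h_tilde                                  # interpolate old state and candidate

# Toy usage: input size 4, hidden size 3, a sequence of 5 random inputs.
rng = np.random.default_rng(0)
p = {k: rng.standard_normal((3, 4)) for k in ("W_r", "W_z", "W_h")}
p.update({k: rng.standard_normal((3, 3)) for k in ("U_r", "U_z", "U_h")})
p.update({k: np.zeros(3) for k in ("b_r", "b_z", "b_h")})
h = np.zeros(3)
for x_t in rng.standard_normal((5, 4)):
    h = gru_step(x_t, h, p)

A gated recurrent neural network in the sense above simply stacks one or more layers of such units and unrolls them over the input sequence.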




Figure 1: Illustration of (a) LSTM and (b) gated recurrent units. (a) i, f and o are the input, forget and output gates, respectively; c and c̃ denote the memory cell and the new memory cell content. (b) r and z are the reset and update gates, and h and h̃ are the activation and the candidate activation.
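Written out in a common notation (generic weights W, U and biases b; ⊙ is element-wise multiplication; these are the standard formulations rather than text from the figure), the quantities named in the caption are computed as:

% (a) LSTM: input, forget and output gates i, f, o; memory cell c; new memory content \tilde{c}
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i), \quad
f_t  = \sigma(W_f x_t + U_f h_{t-1} + b_f), \quad
o_t  = \sigma(W_o x_t + U_o h_{t-1} + b_o), \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c), \qquad
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \qquad
h_t = o_t \odot \tanh(c_t).
\end{aligned}

% (b) GRU: reset and update gates r, z; candidate activation \tilde{h}; activation h
\begin{aligned}
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r), \qquad
z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z), \\
\tilde{h}_t &= \tanh(W x_t + U (r_t \odot h_{t-1}) + b), \qquad
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t.
\end{aligned}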