Exploding Gradient Problem

From GM-RKB
** …
* <B>Counter-Example(s):</B>
** [[Vanishing Gradient Problem]].
* <B>See:</B> [[Recurrent Neural Network]], [[Deep Neural Network Training Algorithm]].



Revision as of 07:55, 19 June 2023

An Exploding Gradient Problem is a Neural Network Training Algorithm problem that arises when using gradient descent and backpropagation: error gradients can grow exponentially as they are propagated backward through many layers or time steps, producing very large, unstable weight updates.
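The exponential growth can be sketched with a toy example (an assumed illustration, not from this entry): in a linear recurrence, each backpropagation step through time multiplies the gradient by the recurrent weight `w`, so after `T` steps the gradient scales like `w**T`.

```python
# Toy sketch (hypothetical names): backpropagating through T time steps of a
# scalar linear recurrence multiplies the gradient by the recurrent weight w
# at each step, so its magnitude scales like |w|**T.
def gradient_norm_after(w, T):
    grad = 1.0
    for _ in range(T):
        grad *= w  # one backprop step through the recurrence
    return abs(grad)

exploding = gradient_norm_after(1.5, 50)  # |w| > 1: magnitude explodes
vanishing = gradient_norm_after(0.5, 50)  # |w| < 1: magnitude vanishes
print(exploding, vanishing)
```

With `|w| > 1` the gradient magnitude explodes; with `|w| < 1` it vanishes instead, which is the counter-example problem linked above.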



References

2021

2017

[Image: Grosse2017 RNN.png]
Figure 3 shows the function computed at each time step, as well as the function computed by the network as a whole. From this figure, you can see which regions have exploding or vanishing gradients.

[Figure 3 (Grosse2017 FIG3.png): (left) the function computed by the RNN at each time step; (right) the function computed by the network as a whole.]
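A common remedy in practice, not described in this excerpt, is gradient-norm clipping: when the gradient's norm exceeds a threshold, rescale it to that threshold before the weight update. A minimal sketch, with an assumed function name:

```python
import numpy as np

# Hedged sketch of gradient-norm clipping (function name is illustrative):
# if the gradient norm exceeds max_norm, rescale the gradient so its norm
# equals max_norm; otherwise leave it unchanged.
def clip_by_norm(grad, max_norm):
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([30.0, 40.0])      # norm 50.0
print(clip_by_norm(g, 5.0))     # rescaled to norm 5.0, direction preserved
```

Clipping bounds the size of any single update, which keeps training stable even in the exploding-gradient regions illustrated in Figure 3.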