Exploding Gradient Problem


An Exploding Gradient Problem is a Neural Network Training Algorithm problem that arises when gradient descent with backpropagation is applied to a deep or recurrent neural network: error gradients can grow exponentially as they are propagated backward through many layers or time steps, producing very large weight updates that destabilize training.
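As a rough illustration of why this happens (a toy sketch, not taken from this article): when a gradient is backpropagated through many repeated steps that share a weight matrix W, each step multiplies it by W transposed, so its norm grows roughly geometrically when W's spectral radius exceeds 1 and shrinks toward zero when it is below 1. The matrix, its dimensions, and the step count below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 50                                                # number of layers / unrolled time steps
W = 1.2 * np.eye(8) + 0.1 * rng.normal(size=(8, 8))   # hypothetical shared (recurrent) weight matrix
grad = rng.normal(size=8)                             # gradient arriving from the loss at the last step

spectral_radius = np.abs(np.linalg.eigvals(W)).max()
print(f"spectral radius of W: {spectral_radius:.2f}")

for t in range(1, T + 1):
    grad = W.T @ grad                                 # chain rule through one linear step
    if t % 10 == 0:
        print(f"step {t:3d}: ||grad|| = {np.linalg.norm(grad):.3e}")

# With spectral radius > 1 the norm blows up roughly geometrically (exploding gradients);
# rescaling W so its spectral radius is < 1 makes the same loop shrink the norm instead
# (vanishing gradients).
```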



References

2017

Figure 3 shows the function computed at each time step, as well as the function computed by the network as a whole. From this figure, you can see which regions have exploding or vanishing gradients.

Figure 3: (left) the function computed by the RNN at each time step; (right) the function computed by the network as a whole.
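The sketch below is a hypothetical illustration of the quoted point, not the source's code: a scalar RNN is treated as an iterated function h_{t+1} = tanh(w * h_t + b), and the derivative of the final state with respect to the initial state is accumulated with the chain rule. Starting points near the unstable region (around h_0 = 0 for these settings) give a derivative that explodes with the number of time steps, while starting points drawn to a stable fixed point give one that vanishes. The weight, bias, and horizon values are arbitrary assumptions chosen for the demonstration.

```python
import numpy as np

w, b, T = 2.5, 0.0, 30        # hypothetical weight, bias, and number of time steps

def unroll(h0):
    """Iterate the scalar RNN from h0 and accumulate d h_T / d h_0 via the chain rule."""
    h, jac = h0, 1.0
    for _ in range(T):
        pre = w * h + b
        h = np.tanh(pre)
        jac *= w * (1.0 - np.tanh(pre) ** 2)   # local slope at this time step
    return h, jac

for h0 in (-1.0, -0.1, 0.0, 0.1, 1.0):
    h_T, jac = unroll(h0)
    print(f"h0 = {h0:+.1f} -> h_T = {h_T:+.4f}, d h_T/d h_0 = {jac:.3e}")

# Near h0 = 0 the local slope stays above 1, so the accumulated derivative explodes;
# trajectories that settle onto a stable fixed point repeatedly multiply slopes below 1,
# so the derivative vanishes.
```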