The exploding gradient problem is a difficulty that can arise when training artificial neural networks with gradient descent via backpropagation. When large error gradients accumulate, the weight updates become very large, which can make the model unstable and prevent effective learning.
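The mechanism can be illustrated with a minimal sketch: during backpropagation, the error gradient is multiplied by each layer's Jacobian in turn, so if the layers consistently amplify signals, the gradient norm grows exponentially with depth. The following Python snippet is a hypothetical illustration (the network depth, width, and the 1.2 amplification factor are assumptions chosen only to make the growth visible), not an implementation from any particular library.

```python
import numpy as np

# Sketch of gradient growth in a deep stack of linear layers.
# Each layer's weights are scaled so the layer amplifies signals slightly;
# backpropagating through many such layers inflates the gradient exponentially.
rng = np.random.default_rng(0)
depth, width = 50, 32  # assumed values for illustration

# Hypothetical weight matrices with a mild amplification factor of 1.2.
weights = [1.2 * rng.standard_normal((width, width)) / np.sqrt(width)
           for _ in range(depth)]

grad = rng.standard_normal(width)  # error gradient at the output layer
for layer, W in enumerate(reversed(weights), start=1):
    grad = W.T @ grad              # chain rule: multiply by each layer's Jacobian
    if layer % 10 == 0:
        print(f"after {layer} layers, gradient norm = {np.linalg.norm(grad):.3e}")
```

Running the sketch shows the gradient norm increasing by several orders of magnitude over the 50 layers, which is the accumulation of large error gradients described above.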