Vanishing gradient problem

The vanishing gradient problem can occur when training neural networks using gradient descent with backpropagation. When the derivative of the activation function is very close to zero, the gradients propagated back through the layers shrink multiplicatively, and the gradient used to update the weights of the network may become too small for effective learning.
