Vanishing gradient problem

The vanishing gradient problem can occur when training neural networks with gradient descent and backpropagation. Backpropagation computes the gradient at each layer as a product of activation-function derivatives (and weights) across all later layers, so when those derivatives are very close to zero, the gradients used to update the network's weights can become too small for effective learning, particularly in the earliest layers of a deep network.
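As a minimal sketch of why this happens, the Python snippet below (all names and values are illustrative, not from any particular library) chains a scalar sigmoid layer thirty times and accumulates the chain-rule factor σ′(z)·w at each layer. Because the sigmoid's derivative never exceeds 0.25, the product typically collapses toward zero as depth grows:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid; its maximum value is 0.25 at z = 0.
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
depth = 30
weights = rng.normal(0.0, 1.0, size=depth)  # illustrative scalar weights

x = 0.5
grad = 1.0  # gradient of the loss with respect to the final output
for w in weights:
    z = w * x
    grad *= sigmoid_prime(z) * w  # chain-rule factor contributed by this layer
    x = sigmoid(z)

# With 30 sigmoid layers the accumulated gradient is usually
# many orders of magnitude below 1, i.e. it has "vanished".
print(f"gradient after {depth} layers: {grad:.3e}")
```

The same effect appears in real deep networks whenever many saturating activations are composed, which is one reason activations such as ReLU, whose derivative is 1 over part of its domain, are often preferred in deep architectures.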
