Vanishing gradient problem

The vanishing gradient problem can occur when training neural networks using gradient descent with backpropagation. When the derivative of the activation function is very close to zero, the gradients used to update the weights of the network can shrink as they are propagated backward through many layers, becoming too small for effective learning in the earlier layers.
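A minimal sketch of the effect: during backpropagation, the gradient at an early layer is a product of per-layer factors, each including the activation function's derivative. For the sigmoid, that derivative is at most 0.25, so the product tends toward zero as depth grows. The network depth, random weights, and pre-activation values below are illustrative assumptions, not from the original text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # bounded above by 0.25

# Hypothetical 30-layer chain: each backprop step multiplies the
# gradient by (weight * activation derivative) for that layer.
np.random.seed(0)
grad = 1.0
for layer in range(30):
    z = np.random.randn()  # assumed pre-activation value
    w = np.random.randn()  # assumed weight
    grad *= w * sigmoid_deriv(z)

print(abs(grad))  # shrinks toward zero with depth
```

Running this, the surviving gradient magnitude is many orders of magnitude below 1, which is why the earliest layers of a deep sigmoid network receive almost no learning signal.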
