Gradient descent

A first-order optimization algorithm

Other attributes

Wikidata ID: Q1199743

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the function at the current point, because this is the direction of steepest descent. Conversely, stepping in the direction of the gradient will lead to a local maximum of that function; the procedure is then known as gradient ascent.
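
As a concrete illustration, here is a minimal sketch of the update rule x_{k+1} = x_k - γ ∇f(x_k) in Python. The function names (`gradient_descent`, `grad_f`), the step size, and the iteration count are illustrative choices, not part of any particular library.

```python
import numpy as np

def gradient_descent(grad_f, x0, learning_rate=0.1, num_steps=100):
    """Repeatedly step against the gradient: x <- x - learning_rate * grad_f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(num_steps):
        # Move opposite the gradient, i.e. in the direction of steepest descent.
        x = x - learning_rate * grad_f(x)
    return x

# Example: f(x, y) = x^2 + y^2 has gradient (2x, 2y) and its minimum at the origin.
minimum = gradient_descent(lambda x: 2.0 * x, x0=[3.0, -4.0])
print(minimum)  # converges toward [0., 0.]
```

On this quadratic, each update shrinks the iterate by a constant factor, so it converges to the origin; in general the step size (learning rate) is the main tuning knob, and too large a value can overshoot and diverge. Gradient ascent is the same loop with the sign of the step flipped.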

Further Resources

  • Convergence and efficiency of subgradient methods for quasiconvex minimization, by Krzysztof C. Kiwiel (academic paper): https://link.springer.com/article/10.1007%2FPL00011414
  • Momentum and Learning Rate Adaptation: http://www.willamette.edu/~gorr/classes/cs449/momrate.html
