Hogwild (machine learning)


An approach introduced to run stochastic gradient descent in parallel without locks



Hogwild is a scheme for running stochastic gradient descent (SGD) in parallel without locks. It is a strategy for eliminating the overhead associated with synchronization and locking when running SGD across multiple processors.

In Hogwild, all processors are given equal access to shared memory and can update individual components of it at will. Threads are allowed to overwrite one another, and gradients are computed on stale versions of the current solution. The scheme works well when the optimization problem is sparse, so that each update touches only a few components and collisions between threads are rare.
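The original paper does not prescribe a particular implementation, but the lock-free update pattern can be illustrated with a minimal sketch in Python using multiprocessing. The least-squares objective, learning rate, worker count, and names such as sgd_worker and shared_w below are illustrative assumptions, not part of the original description; the essential point is that workers read and write the shared parameter vector with no lock at all.

```python
import numpy as np
from multiprocessing import Process, Array

def sgd_worker(shared_w, X, y, lr, n_steps, seed):
    # Reinterpret the shared buffer as a NumPy vector. No lock is ever
    # taken: a read may see a half-updated w, and a write may clobber
    # another worker's update. Hogwild's bet is that on sparse problems
    # such collisions are rare enough not to hurt convergence.
    w = np.frombuffer(shared_w, dtype=np.float64)
    rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        i = rng.integers(len(y))
        # SGD gradient of the squared error on one random sample,
        # computed on a possibly stale snapshot of w.
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad  # unsynchronized, component-wise update

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))
    w_true = rng.normal(size=10)
    y = X @ w_true

    # lock=False yields a raw shared array with no synchronization
    # wrapper, so all workers mutate the same memory concurrently.
    shared_w = Array("d", 10, lock=False)

    workers = [
        Process(target=sgd_worker, args=(shared_w, X, y, 0.01, 5000, s))
        for s in range(4)
    ]
    for p in workers:
        p.start()
    for p in workers:
        p.join()

    w = np.frombuffer(shared_w, dtype=np.float64)
    print("distance to true weights:", np.linalg.norm(w - w_true))
```

Using processes with shared memory, rather than threads, sidesteps Python's global interpreter lock while preserving the Hogwild property under discussion: every worker updates the same parameter vector with no synchronization whatsoever.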


Further Resources

| Title | Author | Link | Type | Date |
| --- | --- | --- | --- | --- |
| Hogwild for Machine Learning on Multicore | | https://www.youtube.com/watch?v=l5JqUvTdZts | Video | 11 June 2014 |
| Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent | Feng Niu, Benjamin Recht, Christopher Re and Stephen J. Wright | https://arxiv.org/pdf/1106.5730.pdf | Academic paper | |

