Edge computing is a distributed computing model that improves the performance of web applications and internet-connected devices by performing computation closer to the source of the data. By shortening the communication path between clients and servers, it reduces both bandwidth consumption and latency. Edge computing decentralizes the work of cloud data centers by changing the geographic distribution of where computation takes place.
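The latency claim above can be made concrete with a back-of-the-envelope calculation: even ignoring processing time, round-trip delay is bounded below by signal propagation over the network path. A minimal sketch (the distances and the two-thirds-of-light-speed figure for fiber are illustrative assumptions, not measurements):

```python
# Illustrative sketch: minimum round-trip propagation delay over fiber,
# assuming signals travel at roughly 2/3 the speed of light in vacuum.

SPEED_OF_LIGHT_KM_S = 300_000   # approximate speed of light in vacuum, km/s
FIBER_FRACTION = 2 / 3          # typical signal speed in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given one-way distance."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000

# Hypothetical distances: a distant cloud region vs. a nearby edge node.
print(round_trip_ms(4000))   # roughly 40 ms before any processing happens
print(round_trip_ms(100))    # roughly 1 ms
```

Real round trips are longer (routing detours, queuing, processing), but the propagation floor alone shows why moving computation from thousands of kilometers away to a nearby edge node cuts latency.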
The main advantage of edge computing from the user's point of view is the reduction in latency it can provide compared to cloud computing. Users fetch the data their applications need, typically through a web browser, from an edge node that is geographically closer to them than a cloud data center or other centralized internet infrastructure. Lower latency lets consumers access content faster, and providers save on server costs because computation is performed locally at the edge rather than in a distant cloud.
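The serving pattern described above can be sketched in a few lines: an edge node answers requests from its locally stored copy when it can, and only makes the long-distance trip to the origin server on a miss. This is a simplified sketch with hypothetical names, not any particular vendor's API:

```python
# Minimal sketch of the edge-caching pattern: content stored at a nearby
# edge node is served locally; only a cache miss reaches the distant origin.

class EdgeNode:
    def __init__(self, fetch_from_origin):
        self._origin = fetch_from_origin   # slow, long-distance call
        self._cache = {}                   # content held at the edge

    def get(self, url: str) -> str:
        if url in self._cache:             # cache hit: served locally, low latency
            return self._cache[url]
        body = self._origin(url)           # cache miss: one trip to the origin
        self._cache[url] = body            # keep a copy for later requests
        return body

# Usage: the second request for the same URL never leaves the edge node.
origin_calls = []
def origin(url):
    origin_calls.append(url)
    return f"<html>content of {url}</html>"

node = EdgeNode(origin)
node.get("/index.html")
node.get("/index.html")
print(len(origin_calls))   # origin was contacted only once
```

The same idea underlies content delivery networks and edge-function platforms: requests that can be satisfied near the user never incur the round trip to centralized infrastructure.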
Edge computing companies
Timeline
People
Further reading
Janakiram MSV. "How AI Accelerators Are Changing The Face Of Edge Computing." Web. July 15, 2019.
Cloudflare. "What is edge computing." Web.
Paul Miller. "What is edge computing?" Web. May 7, 2018.
GE Digital. "What is Edge Computing?" Web.
Documentaries, videos and podcasts
"Edge Computing: It's not just a bunch of small clouds." September 24, 2018.
"HPE CEO: Edge computing to be bigger than cloud computing." February 22, 2019.
"IoT at the Edge: Bringing intelligence to the edge using Cloud IoT (Cloud Next '18)." August 17, 2018.
"IOT Edge Computing | IoT Examples | Use Cases | HackerEarth Webinar." September 5, 2017.
"Keynote: Beyond the Cloud: Edge Computing - Mark Skarpness." October 25, 2017.