Back to the Edge: AI Will Force Distributed Intelligence Everywhere

azeem
12 min read · Jul 26, 2017
A visualization of AI processes from AI startup Graphcore, mentioned below.

In part one we explored how artificial intelligence was ramping up the demand for computing cycles.

In this installment we'll explore how the demands of AI will drive two shifts: the resurgence of processing at the edge, and the arrival of new processing architectures.

Produced in partnership with NewCo Shift.

Cloud will flourish, edge will bloom

In late 1996, George Favaloro and Sean O'Sullivan, two executives from Compaq, realized that ubiquitous Internet connectivity would change where information processing could take place. Rather than occurring in office server rooms or on the desktop, computing could shift to servers accessed over the Internet. They called this shift in the locus of computation the 'cloud'. The term didn't stick (back then, anyway), and Compaq was swallowed by HP in 2002.

But of course, the theme they identified took root. In 2006 Google’s then-chief executive, Eric Schmidt, said: “I don’t think people have really understood how big this opportunity really is. It starts with the premise that the data services and architecture should be on servers. We call it cloud computing — they should be in a “cloud” somewhere.”
