
Opinion

Algorithms are not enough

The next breakthrough in AI requires a rethinking of our hardware

Samuel Flender
Published in TDS Archive
4 min read · Oct 6, 2020


Photo by Umberto on Unsplash

Today’s AI has a problem: it is expensive. A single forward pass through ResNet-152, a modern computer vision model, takes roughly 10 billion floating point operations, and even that figure is dwarfed by modern language models. Training GPT-3, the recent natural language model from OpenAI, is estimated to have required about 300 billion trillion (3×10^23) floating point operations, which costs at least $5M on commercial GPUs. Compare this to the human brain, which can recognize faces, answer questions, and drive cars, powered by little more than a banana and a cup of coffee.
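To see where a number like that comes from, here is a back-of-the-envelope sketch of the GPT-3 training cost. The throughput and price constants are assumptions (roughly a V100-class GPU at a typical cloud rate), not official figures:

# Back-of-the-envelope GPT-3 training cost estimate.
# All constants below are assumptions, not official figures.

TOTAL_FLOPS = 3.14e23        # ~300 billion trillion floating point operations
GPU_FLOPS_PER_SEC = 28e12    # assumed effective throughput of one GPU (28 TFLOPS)
PRICE_PER_GPU_HOUR = 1.50    # assumed commercial cloud price in USD

gpu_seconds = TOTAL_FLOPS / GPU_FLOPS_PER_SEC
gpu_hours = gpu_seconds / 3600
cost_usd = gpu_hours * PRICE_PER_GPU_HOUR

print(f"{gpu_hours:.2e} GPU-hours, ~${cost_usd / 1e6:.1f}M")
# -> roughly 3.1e6 GPU-hours, ~$4.7M, in line with the ~$5M figure above

Even generous assumptions about hardware efficiency leave the bill in the millions of dollars.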

How did we get here?

We’ve come a long way. The first computers were special-purpose machines. In 1822, British mathematician Charles Babbage designed the ‘difference engine’, a machine whose sole purpose was tabulating polynomial functions. In 1958, Cornell professor Frank Rosenblatt built the ‘Mark I’, a physical incarnation of a single-layer perceptron for machine vision tasks. Hardware and algorithm, in these early days, were one and the same.

The unity of hardware and algorithm changed with the introduction of the von Neumann architecture, a chip design consisting of a processing unit for computation and a memory unit for storing both data and program…
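To make that design concrete, here is a toy sketch of a von Neumann machine in Python. The four-instruction set (LOAD, ADD, STORE, HALT) is hypothetical and purely illustrative; the point is that a single memory holds both the program and the data, and the processing unit fetches and executes one instruction at a time:

# Toy von Neumann machine: one memory for program AND data,
# one processing unit that fetches, decodes, and executes.
# The instruction set is hypothetical, for illustration only.

memory = [
    ("LOAD", 5),     # address 0: load memory[5] into the accumulator
    ("ADD", 6),      # address 1: add memory[6] to the accumulator
    ("STORE", 7),    # address 2: write the accumulator to memory[7]
    ("HALT", None),  # address 3: stop
    None,            # address 4: unused
    2, 3,            # addresses 5-6: data, side by side with the program
    None,            # address 7: the result goes here
]

accumulator, pc = 0, 0           # processing unit state: register + program counter
while True:
    op, addr = memory[pc]        # fetch the instruction from the same memory as the data
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[7])  # -> 5

Note how every step shuttles data back and forth across the boundary between the memory unit and the processing unit; that boundary is where the two once-unified halves of the machine now meet.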

