Opinion
Algorithms are not enough
The next breakthrough in AI requires a rethinking of our hardware
Today’s AI has a problem: it is expensive. Training ResNet-152, a modern computer vision model, is estimated to take around 10 billion floating-point operations, a figure dwarfed by modern language models. Training GPT-3, the recent natural language model from OpenAI, is estimated to take 300 billion trillion floating-point operations, which would cost at least $5M on commercial GPUs. Compare this to the human brain, which can recognize faces, answer questions, and drive cars on as little as a banana and a cup of coffee.
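As a rough sanity check on that price tag, the back-of-envelope arithmetic below re-derives it in a few lines of Python. The total FLOP count is the figure quoted above; the sustained GPU throughput and the hourly price are assumptions of mine (roughly a V100-class accelerator at typical cloud rates), not numbers from this article.

```python
# Back-of-envelope check of the "$5M to train GPT-3" figure.
# total_flops comes from the text; the throughput and price are assumptions.

total_flops = 3e23            # ~300 billion trillion FLOPs to train GPT-3
gpu_flops_per_sec = 28e12     # assumed sustained throughput, ~28 TFLOPS
price_per_gpu_hour = 1.50     # assumed cloud price in USD per GPU-hour

gpu_seconds = total_flops / gpu_flops_per_sec
gpu_hours = gpu_seconds / 3600
cost = gpu_hours * price_per_gpu_hour

# Roughly 3 million GPU-hours and ~$4.5M, in the ballpark of the $5M above.
print(f"{gpu_hours:,.0f} GPU-hours, ~${cost / 1e6:.1f}M")
```

Nudge the assumed throughput or price and the total moves accordingly, but under any realistic setting the bill stays in the millions of dollars.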
How did we get here?
We’ve come a long way. The first computers were special-purpose machines. In 1822, the British mathematician Charles Babbage designed the Difference Engine, which had the sole purpose of calculating polynomial functions. In 1958, Cornell professor Frank Rosenblatt created the Mark I Perceptron, a physical incarnation of a single-layer perceptron built for machine vision tasks. Hardware and algorithm, in these early days, were one and the same.
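For readers who have never met a perceptron, the sketch below shows the kind of computation Rosenblatt’s machine wired into hardware: a weighted sum of its inputs pushed through a hard threshold, with the weights adjusted by the classic perceptron learning rule. The toy data (learning a logical AND) is purely illustrative and has nothing to do with the Mark I’s actual photocell input.

```python
import numpy as np

def predict(weights, bias, x):
    # Fire (output 1) if the weighted sum of inputs exceeds the threshold.
    return 1 if np.dot(weights, x) + bias > 0 else 0

def train(samples, labels, epochs=20, lr=0.1):
    weights = np.zeros(samples.shape[1])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(weights, bias, x)
            # Perceptron rule: nudge weights toward misclassified examples.
            weights += lr * error * x
            bias += lr * error
    return weights, bias

# Toy example: learn a logical AND of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train(X, y)
print([predict(w, b, x) for x in X])  # expected: [0, 0, 0, 1]
```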
The unity of hardware and algorithm changed with the introduction of the von Neumann architecture, a chip design consisting of a processing unit for computation and a memory unit for storing data and program…