History (and future) of Computing in One Chart

Pushkar Ranade
3 min read · Oct 15, 2015


In a conversation with The Atlantic in 2013, outgoing Intel CEO Paul Otellini shared his thoughts on the evolution of the semiconductor industry. (Link) During the interview, Paul sketched out the “history of the computer industry in one chart.” For nearly four decades, advances in computing have driven the growth of the semiconductor industry in general and Intel in particular.

Indeed, this one chart succinctly illustrates how the growing ubiquity of computing has driven down the cost structure of the semiconductor industry while also shaping the evolution of semiconductor technology over several decades. Otellini noted that an exponential drop in price per unit has enabled an equally exponential increase in the number of computing units sold over time.

Starting from mainframes that cost tens of thousands of dollars each and shipped only in the thousands of units per year, the industry evolved to shipping well over 300 million PCs per year at less than $1000 per unit. Smartphones and tablets are on track to ship well over 3 billion units at less than $100 per unit. As market trends make evident, we are on the cusp of a new wave that will make computing even more ubiquitous by shipping over 30 billion units at less than $10 per unit (see figure below).

Each transition in computing was enabled by a re-architecting of the underlying silicon platform.
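A quick back-of-the-envelope reading of these figures (my own arithmetic using the round numbers quoted above, not data taken from the chart itself) makes the pattern explicit: each wave trades roughly a tenfold drop in price per unit for a tenfold increase in unit volume, leaving the implied dollar volume of each wave on the same order of magnitude.

$$
\begin{aligned}
\text{PCs:} &\quad 3\times10^{8}\ \text{units} \times \$1000/\text{unit} \approx \$300\ \text{billion} \\
\text{Smartphones/tablets:} &\quad 3\times10^{9}\ \text{units} \times \$100/\text{unit} \approx \$300\ \text{billion} \\
\text{IoT devices:} &\quad 3\times10^{10}\ \text{units} \times \$10/\text{unit} \approx \$300\ \text{billion}
\end{aligned}
$$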

Known as the Internet of Things (IoT), this new wave of computing will be powered by ultra-cheap, ultra-low-power, and highly integrated chips, connected to the cloud or a network and embedded in virtually every physical object around us. Many analysts expect this market to grow to well over 20 billion units by the end of the decade. (Link)

To predict how the semiconductor industry will evolve in this new computing epoch, it is instructive to understand how semiconductor technology evolved over prior computing transitions. With every prior transition, the industry re-architected the silicon platform to meet the requirements of the new computing wave. Companies that embraced the new architecture early were able to support the aggressive reduction in price point and establish leadership. On the other hand, incumbents that failed to adapt quickly lost ground to disruptors and struggled to recover. The same pattern holds in the software industry, where a re-architecting of the operating system (OS) and application software has enabled each successive wave of computing (see table below).

During prior computing transitions, the re-architecting of silicon platforms was accompanied, and indeed enabled, by the geometrical scaling of transistor technology, which provided higher performance and lower power at successively lower cost points. There are indications that these historical cost/power/performance trajectories may not continue through the coming transition, as geometrical transistor scaling slows down and cost per transistor plateaus or even inches higher.

For the first time in the history of the semiconductor industry, the next wave of computing will likely be driven not by the most advanced transistor node but by lagging (older) nodes. Over time, the driving metric will shift from “performance per watt” to “energy per operation,” and chip- and system-level integration will become the key enablers.
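To make the change in metrics concrete (a framing of my own, not taken from the interview or chart): measured at full throughput, performance per watt and energy per operation are simply reciprocals of one another,

$$
\frac{\text{performance}}{\text{watt}} = \frac{\text{operations/second}}{\text{joules/second}} = \frac{\text{operations}}{\text{joule}},
\qquad
\frac{\text{energy}}{\text{operation}} = \frac{\text{joules}}{\text{operation}} = \left(\frac{\text{performance}}{\text{watt}}\right)^{-1}.
$$

The difference is one of emphasis: performance per watt rewards extracting the most throughput from a fixed power budget, while energy per operation rewards completing a given task with as few joules as possible, even if the chip runs slowly or sits idle most of the time, which is precisely the operating regime of a battery-powered, duty-cycled IoT device.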

To enable the age of ubiquitous computing and the Internet of Things, the semiconductor industry will adapt once again, re-architecting the silicon platform to support a sub-$10 unit price point. The next post will discuss how the industry adapted to prior waves of computing and how the current transition to ubiquitous computing is likely to play out across the global semiconductor industry.

