The Computer Revolution Has Yet to Happen


A combination of this “carry anywhere” device and a global information utility such as the ARPA network or two-way cable TV, will bring the libraries and schools (not to mention stores and billboards) of the world to the home. One can imagine one of the first programs an owner will write is a filter to eliminate advertising!

~ Alan Kay, “A Personal Computer for Children of All Ages” (1972) [1]

Blocking ad blockers on iOS

Alan Kay quietly helped invent the future through sound reasoning, a strong philosophy, and unabashed dreaming. He worked as a top researcher at institutions such as Xerox PARC, Atari, Apple, Disney, HP, and Kyoto University, making contributions that range from object-oriented programming to children’s learning tools, a field in which he continues his research today.

The fallout from Apple’s decision to allow ad blocking software on their iOS devices made me think back to Kay’s 1972 spot-on description of the world to come. When describing these “carry anywhere” devices, he only seemed to miss the mark on the idea that a tablet or phone owner would ever write software that could eliminate advertising.

This small point, a tablet owner writing their own software, makes a big difference. Today’s most sought-after and “innovative” digital devices are mostly conceived as advanced telegraphs (mobile phones) and interactive paper (tablets), not the next incarnation of the personal computer. To understand why I think that’s true, we need to take a step back and look at how we got here.

The Revolution That Never Was

Commodore 64

We’re living in a time that allows us to observe the very first steps of a digitally native generation. Parents are often caught marveling at their extremely young child’s grasp of touch screen electronics. It’s an implicit endorsement of today’s simple user paradigm.

My digital experience as a 5-year-old in the 1980s was radically different. Our TI-99/4A booted straight into a programming language called BASIC. “Home computing,” as it was commonly referred to at the time, was advertised as a system that helped mom balance the checkbook and allowed dad to dial in and check the baseball scores. The really savvy companies, like Commodore and Atari, also sold to the children: video games were the gateway that mom and dad could use to trick their children into learning computer programming — the language of the future.

Some QBasic code I wrote when I was about 14. It was a database of the video games I owned.

I know how to program, so I suppose I fulfilled the promise of home computing. But not everybody was going to learn BASIC. Even the most optimistic computer scientists considered BASIC a stepping stone to a better, and more expressive, tool.

The Killer Application

Alan Kay’s Dynabook (L) & Steve Jobs’ iPad (R)

Alan Kay’s “Dynabook,” described in detail in 1977, is often cited as the antecedent to today’s tablets because of the device’s form factor and media-rich environment.

But looking at the Dynabook’s style misses the substantive philosophical underpinnings of the device. For example, it was essential that the Dynabook easily allowed children to play collaboratively — especially games that they made themselves.

That last part illustrates what today’s tablet is not. When computers booted into BASIC and it looked as if children were going to grow up knowing how to program, computing promised to become a unique vector of creativity for the masses. A new generation would grow into an adulthood where creating software was as common as telling stories and balancing the checkbook.

Kay knew that languages like BASIC were too limited to do the job. First, their design encouraged the user to build every program from scratch. Second, they made it difficult to conceptualize and model real-world problems in the digital world. He conceived of “object-oriented” software in which reusable components easily exchange messages with one another; simple applications could then be created through the way those objects interact.
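The message-passing idea can be sketched in a few lines of modern code. This is an illustrative Python sketch, not Kay’s Smalltalk, and all the class and method names here are invented for the example: each object exposes its behavior only through messages it receives, so an “application” is nothing more than objects exchanging messages.

```python
class MessageReceiver:
    """A reusable component that responds to named messages."""

    def send(self, message, *args):
        # Dispatch a named message to the matching handler, if any.
        handler = getattr(self, message, None)
        if handler is None:
            return f"{type(self).__name__} does not understand '{message}'"
        return handler(*args)


class Score(MessageReceiver):
    """Keeps a running point total."""

    def __init__(self):
        self.points = 0

    def add(self, n):
        self.points += n
        return self.points


class Announcer(MessageReceiver):
    """Formats text for display."""

    def announce(self, text):
        return f"*** {text} ***"


# An "application" is just objects exchanging messages.
score, announcer = Score(), Announcer()
score.send("add", 10)
total = score.send("add", 5)
print(announcer.send("announce", f"Score is {total}"))  # *** Score is 15 ***
print(score.send("fly"))  # Score does not understand 'fly'
```

Because every component speaks the same `send` protocol, a new object that understands `announce` could replace `Announcer` without touching the rest of the program — the reuse Kay had in mind, as opposed to building each application from scratch.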

Kay’s ideas had a huge impact, but not in the way he had hoped. For example, I have a half dozen ways to apply photo filters on my iPhone. None of these components talk to one another and, outside of some operating system-level libraries, they were all built from the ground up by highly paid engineers at Twitter, Instagram, Apple, and the rest.

The dream of Alan Kay’s Dynabook isn’t in the design of the hardware, it’s in the creation of an elegant tool set that enables a user to easily express themselves in the domain of computing — building games, interactive stories, virtual robots, and applications.

The Revolution

Playing an instrument or singing gives the performer just a little more insight into the underpinnings of music. Writing takes the author one step closer to understanding storytelling and communication. But using a tablet gets the user no closer to understanding the essence of computing.

The computer has provided us a low-cost music recording and distribution platform. It has built an entire network of storytelling and communication tools. These developments have been nothing short of revolutionary. But the computer revolution will happen when the medium of computing itself becomes accessible to more than a minority of specialists.

Marshall McLuhan & Quentin Fiore, “The Medium is the Massage” (1967)

The people who will upend Apple are going to be the ones who re-engage the principles that allowed Apple to upend IBM and DEC in the 1980s. It’s all about user empowerment and novel approaches to creativity and thought. However you feel about the ad blocking issue, Apple’s unilateral policy making shows how far we have to go.

Computing is a medium still waiting to be unlocked.


There is much more to this story! Check out Jack and the Machine, an interactive documentary that parallels the development of the PC with the life and times of Holocaust survivor Jack Tramiel, the founder of Commodore Business Machines and the man who did more than anybody else to build a “computer for the masses, not the classes.”


[1] Originally published in the Proceedings of the ACM National Conference, Boston (August 1972)