Mining the Past for Future Hardware
The future is full of computers that will talk to us, distributed networks, and AI-on-a-chip. All of this has been done before: nearly every technological advance coming up in the next few years was already built and sold decades ago.
At the Our Networks conference, I stumbled through a workshop deploying mesh networks with One Laptop Per Child computers. We clicked through the GUI to get the little computers to talk to one another, and soon enough it just worked.
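For the curious, here is roughly what that GUI was automating. The XO laptops shipped with an early, firmware-level draft of what became the 802.11s mesh standard; a modern Linux laptop with a mesh-capable Wi-Fi adapter exposes the same idea through the standard iw and ip tools. The sketch below is an illustration under those assumptions, not the OLPC's actual code, and the interface names, mesh ID, and address are placeholders.

```python
# Minimal sketch: join an open 802.11s mesh on Linux, the kind of thing
# the OLPC GUI hid behind a click. Requires root, a mesh-capable Wi-Fi
# adapter, and the iw/iproute2 tools. wlan0, mesh0, the mesh ID, and the
# address are placeholders to adapt to your own hardware.
import subprocess

PHY_IFACE = "wlan0"        # existing Wi-Fi interface
MESH_IFACE = "mesh0"       # mesh-point interface we create on top of it
MESH_ID = "workshop-mesh"  # every node that joins this ID becomes a peer
ADDRESS = "10.42.0.2/24"   # pick a unique address per machine

def run(*cmd):
    """Run a command and echo it, so each step of the setup is visible."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Add a mesh-point ("mp") interface on the Wi-Fi radio.
run("iw", "dev", PHY_IFACE, "interface", "add", MESH_IFACE, "type", "mp")

# 2. Bring the new interface up.
run("ip", "link", "set", MESH_IFACE, "up")

# 3. Join the mesh: 802.11s handles peer discovery and path selection
#    among all devices that joined the same mesh ID.
run("iw", "dev", MESH_IFACE, "mesh", "join", MESH_ID)

# 4. Give the node an IP address so ordinary traffic can flow over the mesh.
run("ip", "addr", "add", ADDRESS, "dev", MESH_IFACE)
```

Once two or more machines have run the equivalent steps with the same mesh ID, they can reach one another directly, with no router or cellular connection in between.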
I’d always heard that the OLPC project was a massive failure, so I was surprised to see a failed hardware product from 15 years ago being used to create mesh networks.
Very often, the Next Great Technology that will save us has already been produced, and the evidence of this can be found in landfills.
The workshop was led by Libi Rose Striegl, lab manager of the Media Archaeology Lab at the University of Colorado Boulder. Reviving a piece of hardware we’ve come to underestimate is part of the lab’s approach to education. The lab keeps a collection of functioning obsolete media, from early personal computers to magic lanterns and typewriters, because, as they put it, “the past must be lived so that the present can be seen.”
Today, we have products like goTenna that let any two smartphones communicate without a cellular connection. The Wi-Fi Alliance is slowly rolling out its own version of mesh networking, but it is worth looking at who has shipped devices with these capabilities before. The more we look, the more the future turns out to be a reprise of the past.
After the conference, my curiosity was piqued, and Libi took me on a video tour of the lab in Colorado, where they house everything from early desktops to obscure media storage devices.
Everything is maintained by the staff and hobbyist volunteers and kept in working order, so you can tinker and play with all of the technology. Alongside Apple IIs and other popular computers we might recognize from our younger days, the lab has unique items that most of us have never owned.
A real gem is the lab’s ZX Spectrum clone, built by hand by a man employed by the state-run chip manufacturer in Romania. He pilfered the parts from the factory a few at a time, fabricated his own PCB, and assembled the machine in secret. He even added a pseudo-Spectrum logo.
There is also a collection of what they’ve come to call apocryphal technologies, including polygraphs and the Scientology e-meter. Artist Jamie Allen did a residency at the lab, tearing down these devices and showing what’s happening behind the smoke and mirrors.
But what can engineers today learn from the history of hardware? Striegl pointed out that, like an evolutionary tree with its extinct ancestors, the history of computing has branches that simply died out. It’s not always clear why one approach flourished while others failed. Sometimes it was the right device at the wrong time, or an exciting technology was released just as another branch was dying.
Sometimes corporate monopolies simply overruled ideas that weren’t part of their ecosystem. Commodore died as a company despite making machines that were more flexible and usable than many of their competitors; it couldn’t hold on to its market share, and the world moved on. Going back to the points where a branch ends can reveal ideas and approaches that are relevant today but didn’t gain traction when first introduced.
The Canon Cat failed because it was a computer dedicated to a single task: work. Striegl characterizes this as embodying a misconception about the way people were going to use computers. But today task-dedicated hardware is taking off, ironically as a response to the way work has pervaded computing. Hardware that can’t get online, like the Light Phone, is increasingly popular, and a niche community is arising around portable word processors that block out all distractions by virtue of having none available.
One of the major changes we’ve seen in hardware design is a shift from modularity to black-box solutions. The Apple II is a favorite computer to work with because you can simply open the cover to get inside and swap out parts. The design philosophy at Apple has certainly changed a lot in the intervening 35 years.
There are certainly challenges to reintroducing modularity now that hardware has become so miniaturized, as the demise of Google’s Project Ara, an attempt to build a modular smartphone, demonstrates. But Striegl sees the black-box trend as dumbing down computing, robbing consumers of the opportunity to understand what their hardware is doing. Designers and engineers can revisit a time when hardware was more durable, repairable, and modular, and that could teach us a lot about how to make products that come to mean a great deal to the people who use them.
Commodore, Sinclair Research, and Atari might have failed commercially, but the long tail of the market lives on, with enthusiasts all over the world keeping the computers running.
The Media Archaeology Lab offers an opportunity to understand the history of computing, discover lost gems of the past, and inspire new approaches to hardware that take cues from values that were left behind. If you find yourself in Boulder, stop by and have a look; if you can’t make it out, you can always explore with a text adventure game.