Networks Without Networks
Over the last few days I’ve been crazy for emulation—that is, simulating old, busted computers on my sweet modern laptop. I’ve been booting up fake machines and tearing them down, one after the other, and not doing much besides. Machines I’ve only heard of, arcade games I never played, and programs I never used. Software about which I was always curious. And old favorites like MacWrite.
Hour after hour, this terrible fever. What the hell am I doing? I kept asking myself. Why am I forcing a fine new machine to pretend it is a half-dozen old, useless machines?
Eventually I realized: This might be about my friend Tom dying. At least I think so. I am not good at identifying my own motives. It usually takes me at least ten days and a number of snacks to go from feeling something to being able to articulate what I felt. Indeed, I got the news ten days ago, in an email from my friend Jim.
“Really sad news”
“Really sad news” was the subject. Tom died at 73, after an illness. Here is a picture of him from 1999. He is the one on your left.
Imagine having, in your confused adolescence, the friendship of an older, avuncular man who is into computers, a world-traveling photographer who would occasionally head out to, like, videotape the Dalai Lama for a few weeks, then come back and listen to every word you said while you sat on his porch. A generous, kind person who spoke openly about love and faith and treated people with respect.
We had fallen out of touch.
It was good to have known him.
The Amiga 1000
(What is it good for?)
I always knew Tom. He rented a room from my grandparents. When I was 12, my parents succumbed to my begging and bought me an Amiga computer. By coincidence Tom had one too. Amigas were in the air because we lived near their manufacturer, Commodore, in Pennsylvania.
The Amiga looked like this:
And it was oddly good at animating things.
But the Amiga had a problem. The IBM PC was for business; you used it to track stocks and type up reports. The Apple Macintosh was for fancy business, for work done in art galleries or loft apartments. You might use it to publish a newsletter for gourmets who were also physicists.
And the Amiga was for…well. It was originally conceived as a videogame console, then the game industry faltered—this was in 1983, when Atari had produced so many excess videogames that it had to bury them in the desert to get rid of them. Commodore bought the Amiga designs in the hopes of competing with the Macintosh.
But Commodore was best known for its “bitty boxes,” cheap, popular machines like the VIC-20 and Commodore 64 that sold at Sears. Could it compete?
The Amiga launch event was held in 1985 at Lincoln Center in New York City. A tall man named Robert Pariseau (head of software) emceed, in tuxedo and tremendous ponytail. They enlisted the Amiga to make pie charts, forced it to speak and “multi-task,” and made it become an IBM PC to run a spreadsheet.
To conclude the night Andy Warhol, in his wig and brightly-colored glasses, came on stage along with Debbie Harry. He used the Amiga to snap a photo of Debbie Harry’s face and began to manipulate it live, using a mouse. Debbie Harry sat passively with her eternal pout, but Warhol had fun messing with her hair on the screen. This was a mistake, because both Debbie Harry and Andy Warhol were almost obscenely beautiful. The lovely little machine, juxtaposed with two people who actively epitomized sophistication, couldn’t hold its own. The whole thing just seems weird.
That was the launch. Now they had to sell it to the masses. Here Commodore transformed confusion into bafflement.
“As our TV screen is filled with the computer screen on which appears a wide-eyed fetus,” wrote the New York Times in 1985, describing the first Amiga commercial, “the voiceover delivers practically its only line in the 60-second commercial: ‘Re-experience the mind unbounded.’”
No one knew what they were doing, so they retreated to gibberish. But it never got better. Consider this video from 1987. Take the two-and-a-half minutes to watch it. Let it inside. Be with me in 1987.
So. That’s the Amiga. It found niches—it was big in Europe, a favorite of hackers and programmers alike; it was beloved of video producers like my friend Tom. But it never became a true global platform. Microsoft Windows 3 came out in 1990, the beginning of a barely-challenged 20-year ascendancy; Commodore was out of business by 1994.
Like all also-ran underdogs the Amiga inspired a maniacal affection in its users that took decades to exhaust. Here’s an Amiga user in 2000 or later (his screenshot of “OS 3.9” can be used to date the video). Note that he is singing the same song from the 1987 video.
It was fun while it lasted.
In 1987 my father and I went to the Amiga users’ group meetings in nearby Downingtown. These were held in a basement of a computer store with wood paneling. At the users’ group you could buy floppy disks for a few bucks, and on them would be items downloaded from local bulletin board systems. Hardly anyone had modems, so this was how files were transmitted. Tom would be at the user group meeting sometimes. Or he’d pick me up and drive me over if my father was busy.
This is how a network comes together. You bought something and then you wanted to understand it, so you went out and found other people. You found them via posters in hallways, or word of mouth, or by purchasing a magazine that caught your eye and then reading the ads in the back.
You’d go to a party and browse through the host’s record collection, chat about the album, and maybe decide to go see a concert together—or in some cases you’d start a band.
Another example: Steve Wozniak built the Apple I computer because he knew the people at the Homebrew Computer Club would think it was cool. He wanted to blow their minds, and he did. A lot of times when people talk about Apple, Inc.—one of the largest social and corporate structures in the world, larger than many governments—they talk about design, manufacturing, and vertical integration. But the main driver for Apple’s early excellence was that Wozniak wanted to look cool in his little nerd network. He’d show his work to friends and they’d show him what they were working on. Without that, nothing that followed.
Commodore considered buying Apple back when Apple was in a garage. Steve Jobs was interested in selling. It fell through.
A year after the Amiga showed up—I was 13—my life started to go backwards. Not forever, just for a while. My dad left, money was tight. My clothes were the ones my dad left behind, old blouse-like Oxfords in the days of Hobie Cat surfwear. I was already big and weird, and now I was something else. I think my slide perplexed my peers; if anything they bullied me less. I heard them murmuring as I wandered down the hall.
I was a ghost and I had haunts: I vanished into the computer. I had that box of BBS floppies. One after another I’d insert them into the computer and examine every file, thousands of files all told. That was how I pieced together the world. Second-hand books and BBS disks and trips to the library. I felt very alone but I’ve since learned that it was a normal American childhood, one millions of people experienced.
Often—how often I don’t remember—I’d go over to Tom’s. I’d share my techniques for rotating text in Deluxe Paint, show him what I’d gleaned from my disks. He always had a few spare computers around for generating title sequences in videos, and later for editing, and he’d let me practice with his videocameras. And he would listen to me.
Like I said: Avuncular. He wasn’t a father figure. Or a mother figure. He was just a kind ear when I needed as many kind ears as I could find. I don’t remember what I said; I just remember being heard. That’s the secret to building a network. People want to be heard. God, life, history, science, books, computers. The regular conversations of anxious kids. His students would show up, impossibly sophisticated 19-year-old men and women, and I’d listen to them talk as the sun went down. For years. A world passed over that porch and I got to watch and participate even though I was still a boy.
I constantly apologized for being there, for being so young and probably annoying, and people would just laugh at me. But no one put me in my place. People touched me, hugged me, told me about books to read and movies to watch. I was not a ghost.
When I graduated from high school I went by to sit on the porch and Tom gave me a little brown teddy bear. You need to remember, he said, to be a kid. To stay in touch with that part of yourself.
I did not do this.
General Instructions on
How to Emulate
Emulating is a nerdy hobby that takes an enormous amount of time. If you enjoy reading manuals for spreadsheet programs from 1983, you’ll love software emulation. (If your eyes glaze over at the thought, just scroll along your way.)
You typically need four things to emulate an old computer:
- The emulator software. This lets your computer pretend it is a different kind of computer. Emulators range from commercial tools like VMware Fusion, which lets you run a Windows PC on a Mac, to free projects like MAME, which pretends to be every kind of arcade machine, or VICE, which emulates the early Commodore computers. You can also buy emulators, like Amiga Forever or C64 Forever. Buying things means it’s all done for you and you can ignore the steps that follow.
- The ROM files. There’s a liminal kind of software called the BIOS, for Basic Input/Output System. This is the nervous system of a computer; it’s what’s already installed even before a computer starts to load its operating system. For most systems some enterprising nerd has pulled the ROMs out of the hardware and given them a name like KXK1CFJ.ROM. These files are almost always copyrighted, so to find them you have to Google around for things like “mac plus ROM” and wade through a lot of weird hedging language to find what you need. Just look for phrases like: “You cannot download this file unless you own a ColecoVision Model X Grobbler Frog Controller” followed by a big blue link to the file you cannot download, that you must never download. The entire world of emulation is filled with references to very specific things that you should not seek out, that you must never Google, that you should definitely not obtain.
- An operating system. Once you have the emulator and the ROM it’s like you actually own a new, old computer—but it lacks an operating system. Want to experience System 6.0.8 for your Mac? Workbench 2 for the Amiga? Microsoft DOS 6.22? You’ll likely make a fake hard drive. Then you actually install the real, authentic operating system onto the fake hard drive. Sometimes you will need to “insert” fake “floppy disks” into the fake “floppy drive” in order to install the real operating system onto the fake “hard drive” on the fake “computer.” (This is accomplished by clicking buttons.) Then you’ll “reboot.” It’s all very weird.
- Software. You might luck out and find a virtual hard drive pre-loaded with hundreds of applications; then you can download that whole bad boy and just coast. I’ve got one for the Mac, it’s 542 megabytes of joy. Want to use Photoshop 1.0 in black and white with German-language menus? No? Well, I do. More likely you will need to download virtual disks. You can find these by searching around for the word “abandonware” plus the name of the operating system you like. Sometimes you will find lovingly tended sites like Macintosh Garden. There are also the TOSEC collections, which have tens of thousands of archived computer programs to choose from; just about every Amiga program is available. In general, abandonware websites are either badly categorized nightmares that require you to click five affiliate links to download a 20-kilobyte DOS file, or hyper-categorized massive sets of tens of thousands of disks created by obsessive completists. Either way, whoa.
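The “fake hard drive” in the steps above is less exotic than it sounds: to most emulators it is just one big file of bytes that the guest operating system treats as a blank disk. Here is a minimal sketch in Python; the file name, size, and .hdf extension are my own illustrative choices, and real emulators (FS-UAE, Basilisk II, and so on) each expect their own particular formats.

```python
# Sketch: a blank "hard drive" image for an emulator is just a big,
# empty file. The guest OS will see it as an unformatted disk.
# Name, size, and extension here are illustrative, not any emulator's
# actual requirement.

SIZE_BYTES = 100 * 1024 * 1024  # a 100 MB "disk"

with open("amiga.hdf", "wb") as f:
    f.seek(SIZE_BYTES - 1)  # jump to the last byte of the disk-to-be...
    f.write(b"\x00")        # ...and write one zero, creating a sparse file
```

From there the emulator gets pointed at the ROM file and at this image, and the real operating system is installed onto it, fake floppy by fake floppy.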
The world of retro-computing is scattered, chaotic, murky, and legally suspect—although major progress is being made by the Internet Archive, among other organizations, at bringing old software into the light. To my knowledge, no one has ever been prosecuted for downloading twenty-year-old word processing software.
Last week my friend Jim emailed:
And all the Amiga memories. Man oh man. We’d trade equipment and software. He had a name for it: let’s “play ‘puters” he’d say. That’s Tom too. We were always hitting each other up for software. Wrote many a long serial number down for him.
In 2002, Jim and Tom and I got together and went down to an Amiga festival at a hotel in Maryland. It was—even by the standards of nerd events—well, it was rough. Men had Amiga logos woven into their beards. People with ailments sold disks out of worn cardboard boxes. I had expected it to be like an alumni weekend, a chance to get together and chat about old times. But these people were angry. I remember driving back and feeling stupefied. How could all that sweetness have leached from the world? I blamed Microsoft Windows.
But that was wrong. In truth, there was nothing to blame. Companies come and companies go and things turn out differently than you’d hope.
That’s the last long stretch of time I spent with Tom.
I don’t know why I drifted. He never took to email. I wanted distance from my family, from my childhood. I still know his phone number by heart. At least once a month I’d think of calling. Of going down for a visit.
We kept very loose tabs on each other through our mutual friend Jim. Using that oldest of networks, people talking about each other.
My Week of
Here is a late-1970s vintage
Xerox Alto running Bravo
The machine was a Xerox Alto. The word processor was called Bravo. The software is from 1976 or a year or two later—hard to tell—although the quote I typed in is from Steve Jobs in 2005. The Alto had a command line and no icons but used a mouse with three buttons. It also featured drawing programs and games and was typically plugged into a network. It was created to serve the needs of a research community, to bind them together and give them a common language with which to express ideas about technology. It cost more than a house.
Here is an early version of
MacOS running on a
Mac Plus, circa 1986
The Mac did not invent that much—but do we criticize Giorgio Armani for not inventing the suit? It turned the inside of the computer into a place with warm little windows. It was expensive and a little snobby — like a nice mint-green polo shirt with a little black alligator embossed above your heart. It saw mass, popular computing not as a set of commands, but as an ongoing, continual experience. That people are so eager to share that experience, how urgent and real it can feel to them, is why Apple is so unbelievably huge today.
Here is a Macintosh Plus
running Smalltalk-80, 1987
This is a Mac, like the Mac above, but it’s running Smalltalk-80. Smalltalk was a product of the Xerox Alto culture, and was created along with the Alto. It was where many of the current ideas that are prevalent in computing — object-oriented coding, windowing systems, and graphics — were first refined into usable software products.
When the Mac people went over to study what Xerox was doing, they were copying Smalltalk ideas. (Adele Goldberg, one of the co-creators of Smalltalk, refused to show Steve Jobs the system until her bosses gave express permission. Which they did. Apple was paying to see.)
Smalltalk-80 is a kind of programming language but you don’t run programs independently; rather, you open up large-ish “image files” that are themselves a kind of virtual machine — so here we are emulating a Mac and then running another fake computer, in the form of the Smalltalk virtual machine, atop it.
In 2014, Smalltalk is an idea that keeps going, in the form of a programming environment called Squeak, and in other versions, too. The idea of the Mac keeps going too. The Alto keeps going in the form of windows and good fonts. This is important to me, that sense of continuity. The typical story of technology is one of progress; your floppies get old and decrepit and you can’t see your old data, that’s basically your fault, and who wants to live in the past? But human networks often stick around for decades, half-centuries. People have been working on Smalltalk for more than 40 years, for as long as I have been alive. Just continually thinking about it, how to improve it, how to make it popular, how to get the world to acknowledge it. It binds them together. I respect that.
Here is an Amiga running TextCraft, its first word processor, which cost $99.95 in 1987
This program taught me to write and think in paragraphs. I spent hours here, sorting my young thoughts. Even back in 1987 we knew this program was an ugly disaster. Note the use of a little mucilage jar for…pasting. Programs like WordStar or WordPerfect were much, much better, but they only ran on MS-DOS back then. Or more obscure operating systems like CP/M. So we worked with what we had and we talked about it and made do.
Here is a recent installation of Plan 9,
a descendant of Unix created in the 1990s
Plan 9 is a strange one. Here it is running its window manager, Rio. (Its logo is a rabbit named Glenda.) The Acme text editor, shown above, is a major part of the Plan 9 OS and is a whole world unto itself. Everything in Acme, including the menus, is pure, editable text. This seems very light and easy but the more you think about it the weirder it gets.
I don’t know exactly why I ran this operating system during my binge. It came around in the 1990s as a possible successor to Unix. It did not become the successor to Unix, but the ideas within it are reinvented, in a debased and half-considered form, about once an hour in the open source community.
Here is an Emulated LISP Machine
running OpenGenera from the 90s
The reason I was running a Lisp machine is that it represents this very specific vision of technology, where computers were deeply powerful and infinitely customizable and incredibly easy to manipulate.
LISP is a computer language. But for a period in the 1980s there were Lisp machines: computers built entirely in a single language, from their weirdest inner parts to their windows and mouse cursors. Everything unified, pure, and open to inspection and manipulation.
What you see above is not a Lisp machine per se, but a Lisp machine simulator designed to run atop a Unix system—like Smalltalk atop the Mac.
It is a very weird experience. It feels like a machine for monks or nuns. Baffling. But there is this weird sense of raw power, like you have been handed the keys to a nuclear-powered submarine. It might take you a few months or years to learn the mysteries. That’s fine. LISP won’t change.
Smalltalk was deeply inspired by the LISP language. Everything was deeply inspired by LISP, because it’s so fundamental. People either learned it, and were inspired, or refused to learn it, and reinvented it in half-assed form.
Here is a Windows 3.1 Machine
Running Microsoft Paint, circa 1992
This is Windows. It is a layer above an operating system called MS-DOS. It was made by a company in Seattle. It changed the world economy by being all things to all people. You can no longer be all things to all people when it comes to computers, but Microsoft keeps trying. Windows is an accurate representation of what people expect from computers, which is on the one hand fascinating and on the other a tragedy.
It really worked for tens of millions of people and changed their computing lives. And there was some wonderful software that resulted. That said: Windows is the Super Bowl halftime show of operating systems. Given what everyone got paid, and how many people were involved, you’d think it would be more memorable.
Here is a NeXT OpenStep Environment
Running Interface Builder, circa 1997
This last one, the NeXT machine, is complicated. I never had a NeXT machine, but NeXT machines haunt our world. As with Lisp machines and Smalltalk, their users were incredibly vociferous, excited people who talked about using them in almost religious tones.
The difference is that NeXT’s OS went from being a somber lesson about being too ambitious to being one of the dominant operating systems in the world, and everyone still talks about it in religious tones.
I guess I need to explain.
on and off
In his commencement speech at Stanford University in 2005, Steve Jobs described taking a calligraphy course as an undergraduate at Reed College in Oregon. “I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great.” He went on:
None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography.
Yesterday I booted up the emulator for the Xerox Alto. The Alto was arguably the first modern general-purpose computer — a big screen, modern software, and you used a mouse to point. It was never generally available but it was the Velvet Underground of computers, in that everyone who saw it went on to make their own computer industry. As I wrote above, when Apple went to Xerox to license its technology for MacOS, it was copying ideas that had been created on Alto computers.
When you boot up the Xerox Alto the fonts are right there, listed: Helvetica was a first-class citizen of that operating system, many years before that first Mac pinged awake.
After he was fired from Apple, Jobs went off and built NeXT. The NeXT computer was a hodgepodge: bits of the Alto, from Xerox research; bits of Unix, from Bell Labs, the research arm of the giant US telephone monopoly. Its core language, Objective-C, was an unholy union of Xerox “object-oriented” approaches and the Bell Labs “C” programming language. They also built a tool to ease the programmer’s labors—a software development tool called Interface Builder. That started as a government-funded project in France, was turned into a feature of a version of the LISP programming language that ran on Macintoshes, and then found its way to NeXT. Its direct descendant is what you use today to build iPhone apps.
Many roads through computing history lead back to Steve Jobs, or pause along the way at his office. But they don’t stop there. They go back to INRIA’s labs in France, back to Bell Labs in New Jersey, to MIT in Massachusetts, back to Xerox’s Palo Alto Research Center—a surprisingly short drive from One Infinite Loop, Apple’s headquarters in Cupertino.
And further back still: To people reviewing each other’s album collections, back to the post office, the railway systems, radio networks, sporting events. People building roads. Networks are natural things.
In their day NeXT systems were seen as insanely expensive, bordering on pretentious; they were never intended for the masses but had a strong focus on the academic market. NeXT looked down on the world of popular computing from a very high window; meanwhile, Windows sold hot dogs on the street. (“Write software for it?” said Bill Gates of the NeXT. “I’ll piss on it.”)
You can do good work in high towers. The World Wide Web was bootstrapped on a NeXT machine. The videogame “Doom” was written on NeXTs. And famously, Apple bought NeXT in 1997 for $400 million ($50 million of that in debt), and just as famously Jobs began to overtake Apple, to make it his own again. It was not smooth. When they turned NeXTSTEP into Mac OS X people were baffled. They made videos to complain—years before YouTube, videos that you had to download or stream from random websites over slow connections. The one below was a favorite. A friend downloaded it so that we could watch it together on his laptop.
The organized environment of Mac OS 9 was being taken away. We’d all been moved to a new house in the middle of the night. What is this? Why did they change it? What is it for? It wasn’t clear. Because of the iPod and iTunes, Apple was now discussed as a music and entertainment company that also did computers. What was this? What was it for?
Then came the iPhone. At first there was no App Store, no way to run your code within it, and people railed and gnashed their teeth. But then there was an App Store. The way you built apps was with Objective-C and Interface Builder. No other approach was permitted. There was gnashing of teeth, but less so. Not only was the NeXT ideology successful, but it was enforced. Aligning yourself with its methods was the price one paid to participate in an enormous cultural land rush. Today Apple is worth 1,000 times as much money as it paid for NeXT.
If you are reading this piece on an iPhone, a Mac, or an iPad, you’re using tools built with Interface Builder, all the way back.
“Good artists copy,” Jobs once said, misattributing it to Picasso. “Great artists steal.” Perhaps a more accurate statement would have been: Great popularizers license.
When people get rich it always ends up sounding like destiny. And the actual narratives sound too small, too fragile—and impossible to reproduce. Which makes for a bad story. Good stories are ones you can learn from. Imagine standing in front of the graduating class of Stanford and saying,
Man, I just don’t know. Wozniak wanted to show off for his nerd friends. I was ready to sell to Commodore. Xerox was so focused on the 1990s they forgot about the 1980s. NeXT, we just got further and further into the quagmire. Pixar, before Toy Story, it was the only hardware company less successful than NeXT. The iPhone launched without an App Store. But people were drawn to me, and I told them what they needed to hear in order to make each other rich. So do that: Go out there and tell people what they need to hear in order to make each other rich. When something works say that was the plan all along.
That would be a terrible commencement speech.
Technology is What We Share
Technology is what we share. I don’t mean “we share the experience of technology.” I mean: By my lights, people very often share technologies with each other when they talk. Strategies. Ideas for living our lives. We do it all the time. Parenting email lists share strategies about breastfeeding and bedtime. Quotes from the Dalai Lama. We talk neckties, etiquette, and Minecraft, and tell stories that give us guidance as to how to live. A tremendous part of daily life involves the exchange of technologies. We are good at it. It’s so simple as to be invisible. Can I borrow your scissors? Do you want tickets? I know guacamole is extra. The world of technology isn’t separate from regular life. It’s made to seem that way because of, well…capitalism. Tribal dynamics. Territoriality. Because there is a need to sell technology, to package it, to recoup the terrible investment. So it becomes this thing that is separate from culture. A product.
I went looking for the teddy bear that Tom had given me, the reminder to be a child sometimes, and found it atop a bookshelf. When I pulled it down I was surprised to find that it was in a tiny diaper.
I stood there, ridiculous, a 40-year-old man with a diapered 22-year-old teddy bear in my hand. It stared back at me with root-beer eyes.
This is what I remembered right then: That before my wife got pregnant we had been trying for kids for years without success. We had considered giving up.
That was when I said to my wife: If we do not have children, we will move somewhere where there is a porch. The children who need love will find the porch. They will know how to find it. We will be as much parents as we want to be.
And when she got pregnant with twins we needed the right-sized doll to rehearse diapering. I went and found that bear in an old box.
I was sitting on Tom’s porch in 1992 when he handed me that toy. A person offering another person a piece of advice. Life passed through that object as well, through the teddy bear as much as through the operating systems of yore.
Now that I have children I can see how tuned they are to the world. Living crystals tuned to all manner of frequencies. And how urgently they need to be heard. They peer up and they say, look at me. And I put my phone away.
And when they go to bed, protesting and yowling before conking out, I go to mess with my computers, my old weird imaginary emulated computers. System after system. I open up these time capsules and look at the thousands of old applications, millions of dollars of software, but now it can be downloaded in a few minutes and takes up a tiny portion of a hard drive. It’s all comically antiquated.
Moore’s law, the relentless doubling of computing power, means that the digital past gets smaller every year. What is left are the tracings of hundreds of people, or thousands, who, 20, 30, 40 years ago found each other and decided to fabricate all this digital stuff. This glittering ephemera. They left these markings and moved on. Looking at the emulated machines feels…big, somehow. Like standing at a Grand Canyon with a river of bright green pixels running along the bottom.
When you read oral histories of technology, whether of successes or failures, you sense the yearning of people who want to get back into those rooms for a minute, back to solving the old problems. How should the mouse look? What will people want to do, when we give them these machines? How should a window open? Who wouldn’t want to go back 20 years—to drive again into the office, to sit before the whiteboard in a beanbag chair, in a place of warmth and clarity, and give it another try?
Such a strange way to say goodbye. So here I am. Imaginary disks whirring and screens blinking as I visit my old haunts. Wandering through lost computer worlds for an hour or two, taking screenshots like a tourist. Shutting one virtual machine down with a sigh, then starting up another one. But while these machines run, I am a kid. A boy on a porch, back among his friends.