The Iron Wireframe
Unpacking the military ancestry of user experience
The roots of User Experience are impossible to disentangle from the military-industrial research of the Cold War. Interactive computing began as a defense-driven endeavor; making it otherwise took deliberate effort.
So why do we never hear about it?
In histories of UX and interactive computing, Xerox’s Palo Alto Research Center (PARC) plays a formative but ill-fated role as the Mount Olympus from which Steve Jobs ‘stole’ graphical user interfaces, the mouse, and a great many other ideas of interaction that became the Macintosh. In the space of a decade, PARC also created laser printers, computer-generated bitmaps, WYSIWYG text editors, and Ethernet.
It seems as though much of today’s office came from PARC. But where did PARC come from?
In 1970, growing opposition to the Vietnam War, combined with the militarization of all ARPA research, meant that an extraordinary collection of talent in the new fields of computer networks and interactive computing was looking for greener pastures at a time when one corporation decided to provide the greenest pastures imaginable. [Rheingold, Tools for Thought, Chapter Ten]
PARC didn’t just emerge fully-formed from the forehead of Athena; much of its success came from the visionary recruiting and research management of Bob Taylor, previously the director of the Information Processing Techniques Office at the Defense Department’s Advanced Research Projects Agency (ARPA, now DARPA). And Taylor’s success, in turn, rested on his ability to recruit from existing networks of discontented, smart people, many of them coming out of ARPA-funded projects. The basis for PARC’s remarkable run of innovation came directly out of dissatisfaction with militarized computing and the defense intelligentsia, and out of a desire for something different.
But after the second trip that I made to Vietnam, I came back to ARPA, and I told Herzfeld, my boss, “We have no business being there. This is civil war between two factions, and the faction we’re supporting at least — I’m not sure about the other faction — does not believe in meritocracy. The people who are running it, that government, are just relatives of other people.” And I said, “This is not good.”
Going to PARC meant making a choice to work on the office of the future instead of tomorrow’s guided missile control system.
Of course, academic histories of the field are more likely to take this into account (see Paul Edwards’ The Closed World for a military-driven history of computing, Dealers of Lightning for PARC, and Fred Turner’s From Counterculture to Cyberculture for more on how the late-sixties refusal to ‘fold, spindle, or mutilate’ affects the digital present). But the lords of SEO have seen fit to elevate histories written by companies or designers as a sort of promotional content, which (understandably) tend to shy away from tangled nuance or inconvenient truths: “The origin of user experience in our fear of nuclear annihilation” probably doesn’t drive the sorts of clicks they want. While sensible from a self-interested perspective, this does mean that when someone googles “history of UX” they get an account that goes from Taylorism to Toyota to Don Norman without so much as firing a shot.
There are other stories to be told, particularly if one shifts the perspective from “how we have looked at the way people DO things” to “how we have looked at the way people make sense of information.”
That leads you to stories about cybernetics, the birth of cognitive science, Human Factors research, the discovery and measurement of task saturation and information overload in aviation, 1950s and 60s attempts at cockpit simplification that led to the Heads-Up Display, as well as J.C.R. Licklider’s ideas of modeling, visualization, and “Man-Computer Symbiosis.”
I read an article…that is now very famous, written by J.C.R. Licklider entitled, “Man-Computer Symbiosis.” And in this article he outlines how a human being and a computer can form an interactive partnership. When I read it, I just lit up…In terms of what I was going to do with the rest of my life, reading that paper by Lick pretty much determined it.


Or you could start with the SAGE console (Licklider worked on human factors for it), the first computer system that allowed real-time human interaction, and look back at the pre-history of interactive computing in radar systems, which not only inspired Douglas Engelbart (who was working as a radar technician on a Pacific island in 1945 when he read Vannevar Bush’s piece “As We May Think” and got started on tools to amplify human capacity), but also led to the development of the trackball. (No wonder it was the input control in Missile Command.) Fittingly enough, Roy Ascott, godfather of interactive art, served in the RAF in front of a radar console.


Or you could look at the space program, simulators, and an ongoing tension between “machine-rating the men” and “man-rating the machines.” In other words, how much adaptation did the user need to undergo, via complex training or more far-out proposals for cyborg astronauts?
Bob Taylor, as it happens, got his start in advanced research when NASA hired him away from a flight simulator company:
I managed research in [manned flight control systems and flight displays, and simulation technology] while I was at NASA. One of the unsolicited proposals that came in was from a guy named Engelbart at SRI. I thought it was an interesting proposal and he came into D.C. on his round of looking for money, and we talked. I funded his proposal and the mouse was created by NASA funding. Most people don’t know that.
Arguments about power users and intuitive design echo back a long way, and for good reason: “Usability” is a very different project when your user will have thousands of hours of simulator time getting ready for a few minutes of lunar landing, versus an untrained person sitting down at a device they’ve never operated before. User experience as a concept could easily trace its history to the birth of personal computing, to the project of changing the experience of computing from the work of highly-trained experts (mathematicians, astronauts, NORAD technicians) to tools for everyone.
You see hints of that in discussions of computer tools in the 1980s:
What Brenda was getting at seemed so strange and so counter to everything I had been taught that it took a while for it to sink in: In essence, she was saying that when it comes to computer software, the human habit of looking at artifacts as tools can get in the way. Good tools ought to disappear from one’s consciousness. You don’t try to persuade a hammer to pound a nail — you pound the nail, with the help of a hammer. But computer software, as presently constituted, forces us to learn arcane languages so we can talk to our tools instead of getting on with the task. [Rheingold, Tools for Thought, Chapter 12]
Since I don’t have advanced degrees in science and technology studies, I’m not familiar with the scholarship that doubtless exists on this. (If you know of it, send me messages about interesting books and papers.) But that’s also the point! I only thought to dig into this because I’d heard Molly Steenson talk and knew the right breadcrumbs to start with. The stories we tell ourselves matter, especially origin stories. I suspect it’s partly a function of the field’s desire to portray itself as something new and different from HCI and Human Factors, something more holistic and humane. But in so doing, UX seems to have forgotten (some of) its roots.


The history of user experience is not one of pleasant isolation from the military applications of the researcher and designer’s work, but instead one of dynamic tension: sometimes funded by, sometimes reacting against, almost always working from a baseline created by military R&D funding.
In a world of surveillance startups, drones, and DARPA-funded maker-spaces, it’s a legacy we ignore at our peril.