Computation After WWII
--
Written for 48–727: Inquiry into Computation, Architecture and Design at Carnegie Mellon University.
In his article, “As We May Think,” published in The Atlantic in July 1945, Vannevar Bush pats himself and his fellow scientists on the back for contributing to the war effort, and wonders where they will go and what they will do after the war. Bush likely knows that the end of his military entanglement is near — the Reichstag has been captured and the United States will soon drop two atomic bombs on Japan. In fact, as head of the Office of Scientific Research and Development, which set the Manhattan Project and its bomb in motion, Bush contributed personally and significantly to the end of World War II. This military-technological achievement sits at the top of a pyramid supported by a mobilized American manufacturing effort which, according to historian Doris Kearns Goodwin, “directly consumed over one-third of the output of industry” [1]. The next step for the sciences, in Bush’s view, is to similarly mobilize information by harnessing emerging computational capabilities.
By way of example, Bush outlines photography as a material process and claims that “Certainly progress in photography [technology] is not going to stop” [2]. Chemical processes, optical projection, and microfilm are all “rapidly improving,” becoming more efficient and enabling advances in other fields. Bush connects photographic record-keeping to record-keeping in general, claiming that microfilm would enable the entire Encyclopedia Britannica to fit on “a sheet eight and one-half by eleven inches.” However, this exponentially increasing proliferation of material information, coupled with increasing specialization in the sciences, poses problems for society. How is one to find, let alone make use of, worthwhile and relevant research amid all the noise? Bush argues that machines (specifically computers) should once again come to humans’ aid. But first, a new way of thinking about the role of computers is required.
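The arithmetic behind such density claims is simple area scaling: shrinking a page by some linear factor shrinks its footprint by the square of that factor. Below is a minimal back-of-the-envelope sketch of that calculation; the page count is an illustrative assumption, not a figure from Bush’s essay.

```python
# Back-of-the-envelope sketch of the area scaling behind microfilm claims.
# Shrinking a page by a linear factor r shrinks its area by r**2, so fitting
# n page-images onto the area of a single page needs r >= sqrt(n).
# The page count below is an assumed, illustrative figure.

import math

ENCYCLOPEDIA_PAGES = 30_000  # rough, assumed page count for the full set

required_linear_ratio = math.sqrt(ENCYCLOPEDIA_PAGES)
print(f"Linear reduction needed to fit {ENCYCLOPEDIA_PAGES:,} pages "
      f"onto one page-sized sheet: about {required_linear_ratio:.0f}:1")
# -> about 173:1 for this assumed page count
```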
Computers of the day did just that — they computed*. Bush describes mathematical processes which computers perform admirably: “Adding a column of figures… was long ago properly relegated to the machine.” A human enters data, the machine computes, and it returns transformed data to the human. But if the machine has a means of storing and retrieving data, akin to a human’s ability to remember facts or events by association, it becomes much more powerful. Bush uses the word “memory” three times in “As We May Think,” and the word “memex” to refer to a machine that “stores all [of a person’s] books, records, and communications… an enlarged intimate supplement to his memory.” In time, memory became not only the dominant metaphor but the term itself for how computers store and retrieve data. The year after Bush’s article appeared, 1946, saw the development of RCA’s Selectron Tube, a device designed to store 4,096 bits [3], a capacity that memory technology has multiplied year after year, roughly in step with Moore’s Law. In addition to growing ever more capacious, computer memory has also entered the culture, creating a feedback loop with our understanding of human memory. Claude Shannon’s 1948 paper, “A Mathematical Theory of Communication,” attempts to codify units of information and their transmission [4], a framework applicable to human-to-human, human-to-machine, and machine-to-machine communication. Bush laid the plans for a utopian, global society where humans and machines could seamlessly cooperate and communicate in a digital Esperanto. Looking back from 2016, it is clear that his vision will not be fulfilled.
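Shannon’s unit was the bit, and his entropy formula gives the average number of bits needed to encode the output of an information source. The sketch below uses the standard formula H = -Σ p log₂ p; the four-symbol source and its probabilities are illustrative assumptions, not an example drawn from Shannon’s paper.

```python
# Sketch of Shannon's 1948 measure of information: the entropy of a source,
# in bits per symbol. The symbol probabilities below are illustrative only.

import math

def entropy_bits(probabilities):
    """Average information per symbol, H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A hypothetical four-symbol source in which one symbol dominates.
source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

h = entropy_bits(source.values())
print(f"Entropy: {h:.3f} bits per symbol")  # 1.750
print(f"Fixed-length code: {math.log2(len(source)):.0f} bits per symbol")  # 2
```

By this yardstick, a 4,096-bit store like the Selectron’s would hold roughly 4,096 / 1.75 ≈ 2,300 symbols of this hypothetical source, one small way of seeing how Shannon’s units and Bush’s storage concerns meet.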
While certain technical problems — file transmission, storage size, interconnectivity — have largely been solved or rendered moot, others stubbornly remain. The constant race to increase efficiency has resulted in an esoteric collection of legacy systems and formats that no longer meet norms and eventually lose support, families of dead Esperantos replaced by their newer, 2.0 cousins. Anyone whose parents or grandparents still have a collection of movies on VHS will recognize this phenomenon, as even today DVDs are growing obsolete. While Vannevar Bush’s “memex” would have been a useful general-purpose tool in 1945, one can imagine the corporation producing it eager to introduce a newer model, encouraging its customers to upgrade, trapping them in this cycle ad infinitum. How long since you last upgraded your smartphone? Isn’t it about time? Facing our frenzied rush to consume what has been produced for us, it’s worth considering again the impetus driving technology research and development.
J.C.R. Licklider addresses this topic in his 1960 paper, “Man-Computer Symbiosis.” While less visionary about the future than Vannevar Bush, Licklider makes several arguments in favor of technological progress. A common theme is time, or lack thereof. Tracking his own working hours, Licklider writes, “Several hours of calculating were required to get the data into comparable form… [after which] it only took a few seconds to determine what I needed to know” [5]. Later, he hypothesizes a military commander consulting a computer on the battlefield, “having to make critical decisions in short intervals of time,” and makes the case for speech-based human-machine communication. In both stories, humans are under pressure to produce results — their worth is tied to their work — and more efficient computational capabilities produce better results for both the humans and the powers they answer to. While the unnamed military commander’s allegiance is unknown, Licklider’s is an integral part of his biography, with MIT serving as his springboard into a powerful Cold War-era military-industrial-academic complex of which he remained a part for the rest of his life. Perhaps computation freed Licklider from the drudgery of wrangling data and rote calculations, but it only cemented his place in the pyramid of society, supporting the emergent militaristic Western superpower. Perhaps we know the military commander’s name after all.
While World War II was the last time the United States Congress officially declared war, military conflicts have been near-constant in the decades since. After 9/11, George W. Bush (no relation) declared a “War on Terror,” promising the impossible: “It will not end until every terrorist group of global reach has been found, stopped, and defeated” [6]. 21st-century technology has aided and abetted the Bush and Obama regimes overseas and, as we know from the 2013 Edward Snowden surveillance revelations, domestically. Yet computation avoids the political. The dominant narrative of technology remains its positive impact on personal quality of life, even when the most innocuous-seeming ventures can be traced back to the U.S. military. As Lisa Otto points out, “Siri’s core software was created as a Department of Defense-funded project called the Cognitive Assistant that Learns and Organizes, or ‘CALO’” [7]. Vannevar Bush’s and J.C.R. Licklider’s visions of technological progress have transformed the relationship between humans and machines, and greatly impacted everyday life. The cost may be incalculable.
*Humans also computed, and were in fact known as “computers” alongside machines. As Vannevar Bush paternalistically describes, during wartime this role was filled by “girls armed with simple key board punches” — see also N. Katherine Hayles, My Mother Was a Computer (2005).
- Goodwin, D. “The way we won: America’s economic breakthrough during World War II.” American Prospect (2001).
- Bush, Vannevar. “As we may think.” In The Atlantic Monthly, July 1945.
- Rajchman, Jan. “Early research on computers at RCA.” In Metropolis, N., et al. (eds.), A History of Computing in the Twentieth Century. Academic Press, 1980.
- Shannon, Claude E. “A mathematical theory of communication.” Bell System Technical Journal 27 (July 1948): 379–423.
- Licklider, Joseph C. R. “Man-computer symbiosis.” IRE Transactions on Human Factors in Electronics 1 (1960): 4–11.
- Bush, GW. Transcript of President Bush’s address to a joint session of Congress, September 20, 2001. http://edition.cnn.com/2001/US/09/20/gen.bush.transcript/
- Otto, E. Commanding Machines: Siri and Narratives of Human-Centered Design. Carnegie Mellon University, 2015.