Rethinking a Most Awesome Vision of Artificial Intelligence.
Advances in STEM (science, technology, engineering, and mathematics) are said to be the elixir for nearly all global challenges and promises of the future.
Through STEM we have developed mechanical and computational systems that augment our knowledge acquisition, reduce global distance to nanoseconds, and accelerate innovation seemingly to the speed of light. Take away the STEM effect, and we must confront the reality that discovery and innovation are rooted in the question of what it means to be human. At the time of this writing, I am listening to Arvo Pärt’s anguished yet hopeful “Mein Weg” (My Way) on Pandora. In that music is the cry for the future of humanity. That future can only be built with both STEM and non-STEM ways of reconciling, resolving, and building anew.
The Gap
There is a gap between research discovery, innovation, and getting the work out into the world. In the world of entrepreneurship, that gap is called the “Valley of Death”. After an extended period spent transitioning the seed of an idea into a model, a prototype, and a peer-reviewed, published document, the support and resources required to “mind the gap” and cross the “valley of death” are sparse, nearly impossible to access, politically guarded by research institutions, and inadequate to transition the work from the lab to the world.
Vannevar Bush had a somewhat Pollyannaish vision for a platform that could support research development and transfer in the United States. His vision was introduced to the public in his seminal 1945 article in The Atlantic, “As We May Think”. He proposed to transition United States military research priorities from building war machines to basic and applied research that would impact the quality of life of the common man. Bush’s vision proclaimed an opportunity for United States dominance over the growing global research enterprise, a vision based on speculative examples from his lived experiences and his perceived need for a more efficient work environment. This was rather similar to Douglas Engelbart’s later 1962 H-LAM/T (Human using Language, Artifacts, Methodology, in which he is Trained) framework, which contributed to such now-familiar technologies as the interactive computer display, the mouse, word processing, and the Internet. Today, most technology research continues to focus on optimizing work environments and building intelligent knowledge acquisition systems using variants of artificial intelligence.
Bush’s vision morphed into a research industry in which higher education institutions, industry, and think tanks are tethered to and sustained by federal funding agencies, corporate priorities, and philanthropic trends. U.S. frameworks for “what counts as research” are rooted in the federal agencies that fund STEM agendas, such as the National Science Foundation (NSF), Department of Energy (DOE), Defense Advanced Research Projects Agency (DARPA), and National Institutes of Health (NIH). Non-STEM research agencies such as the National Endowment for the Arts (NEA), National Endowment for the Humanities (NEH), and Department of Education (ED) take a secondary, if not tertiary, seat in priority, as indicated by federal resource allocations that fall far below those of the STEM-focused agencies.
Vannevar Bush’s Clarion Call
Vannevar Bush’s 1945 “As We May Think” was a call to action for science to “give man access to and command over the inherited knowledge of the ages”. As Director of the Office of Scientific Research and Development, Bush was responsible for managing thousands of American scientists in the application of science to warfare. He declared that “men of science should then turn to the massive task of making more accessible our bewildering store of knowledge”. Bush’s call to action reflects on the fact that solving problems for the common cause of war released scientists from the tyranny of professional competition. It was “exhilarating to work in effective partnerships.” At the end of World War II he asked, “What are the scientists to do next?”
“As We May Think” painted a fantastical world of scientific and technical innovation as it could be imagined within the constraints of a 1940s mindset. Yet the treatise predicted technical ideas that are commonplace today: calculating machines; relational databases imagined as mechanical switching machines; digital photography; credit cards imagined as punch cards with magnetic strips for identification, inspired by the magnetic recording systems of the time; and barcode systems built from dry photography, photocells, and electron beams. He did not know that many of the mechanical systems he proclaimed as the path to the future would be replaced by sophisticated computational algorithms, algorithms that today are deeply woven into the very essence of our quality of life and of what it means to be human. Bush imagined the Memex as “a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility.” Partially computational and partially mechanical, the Memex was poised to replace the Dewey Decimal System with associative search algorithms, the basic bread and butter of Google search today. The Memex was also the precursor to the personal computing devices on which we depend like the very oxygen pumped into our lungs. Contemporary technologies that sprang forth from his vision were inevitable. However, most of the knowledge management systems Bush imagined did not take hold until the 1990s, when the tools, techniques, and design frameworks to realize his ideas became more “accessible” to individuals privileged enough to be embedded in the academic/industry research enterprise. Some ideas from his vision remain speculative and are the subject of countless Ph.D. dissertations in the world’s top-tier research universities.
In the 1990s, I had the unique opportunity, as a Black woman artist who worked creatively with technology, to collaborate with research scientists at SRI International on a user experience prototype for SRI’s speech recognition technology. I built a game application to teach young native Japanese-speaking children to speak English. I suppose the computer scientists at SRI International assumed they would accomplish a great milestone if their algorithms could recognize the rather difficult Japanese-accented English that they perhaps grappled with in their international collaborations. Interestingly, that SRI International speech recognition research became the foundation of Apple’s Siri.
A vision of the future grounded in ineptitude and intolerance.
From what assumptions, stereotypes, and biases did Vannevar Bush imagine the future?
Vannevar Bush’s manifesto for science was written for the benefit of scientists like himself: white men. It is laced with misogyny and racism. Its offensive cultural expressions, some not so subtle and others more so, could be missed and/or dismissed by technology futurists and enthusiasts. Bush’s vision of the future is rooted in the desire to capture, process, and manipulate data that he calls “knowledge”. Bush imagined technologies such as a nodule attached to the scientist’s head to capture images and visualizations from his brainwaves. The walnut-sized lens he described strapped to the white male scientist’s head would empower that “scientist of the future” to move through the world and secretly record whatever he deemed worthy, without even an “audible click”. This anchors the quintessential vision of the future in the gaze of white men and their framework of what is worthy of attention in the world. Throughout the article, women, whom he calls girls, as was the custom of his time, are pictured as lazy, unfocused, and error-prone. A Vocoder would replace the stenographer’s languid keystrokes, made haphazardly as she “looks about the room and sometimes at the speaker with a disquieting gaze”. He further elaborates: “One might, for example, speak to a microphone, in the manner described in connection with the speech-controlled typewriter, and thus make his selections. It would certainly beat the usual file clerk.”
In the late 1990s, I worked as a design researcher at the IBM Almaden Research Center in Silicon Valley, in a lab focused on human-computer interaction research. The lab held an annual consortium called “New Paradigms for Using Computers”, which invited the Silicon Valley computer science illuminati to share their ideas of the future. The year I attended, Marvin Minsky, Oliver Selfridge, and John Anderson (the fathers of AI) lamented in a panel that AI had not yet advanced to the point of pervasive integration into society. Minsky angrily pulled a small tape recorder from his shirt pocket and voiced his dismay that the recorder could not yet read his intention to record his world.
Bush was interested in technologies that dealt with the enormous difficulty of logic and of understanding cognition. He observed that the computing machines of his time were focused on solving equations, but that there were bigger, broader opportunities in what he called equation transformers, which would “rearrange the relationship expressed by an equation in accordance with strict and rather advanced logic”. Today, that concept would be equated with machine learning and artificial intelligence. In his enthusiasm for this concept, he says, “Progress is inhibited by the exceedingly crude way in which mathematicians express their relationships. They employ a symbolism which grew like Topsy and has little consistency; a strange fact in that most logical field.” Topsy, a fictional character in Harriet Beecher Stowe’s 1852 novel “Uncle Tom’s Cabin”, is a young Black slave girl who, asked about God, says, “I s’pect I growed. Don’t think nobody never made me.” Topsy was described as an “imp of darkness” in Stowe’s seminal novel. Clearly, Bush was a product of his cultural time. And in those times, it was commonplace for white Americans to live, eat, and breathe the destructive ideology that Black Americans were less than human, that they were monstrous disruptors to be cajoled, feared, and blamed for the inadequacies of dominant white American culture.
The Topsy caricature represented the white racist ideology of the poor, uneducated, uncouth Black child. This racist trope became a standard for minstrel shows and for television and film characterizations of the Black child as ignorant, unkempt, and clownish, from Sambo to the Little Rascals’ Buckwheat. The image at the top of this essay depicts the Duncan Sisters’ 1933 smash-hit play “Topsy and Eva”, a vaudevillian romp that promoted white womanhood juxtaposed against stereotypes of Black incompetence. (Jocelyn Buckner (2011), “The Angel and the Imp: The Duncan Sisters’ Performances of Race and Gender”, Chapman University Digital Commons.) Topsy was recently reintroduced to the American viewing audience in HBO’s “Lovecraft Country” as a phantasmic monster child with large bulging eyes, hair akimbo, daggered fingernails, and limbs twitching like the most advanced humanoid robot. In that story, the depiction of Topsy, born from the venomous spittle of a racist white policeman, is a horrific monster realized to an extreme to draw a parallel to the horror and collateral damage of the racist destruction of the souls of young Black children.
“As We May Think” was groundbreaking in planting the seed for the United States and global research enterprise in which many of us thrive. Vannevar Bush’s vision was speculatively radical, despite its fault lines of misogyny and racism. As a technologist, I am enthralled with Bush’s naïve descriptions of technologies we use today and continue to grapple to invent. However, as a Black woman, I cringe at the sexist and racist tropes he used to validate his vision, a validation that we too often accept without thinking about how it contributes to the paucity of diversity, equity, and inclusion in our academic and industry cultures of innovation, the very cultures that form our lives and inform our futures.