What even IS this? Why tech companies are still failing us

Sam Ladner
5 min read · Dec 3, 2018


Why do we know so little about the social implications of technology? It plays a starring role in everyday life, as essential as food, shelter, and clothing. A huge share (70%) of Americans use social media, and even 65% of senior citizens use Facebook — that’s more than the share of people who eat family dinner at home, attend church, or have a pet.

Woman in kitchen (Art Institute of Chicago)

Yet, we know so little about technology’s impact on everyday life. We are only just now recognizing problems like coordinated disinformation, breaches of personal data, and algorithmic discrimination. Clearly, technology companies are falling short on understanding the social implications of their tools before and after they build them. But why? Why are tech companies failing us?

Sadly, it is all too predictable that technologists underestimate, misjudge, or otherwise underappreciate how humans will interact with their technology. This is for one simple reason: engineering, as a discipline, does not bother to ask, “What is this?”

Invention of The Monsters (Art Institute of Chicago)

Engineers are not scientists, much less social scientists. They typically have no knowledge of basic human behavior such as loss aversion or impression management, even though these are the building blocks of social interaction — and entry-level knowledge for social scientists.

Engineers could ask, “What is this?” but instead choose to ask: “Does this work?”

“Does this work?” underpins research within tech companies. Once upon a time, tech companies hired engineers they called research scientists and stuck them in labs to tinker endlessly with pieces of hardware and scraps of computer code. Even today, there are over 7,800 job postings for “research scientist” on LinkedIn, roles typically filled by engineers or computer scientists. A posting for an Uber research scientist intern is instructive. In addition to having a Master’s degree in a “technical field,” the intern is also encouraged to engage in “risk taking” and to “turn the dreams of science fiction into reality.” Another job posting for a research scientist at Facebook asks for skills in the scientific method, but then specifically narrows that down to “evaluate performance and de-bug.” In other words: Does this work? Notably not mentioned: the ability to develop basic knowledge.

Academics would see much of this activity as more akin to prototyping than to scientific inquiry. Indeed, these engineers produced many technology prototypes, but not much in the way of generally applicable knowledge, or what the rest of us might call “science.” In other words, they never seem to stop and ask, “What is this?”

Today, tech companies need to ask things like “What is a digital public sphere?” and “What is the nature of privacy?” and “What is artificial intelligence versus human intelligence?” Tech companies need typologies of human-computer interactions, motivations, fears, and human foibles. They need to create a system of knowledge around key questions of technology like artificial intelligence and social media.

Some argue that technology development doesn’t have time for “understanding,” that asking “What is this” takes too long and is too expensive. But this is a false economy. Philosopher Martha Nussbaum tells us plainly that we need that understanding, not for understanding’s sake but because it guides our planning:

“Understanding is always practical, since without it action is bound to be unfocused and ad hoc.” — Martha Nussbaum

In other words, if you don’t know “What is this,” you’re probably going to build the wrong thing.

We can see this pattern of building the wrong thing in technology, over and over again. The term “user friendly” was invented way back in 1972. Curiously, “user hostile” wasn’t coined until 1996, just before Microsoft’s infamous Clippy appeared in 1997. Clippy’s abrupt entrance onto the desktops of the world indicated that technology “researchers” had no idea what they had made. Word famously exploded from what appeared to be a digital typewriter into a swollen behemoth that did everything from create a newsletter to automate mailing labels. Pick a lane, people. Clippy was there to tell users how to make Microsoft Word work, but no one bothered to find out, much less explain, what Microsoft Word actually was. Word is still so swollen that a new user today can credibly ask, “What even IS this?”

Flash forward to today, and the so-called “lean startup” approach to building technology is really just a faster, even more facile way to ask “Does this work?” In reality, tech companies still don’t know “What is this?” even after they’ve built a working prototype.

In my former role as a hiring manager at a major tech company, it took an average of 100 days to hire just one ethnographer, and more often than not the job remained open much longer than that. These are the very people who can tell us, “What is this?” The demand for these social scientists only grows. Yet the tech industry as a whole has not yet figured out that it needs to ask “What is this?” before it builds something.

Were tech companies to ask, “What is this?”, they would learn the basic properties of their tools: their coherence, intelligibility, performance, and affordances. Instead, they are fully occupied with “Does this work?”, and create horrific blights on our collective consciousness like Tay, the racist AI chatbot, at the relatively innocuous end of the scale, and COMPAS, the racist recidivism-prediction algorithm, at the full-on evil end of the scale.

Technologists do not know what they do not know. Ethnographers hope for the day when they can simply ask “What is this?” of a technology before it even exists, without worrying about whether it works. But tech development continues apace.

It’s time for ethnographers to stop waiting, and instead insist on asking: What IS this? Before another Tay, before another COMPAS. Technologists too must take responsibility, because if we don’t, the 21st century will become even more technocentric, and even less intelligible. Let’s find out what’s going on before we build anything else.

Originally published at www.samladner.com on December 3, 2018.
