Maybe It’s Not Just Addiction

M Giles Phillips
HackerNoon.com
4 min read · Mar 12, 2018


Technologists often oversimplify. Consider the contemporary tendency to describe all types of heavy smartphone usage as addiction. Smartphones and apps certainly have some built-in addictiveness thanks to a number of surreptitious design patterns. But addiction is not the whole story. And in some cases, I’d argue it’s the wrong story altogether.

Users are not just addicted: they are also (and more often) being vigilant.

Vigilance is the allocation of significant attentional resources to sustained watchfulness. This watchfulness is extreme and associated with self-preservation; for example, a zebra keeping its eyes and ears open for predators while also grazing. Animals that aren't at the top of the food chain spend considerable time and energy being vigilant.

We humans are, of course, at the top of the food chain, so it is not the physically embodied self that we seek to preserve by vigilantly monitoring our phones. Rather, it is the digitally social self, comprising the increasingly numerous facets of our own sense of self (and self-worth) that are established and maintained through our ongoing efforts to socialize via mediated experiences.

Image from Life Without a Smartphone

That maintaining these facets of ourselves is so crucial to our sense of wellbeing and of healthy social connectedness should be obvious, but a specific example is instructive. Socialization amongst humans involves a number of mechanisms, notably identity performance: an iterative process wherein we do something, observe how people respond, and then adapt accordingly. Identity performance provides fluid and instantaneous social feedback that helps us cultivate our understanding not only of how we should act but also of who we are, and it happens online just as readily as it happens in the physical world.

Unlike in the real world, though, the mediated communications and social responses that arrive on our smartphones are not instantaneous and can come at any time. Responses are easy to miss: people must commit sustained partial attention to monitoring for cues that someone may have said something important to them, or offered a crucial response to a post or a message.

Thus we become vigilant, watching over our devices day after day and checking for notifications even when we haven't heard or seen one come in. Addictive design patterns may keep you on your device for longer than you'd planned, but vigilance is most often what causes you to look at your device in the first place.

Why haven’t technologists seen this? For one, addiction is a generative metaphor that causes us to see heavy device usage as some sort of chemical dependency problem. This makes it harder to “see” heavy device usage as anything else, much less something quite natural like vigilance.

Furthermore, our historical concept of vigilance in human-computer interaction has been far too narrow. It started in World War II, when the British cognitive psychologist Norman Mackworth noticed that radar operators suffered decreased performance, with more missed signals and more false positives, when situated at the radar terminal for too long. He called this phenomenon the "vigilance decrement." The radar operator's task was to sit and stare into a small screen, visually scanning for any blips that might appear. Mackworth found that limiting radar shifts to 30 minutes drastically improved operator performance.

A WWII radar operator. Image from “Radar: The Silent Weapon of World War 2”. October 1945 edition of Radio News, archived here.

Vigilance became an applied area of study in cognitive psychology and human-computer interaction, but peculiarly, only in situations exactly like the radar operator's, where you had (1) a trained, professional user who was (2) situated at a physical terminal and (3) whose task involved visual attention generally committed to preventing harm from coming to other people.

Perhaps those conceptual constraints made sense back in the early 1940s, before everyone had an indispensable supercomputer in their pocket, nagging for attention. The idea of "personal communication devices" like smartphones was pure science fiction, and computers were only available to a subset of highly technical professionals doing crucial work. I believe that Mackworth's radar operator became an unwitting generative metaphor himself, in that users who didn't fit his persona couldn't be seen as vigilant by researchers. Air traffic controllers, anesthesiologists, surveillance operators, and navigators could all be vigilant, but everyday smartphone users could not. The older forms of vigilant use all dealt with preventing physical harm, not psychological harm.

In reality, everyday smartphone users are incredibly vigilant to avoid the psychological harm of faulty, poor, or missed socialization. This form of vigilance is a consequence of mediated socialization, and is pervasive on a scale unlike anything we’ve ever seen before.

In the case of "smartphone addiction," the technologist's oversimplification has created a conceptual blind spot, causing us to miss potential solutions to the problem of device misuse. Fortunately, the work of Mackworth and others, notably Parasuraman, offers clues about how we might design to better support vigilance:

  1. Offer users periods of rest. It's important to recognize that these rest periods are more than simple disuse; they must be periods when the user is truly off duty and feels secure that they don't need to be watching for anything.
  2. Introduce variance into the signals that you flare. On a smartphone, signals will generally be audiovisual or haptic alerts and notifications. Make more important alerts more salient and easier to differentiate and detect by default. It's also crucial to ask users what is most important: since we're dealing with social experiences, there is no one-size-fits-all solution to prioritizing signals.
  3. Be more intelligent about flaring. Limit the number of signals or notifications, and batch up similar ones. Don't flare things that aren't important into the notification layer of a device: no ads, no unsolicited usage prompts, et cetera. Only signal things that matter. A rough sketch of how these guidelines might fit together follows below.
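
To make these guidelines a bit more concrete, here is a minimal sketch of how a notification layer might combine them. It is only an illustration: the names (`Signal`, `Priority`, `NotificationFilter`), the default rest window, and the classification rules are all assumptions made for the sake of the example, not any platform's real notification API.

```python
# Hypothetical sketch only: these class names, fields, and rules are
# illustrative assumptions, not a real mobile notification API.
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, time
from enum import Enum


class Priority(Enum):
    DROP = 0       # ads, unsolicited usage prompts: never flared
    BATCH = 1      # lower-importance social signals: delivered as digests
    IMMEDIATE = 2  # signals the user has marked as most important


@dataclass
class Signal:
    source: str            # the app or person the signal comes from
    kind: str              # e.g. "message", "reaction", "ad", "usage_prompt"
    body: str
    received_at: datetime


@dataclass
class NotificationFilter:
    # The user decides what matters most; there is no one-size-fits-all default.
    important_sources: set[str] = field(default_factory=set)
    # A declared off-duty window during which nothing is flared.
    rest_start: time = time(22, 0)
    rest_end: time = time(7, 0)
    _batch: list[Signal] = field(default_factory=list)

    def in_rest_period(self, now: datetime) -> bool:
        t = now.time()
        if self.rest_start <= self.rest_end:
            return self.rest_start <= t < self.rest_end
        return t >= self.rest_start or t < self.rest_end  # window crosses midnight

    def classify(self, signal: Signal) -> Priority:
        if signal.kind in {"ad", "usage_prompt"}:
            return Priority.DROP            # only signal things that matter
        if signal.source in self.important_sources:
            return Priority.IMMEDIATE
        return Priority.BATCH

    def handle(self, signal: Signal, now: datetime) -> str | None:
        """Return a notification string to flare now, or None."""
        priority = self.classify(signal)
        if priority is Priority.DROP:
            return None
        if self.in_rest_period(now):
            self._batch.append(signal)      # defer: the user is truly off duty
            return None
        if priority is Priority.IMMEDIATE:
            return f"[important] {signal.source}: {signal.body}"
        self._batch.append(signal)          # hold lower-priority signals for a digest
        return None

    def flush_batch(self) -> str | None:
        """Deliver everything that was batched as a single digest."""
        if not self._batch:
            return None
        sources = ", ".join(sorted({s.source for s in self._batch}))
        digest = f"{len(self._batch)} update(s) from {sources}"
        self._batch.clear()
        return digest


if __name__ == "__main__":
    f = NotificationFilter(important_sources={"close-friend"})
    now = datetime(2018, 3, 12, 14, 30)
    print(f.handle(Signal("close-friend", "message", "lunch?", now), now))
    print(f.handle(Signal("some-app", "ad", "50% off!", now), now))
    print(f.handle(Signal("acquaintance", "reaction", "liked your post", now), now))
    print(f.flush_batch())
```

The point of the sketch is that prioritization is driven by the user's own preferences (`important_sources`) and a user-declared rest window rather than by a global default, since there is no one-size-fits-all way to decide which social signals matter most.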

M Giles Phillips
HackerNoon.com

Designer, Founder @subforum, Head of Design @forge_AI