Engage, Don’t Enslave

Exploring what it means to captivate app users in a healthy way.

Matt DeLaney
The Official Neura Blog
5 min read · Jun 6, 2017

--

Addictions tend to leave people worse off than they found them. The same is true for digital addictions. A longitudinal study by researchers at Yale and U.C. San Diego, “Associations of Facebook Use with Compromised Well-Being,” found that Facebook use predicts declines in mental health and overall well-being. For context, people spend an average of 50 minutes a day on the social media site while dedicating a paltry 17 to exercise.

Reading those figures, I couldn’t help reflecting on my own habits. Even though I’m someone who strives for balance, I often yield to urges — checking my phone, watching untold numbers of sea otter videos, mindlessly swiping at screens — that don’t ultimately improve my life. Well, the otters may be an exception.

In any case, my goal here is not to judge Facebook, Google, or any other entities or persons. Rather, it’s to explore ethical approaches to how technologies engage us.

I often promote artificial intelligence that helps mobile app and IoT device makers boost end-user engagement. I want to elaborate on the ethos behind this. I’m not advocating for machine learning varietals that ensnare people in unhealthy behaviors. I’m pushing for innovations that, by engaging users, improve their lives.

Defining what is and is not beneficial to one’s well-being is obviously a big challenge. Beer, for example, will never help my liver. But the emotional benefits of sharing an ale with a close friend often outweigh the physical risks. Point being, the way to improve one person’s life could differ profoundly from the way to improve another’s.

The responsibility of people to manage their personal use of technology is another important part of this discussion. For our purposes, I’m only focusing on technology design.

Above all, this post is an exploration of what it means for digital products to add to people’s health and happiness. Perhaps the best starting point in this exploration is to look at the history behind the methods used by today’s most engaging technologies.

The Backstory Behind Persuasive Consumer Technologies

Nir Eyal is the best-selling author of “Hooked: How to Build Habit-Forming Products,” a book that arms technologists with psychological tools for driving staggering levels of user engagement. Tristan Harris is a former “design ethicist” at Google, who later founded “Time Well Spent,” a nonprofit that questions the integrity of some habit-forming technologies. The two men have something in common: both studied at Stanford under B.J. Fogg, the father of a discipline known as “behavior design.”

Simply put, behavior design weds human psychology and technology. Author Ian Leslie tells the story of the field Fogg created in a lengthy article, “The Scientists Who Make Apps Addictive.” In the piece, Leslie takes us back to 1997 in Atlanta, where Fogg, a doctoral student at the time, gave a talk on how blending psychological insights with computers might be used to influence people’s behavior. Over years of continued research, this concept has grown into an area of study with far-reaching impacts.

For better or worse, Fogg’s ideas fuel some of today’s most pervasive habit-forming products. Indeed, many of these technologies cleave to one of the chief pillars of behavior design: “put hot triggers in the path of motivated people.”
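
Fogg’s model holds that a behavior occurs when motivation, ability, and a trigger converge at the same moment. As a toy sketch of that logic, with an arbitrary threshold of my own invention standing in for Fogg’s “action line,” it might look like this:

```python
# Toy sketch of B.J. Fogg's behavior model: a behavior happens when
# motivation, ability, and a trigger converge at the same moment.
# The threshold below is an arbitrary stand-in, not a value from Fogg.

ACTION_LINE = 0.5  # hypothetical "action line"

def behavior_occurs(motivation: float, ability: float, trigger_fires: bool) -> bool:
    """Return True when a trigger reaches a user who is both motivated
    and able to act, i.e., a "hot trigger" in the path of a motivated person."""
    if not trigger_fires:
        return False  # no trigger, no behavior
    return motivation * ability >= ACTION_LINE

# A motivated user facing a one-tap action responds; an unmotivated one doesn't.
print(behavior_occurs(motivation=0.9, ability=0.9, trigger_fires=True))  # True
print(behavior_occurs(motivation=0.2, ability=0.9, trigger_fires=True))  # False
```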

Leslie points out that Fogg is concerned about how some of his students have applied the principles he teaches. Either way, Nir Eyal and Tristan Harris offer two contrasting responses to the same body of ideas.

Eyal, for his part, champions the use of a “variable schedule of rewards” — a system casinos use — to hook people on apps and websites. Such a reward might be a new like on social media, the notifications crowding your phone screen in the morning, or the new emails (or lack thereof) that appear when you refresh your inbox. It’s the thrill of not knowing which reward, if any, awaits that keeps people glued to their apps.
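
To make the mechanic concrete, here is a minimal simulation of a variable reward attached to a pull-to-refresh gesture; the odds and the reward list are invented for illustration:

```python
import random

# Minimal sketch of a variable schedule of rewards: each refresh may or
# may not pay off, and which reward appears is itself unpredictable.
# The 30% odds and the rewards below are invented for illustration.

REWARDS = ["a new like", "a new follower", "a new email"]

def refresh_feed() -> str:
    """Simulate a pull-to-refresh with slot-machine odds."""
    if random.random() < 0.3:
        return random.choice(REWARDS)  # occasional, unpredictable payoff
    return "nothing new"               # most pulls come up empty

for _ in range(5):
    print(refresh_feed())
```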

Eyal further elaborates on his approach to behavior design in an interview with Ian Leslie, saying, “an app succeeds when it meets the user’s most basic emotional needs even before she has become aware of them…When you’re lonely, before you’re even conscious of feeling it, you go to Facebook. Before you know you’re bored, you’re on YouTube. Nothing tells you to do these things. The users trigger themselves.”

The study cited earlier suggests that, at least in Facebook’s case, the app may be engaging users without meeting their emotional needs in a healthy way. This could explain, in part, why Tristan Harris decries the use of variable rewards. They turn our smartphones into slot machines, he argues.

Harris sees the proliferation of addictive products as a systemic issue. Whether they like it or not, technology companies are caught in an “arms race” for people’s attention in which they try to maximize a user’s “time on site/device” at all costs. Unfortunately, in running hard after this goal, they unwittingly engage the user at the expense of the user’s well-being.

In a recent podcast interview, Harris discussed how even a meditation app, an ostensibly beneficial consumer product, could potentially detract from a person’s well-being if it leveraged manipulative behavior design techniques.

To put an end to this mad scramble for attention, Harris argues for reforms such as “de-coupling” revenue from time on site, getting phone designers like Google and Apple to limit addictive elements in our phones, and gaining consensus among tech companies to eschew design practices that “hijack” people’s brains.

Whatever your thoughts on variable rewards or the people for or against them, this foray into behavior design helps frame the discussion on AI and ethical engagement.

Steering Toward Better Approaches to Engagement

Data are giving the creators of technology an ever-deeper knowledge of consumers. When leveraged by apps and IoT devices in tandem, that knowledge could certainly enhance a person’s life; diabetes management, mind-body wellness, and home security are just a few areas that spring to mind. Even so, research shows that products forged with good intentions don’t always produce good outcomes.
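
As one hypothetical sketch of what that could look like in code (the names, signals, and threshold here are my own invention, not any vendor’s actual API), an app might gate its prompts on predicted benefit to the user rather than on raw attention:

```python
from dataclasses import dataclass

# Hypothetical sketch of "engage, don't enslave" as a design rule:
# prompt the user only when context suggests the prompt serves them.
# Every name and threshold here is invented for illustration.

@dataclass
class UserContext:
    is_sleeping: bool
    is_driving: bool
    predicted_benefit: float  # model's estimate (0-1) that the prompt helps

def should_prompt(ctx: UserContext, benefit_floor: float = 0.7) -> bool:
    """Send a notification only when it is safe and likely useful."""
    if ctx.is_sleeping or ctx.is_driving:
        return False  # never interrupt at the wrong moment
    return ctx.predicted_benefit >= benefit_floor

# A glucose-check reminder timed for a moment when the user can act on it:
ctx = UserContext(is_sleeping=False, is_driving=False, predicted_benefit=0.85)
print(should_prompt(ctx))  # True
```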

As much as possible, a person’s engagement with their apps and devices should lead to life-affirming benefits, not digital addiction. Such benefits might include:

  1. Improved physical and mental health;
  2. Saved time, energy, and money;
  3. Improved relationships with people in the real world; and
  4. Entertainment and delight that do not undermine a person’s holistic well-being.

Of course, this isn’t an exhaustive list. It points toward something, not at it. And consumers share responsibility for how technology influences their behavior. But when it comes to the quest to win and retain users, a good rule of thumb for the creators of technology might be “engage, don’t enslave.”
