Redesigning technology to serve people, not ads

An interview with Nine Dots Prize winner James Williams

Dee Keilholz
The Connection
11 min read · Nov 9, 2017


Illustration by Kurt Iverson of Domain7, exploring James Williams’ idea that perhaps our technology has been designed to overpower us.

For most people, the relationship we have with our gadgets, especially our smartphone, is, well, complicated. We love our iPhone. We hate our iPhone. It’s a necessary item. It’s a ball and chain. It’s useful and a distraction. And when the constant barrage of notifications, messages and social media feeds keeps us from the things we actually set out to do, a lot of us feel not only distracted, but also guilty. We blame ourselves for lacking the willpower to be more focused. But what if we were simply fighting an unfair fight? What if the forces we’re up against are much more powerful than us — and designed to hog our attention?

That’s exactly what Oxford University researcher James Williams believes to be true, and it has turned him into a staunch advocate for shifting the conversation towards a critical look at the systems that drive technology today.

About ten years ago, right around the time the first iPhone was released, James was working at Google. As an early adopter, he soon started noticing some unwanted side effects. As a person, he felt more and more distracted from acting on goals that mattered to him. As a member of the tech industry, he found himself increasingly disenchanted with a system designed to keep people hooked to their devices in a battle for their attention.

James left his job at Google and started searching for answers to some of the questions that were nagging at him, namely: How can we create tech that is aligned with people’s intentions, values, and well-being?

Today, he is a doctoral candidate at the Oxford Internet Institute, and he recently won the prestigious Nine Dots Prize for his essay on freedom and persuasion in the attention economy.

James graciously shared some of his thoughts and insights with Domain7. We chatted about attention as a beleaguered commodity in today’s economy, the misalignment between tech companies’ (often noble) goals and how success is measured, and how we can create a better system.

Part of your theory in your prize-winning essay is that the main goal of technology today is to capture our attention. How so?

In today’s attention economy, the limiting factor in how we consume information is our attention, and therefore most of the technologies we’re using today are designed to compete for that scarce resource as effectively as possible. There’s this race going on between tech companies today to develop products that do one thing exceptionally well: grab people’s attention. I think that becomes most apparent when we look at how tech companies today measure success. The metrics in place today are all based on little momentary actions: impressions, clicks, engagement.

There’s a disconnect or misalignment between the goals tech companies have and what people’s goals are in real life. For example, nobody I know aspires to maximize the amount of time they spend on Facebook or Twitter every day, but that’s what companies like Facebook or Twitter define as success.

Why should we be worried about technology competing for our attention? What’s the harm?

Back when I worked for Google, I realized that there was more technology in my life than there had ever been. I noticed that because of all that technology, I had a much harder time focusing on the things I really wanted to do. I felt distracted, but it was more than just distraction. It felt like my attention was constantly being hijacked.

I think everybody has probably had this experience: you pull out your phone to do one thing, and 20 minutes later you’ve done five other things, but not that one thing you set out to do. The iPhone came out in 2007. I was an early adopter, and I just kept having experiences like that. Smartphones have really opened up this moment-to-moment world for us: a never-ending flow of potential rewards, like a slot machine, and I think there are some indirect, subtle, persuasive and seductive types of harm in that. We think of the distractions I just described, through platforms like Facebook, Twitter or through online ads, as momentary annoyances, but I think it’s much more than that. It’s distraction of action. We’re being distracted from what we actually want to do. Over time that can actually change our behaviour and undermine some of our core capacities: our ability to reflect, to remember things, to have metacognition, thinking about thinking.

That distraction of action impacts us on an individual basis, but I think we also need to start looking at how it impacts us as a society. If we ever got to a point where we could no longer make decisions according to our own free will, that could really change political discourse and undermine some of the assumptions democracy is based on.

So, what’s the solution? Should we all close our social media accounts and get rid of our electronic devices?

No, that’s not at all what I’m suggesting. I’ve always been very pro-technology and I still am. The answer is not to give up on technology, but we need to start thinking about how we can nudge technology in a direction where people’s intent is at the core, not people’s attention. We need to start thinking and talking about how we can change the current system to better serve the people who use that technology.

It’s interesting that you say technology today doesn’t have people’s goals in mind, given that creating “user-centric” technology seems so front-of-mind in design today.

I think user-centric design — researching and focusing on the user experience — is a step in the right direction. I’m glad that’s part of design today, but I think it’s only half the solution.

User-centric design today isn’t based on a holistic model that is aligned with people’s well-being, because the different elements we’re looking at today are tied to whatever persuasive engagement a company is measuring. For design to truly have the user’s intent in mind, we need to start shifting the incentives of design at an organizational level. This will include, but not be limited to, language, metrics, KPIs (Key Performance Indicators), the goals of design, business models, design processes and models.

It seems to me what you’re saying is: we need to create change on a systemic level. What are some of the ways we can make that systemic shift happen?

We need to rethink business models, for one. More specifically: advertising. We took this pre-internet, pre-digital model of advertising and uncritically applied it online. Technology platforms are fueled by advertising. It’s the dominant business model for online services today.

I have yet to talk to a person who can give me a straight definition of what advertising is, but I think at a high level it’s about different types of persuasion. Some are much less questionable than others, like search ads: “I’m looking for x and somebody has x to sell.” With search ads, there’s an alignment between people’s goals and the goals of the system.

But then you also have brand advertising for demand creation, which is solely about capturing your attention and keeping it. To me, the fact that both types of ads fall under the same umbrella term of advertising is really strange. They’re very different — logistically, but also ethically. I think we really need to figure out how to incentivize companies to move toward intent-based advertising as opposed to attention-based advertising.

I would also like to see us develop and intentionally fund non-advertising-based business models that are scalable. There’s been some experimentation with that — Benefit Corporations for example — where the focus is on intent-alignment and value creation. It remains to be seen if those experiments are going to be successful.

You mention different types and layers of persuasion. Is a lack of clarity around what persuasion is part of the problem?

Absolutely. There’s no clear line between ethical and unethical forms of persuasion, because it’s such a fragmented term. There’s coercion, distraction, manipulation, and they are all different, but we keep talking about persuasion in this really binary way. It makes it really hard to clearly identify when people are being coerced and when people are just being “nudged”.

Every time I give a talk to designers about this issue, the questions I get asked over and over are: Where is the line? How far can I go down this road of persuasion before it starts to become unethical? That lack of clarity is clearly something designers think about and struggle with — which is why part of the challenge is to find more structured ways to talk about persuasion, to clearly identify different types of persuasion and their ethical boundaries. We need to develop a clear language to talk about persuasive attempts, persuasive designs.

How can we start creating a language around persuasion that will help us keep design aligned with people’s intentions?

Here’s one idea that I had for a model: On the x-axis we have “the degree of goal alignment”, so how aligned is your technology with your goals? The y-axis is the degree of restraint: How much does it restrain you from making your own decisions?

We could then say that a technology with a high degree of restraint and a low degree of goal alignment would be a seductive technology. Whereas you might say that a technology with a medium amount of restraint but a high degree of goal alignment might be a guidance technology.

This isn’t a fully developed model by any means. It’s just a first stab at looking into common ways to talk about the aspects of persuasion.
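The two-axis model James sketches can be expressed as a tiny classifier. This is a minimal illustration only: the 0-to-1 scores, the 0.5 thresholds, the `classify` function name, and the fallback label are all my assumptions, since the interview names just two example categories.

```python
def classify(goal_alignment: float, restraint: float) -> str:
    """Map a technology's two scores (0.0-1.0) to a rough category,
    following the examples given in the interview."""
    if restraint >= 0.5 and goal_alignment < 0.5:
        # High restraint, low goal alignment: pulls you in, against your goals.
        return "seductive technology"
    if goal_alignment >= 0.5:
        # High goal alignment (with some restraint): helps you toward your goals.
        return "guidance technology"
    # The interview leaves the remaining region unnamed.
    return "unclassified"

# Example scores (illustrative): an infinite-scroll feed vs. a navigation app.
print(classify(goal_alignment=0.2, restraint=0.8))  # seductive technology
print(classify(goal_alignment=0.9, restraint=0.5))  # guidance technology
```

The point of the sketch is only that naming regions of this plane gives designers a shared vocabulary for kinds of persuasion, rather than the binary "persuasive or not" framing James criticizes.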

Why do you think the conversation about ethics and persuasion isn’t more mature today? Is there a resistance or hesitance within the tech sector to put ethical guidelines in place?

Within the tech sector, there is a bit of a view of ethics or regulation as a barrier to innovation. It’s seen as just another box that needs to be ticked. It’s bureaucracy, a brake pedal. It’s not really seen as a steering wheel. On the other hand, society — or at least parts of society — looks at the tech industry thinking that tech companies are evil. At the end of the day, the task at hand isn’t to blame one party or another. That misses the point, and it results in the outrage cascades the attention economy feeds on. You know, posts like “12 reasons why this company is evil”.

By and large, most people in the tech industry are really well-intentioned, thoughtful people. I don’t know a single designer who went into design because he or she wanted to make somebody’s life worse. Most designers have the user’s interest at heart, but I think there’s a gap between what the goal is supposed to be and how designers are being held accountable for reaching that goal. The designers are the players, but the players aren’t the problem. The game is the problem. The systems, the processes and the incentive structures currently in place incentivize a certain type of design that isn’t aligned with users’ goals.

Is there anything happening right now to nudge the system towards a more intent-based approach?

There are some interesting developments going on, especially in the ads and ad blocking space. There’s the Acceptable Ads Committee, trying to create new standards for ads, and I recently read that Google is thinking about incorporating native ad blocking functionality into Chrome, so we’ll have to see how that unfolds. I don’t know if it’s all moving in exactly the right direction, but it’s moving.

Is there anything you’d like to see happen?

It’s interesting that most people take steps such as installing an ad blocker because they just find ads annoying, but I think the most problematic ads are the ones people don’t even perceive as annoying, because they never enter their consciousness. The ads that work on a subconscious level, that happen peripherally — like banner ads. Those are the kind of ads we should be most worried about. I would love to see ad blockers that can actually distinguish intent-based ads from attention-grabbing ads. That would be a great step towards nudging the industry in the right direction.

One of the big failures of measurement in advertising — and digital technology at large — is the failure to really measure and understand user intent at higher levels.

For example, I am searching for red running shoes. Maybe that’s because I want to start running. Or maybe my main goal is to lose weight, or prepare for a marathon. I think technology right now isn’t very good at capturing the intent behind the intent, the why behind the what. We have to rethink what the purpose of advertising is in an information-abundant world, and how we can realign measurement and persuasive engagement with people’s intent. In an ideal world, advertisement would be sponsored support that helps us reach the higher goals we have set for ourselves.

On an individual level: What steps can people or end users take?

The first step is awareness. Most people are not aware that the devices or apps they are using are actively trying to capture and keep their attention. We need to learn to value our attention for what it’s worth and realize that we’re paying for free technology with our attention, so it’s not really free. There are some apps today that can help you be more aware of how much time you spend on your phone. There’s this great app called Moment that helps you track how much time you spend on your phone and which apps you use, and if you like, you can set daily limits. There are some other helpful resources on Time Well Spent. It’s a project I co-founded that aims to steer technology design towards having greater respect for users’ attention.

I’m always a little hesitant to talk about what the end user can do, because I really want to avoid a discourse where we put questions of attention primarily on the people using technology. Most news articles written about distraction through technology seem to come to the conclusion that it is the user’s responsibility to work harder, to self-limit, to set boundaries. The truth is that there’s a system out there of thousands of the smartest people trying to undermine your willpower and commandeer your attention, but the answer always seems to be, “Well, you should have more willpower.” That seems paradoxical to me.

At the end of the day, I don’t think user behaviour is the answer to developing long-term sustainable solutions, because users aren’t the ones who hold most of the power. Like I said, I think we need to start finding smart solutions that counteract a system that is trying to capture and exploit people’s attention.

If you want to learn more about James and his research, check out some of these resources:

  • Time Well Spent | A project that aims to steer technology design towards having greater respect for users’ attention.
  • CBC Business News | James talks about the growing disconnect between what technology is trying to accomplish, and our human goals.
  • CBC Radio Spark | An interview with James on the harmful effects digital technologies can have on our democracy and politics.

