Tech & Advertising Are Not Evil
Sweeping generalizations are misleading public sentiment
The conversation around the attention economy has exploded. No doubt in part due to the role of social media in the last US presidential election and Brexit, along with some amazing advocacy work from people like Tristan Harris and writing from figures like Tim Wu.
We’re also starting to see mainstream journalists address the issue, and thought-provoking films about how this culture of influence is impacting mental health — especially in the younger generation. Princeton just announced that James Williams’ Stand Out of Our Light — a book all about the attention economy — will be a major topic of discussion on campus for the incoming class.
It’s been almost 3 years since I first wrote about the attention economy. Since then, I’ve done a lot of work and speaking on the topic, and three pieces of confusion keep coming up. One of them is the role of mindfulness — I’ve addressed that in a previous article. Today, I want to write about the other two:
1. Technology is Not Evil.
I keep hearing the misconception that attention activists are rallying against technology. Technology is not the problem. In fact, I find it hard to pinpoint what we mean when we use this vague word. People in mindfulness communities say “tech is ruining society” just as often as people at tech startups say “tech is going to solve all our problems eventually.” Are they talking about every example of human creation? The potential of all human endeavor? Are they talking about computers in a general sense? Or are they talking specifically about an app on their smartphone?
The etymology of ‘technology’ suggests this term refers to the doctrine, discourse or theory of creating something with a certain level of art, skill, or craft. Okay, so technology is pretty much everything. You listen to music that was made with the technology of instruments, distributed with connected technologies, and played with the technology of amplified speakers. The chair you’re sitting on is technology. Hospitals are full of life-saving technologies. If you remembered to eat this morning, the tools you used to prepare breakfast are also technology. So it’s a bit of nonsense to brand technology as a whole with a label like ‘good’ or ‘evil’.
The real issue lies in how we apply technology: the values we manifest and scale exponentially, the ethics of design, and the simple incentives motivating us. Why are we creating a given technology? How might it be used? What would it look like if this particular design “ate the world”? Our lives and our society are complex and multi-faceted. As we create exponential technologies with simple, myopic incentives, we’re accidentally disrupting pillars of our civilization like mental health and democracy.
Tech isn’t the enemy, it’s just that we’re mostly using it to scale selfish values and simple organizational incentives into monsters we can barely control. We can’t put the genie back in the bottle, but if we can work on ourselves, our values, and our systems of creation, we may have a chance at redirecting this massive cruise-liner before it hits more icebergs (if they don’t all melt and drown us, first).
2. Advertising is Not Evil.
This second point of confusion is a sticky one. When I spoke about attention activism at Harvard last year, someone came up to me afterward and challenged me: “If those of us who are trying to do the right thing don’t advertise, how are we gonna get anywhere?” This comes up a lot. The assumption in the question is that I’m somehow rallying against advertising, as if I think advertising is evil and that no one should ever market anything. I struggled to articulate an answer in the moment, but the question lingered on my flight home from Boston.
I still don’t feel full clarity on this, but I know I don’t think marketing and advertising are somehow inherently bad or unethical. These catch-all terms cover any attempt to get the word out about any initiative, whether we’re talking about a new shop in your neighborhood, a homeless shelter, or the next shiny app coming out of Silicon Valley. As attention activists, we’re rallying against ‘manipulation’ more than ‘advertising’. There’s a blurry line between these two terms, but we can paint the extremes clearly.
If you’re letting someone know about your offering, putting a message in a public place where they can choose to ignore it or engage with it, that seems fairly ethical. Even if you make that message a little catchy and put it somewhere you know your potential audience might visit, that doesn’t seem like a problem to me. Unfortunately, this form of basic advertising is slowly becoming a thing of the past. Each element of the equation has become more powerful and more surgical. We now know much deeper ways to manipulate anyone with a brain and body.
In modern advertising, we use tactics to trigger people’s animal instincts in ways they can’t ignore. Maybe you’re using sexy, objectified bodies doctored to unrealistic proportions, eating almost-pornographic images of junk food (which would make those bodies even less possible). Maybe you’re using an artificially intelligent algorithm that auto-targets those ads to dieting communities, because it found those click-through rates to be through the roof. Maybe you’re triggering people’s in-group bias and negativity bias by posting an article full of moral outrage about people who are different from them. Maybe a political group paid for that article as a “native ad” — a form of high-performing paid advertisement disguised as content so people can’t distinguish fact from fiction.
Now we’re deep in the territory of manipulation, where individuals can scarcely control — or even detect — their response to such dark patterns of influence. Curiously, when I’ve spoken with baby boomers about attention activism, many say things like “Oh, advertising doesn’t affect me, I just ignore it.” Oh, how we’ve since cracked that blissful hubris of thinking we’re in full control of our bodies and minds.
When we’re influencing people by intentionally sidestepping their own judgment and awareness, it feels a lot more like manipulation. When we’re using short-term strategies to change people’s behavior in a way they might regret later, it should be hard for us to sleep at night. When we’re creating platforms to democratize tools to let anyone manipulate others for cheap without oversight, and scale those tools to billions of people, we need to slow the fuck down.
Most of us aren’t at the extremes. For many of us, the challenge lies in the space between. I see the nuance of this quite clearly in my work designing mindfulness-based technologies. I tell my partners time and time again: just because we’re helping people meditate doesn’t mean we’re inherently ethical. No matter what you’re selling, none of us are afforded the right to say whatever we want to market our wares just because we think we’re doing the right thing.
Join the Cause!
If you’re inspired by the intersection of mindfulness, technology, and design, I send stream-of-consciousness thoughts on the topic every week or two — you can reply directly and we can discuss.