Self-terminating technology & the return of the Sacred, Tristan Harris
Can our shared reality survive the onslaught of big tech?
At a time when existential threats loom large — and the need for a reasoned, pragmatic consensus has arguably never been greater — the tech firms carrying the conversation are profiting from our division. Through careful iteration and deliberate commercial policy, their business models have made us addicts of our newsfeeds and pawns in a game of polarisation. Yet these outcomes can only be self-terminating.
Tristan Harris has been called the “closest thing Silicon Valley has to a conscience.” After three years at Google as a Design Ethicist, he co-founded the Center for Humane Technology, whose mission is to re-align new technology with our ethical goals. Rolling Stone named Harris one of “25 People Shaping the World”, and he placed in Fortune’s 40 under 40 of 2018 for his work in technology. He hosts the Your Undivided Attention podcast with Aza Raskin, the co-founder of the Center for Humane Technology.
Perhaps more than anything else, the problem of sensemaking will define our civilisational course. Finding meaning, truth and action in our surroundings underpins everything we do. And in the digital era, a real solution can only start from the source code up.
With Rebel Wisdom’s David Fuller, Harris makes an impassioned case for a new approach to sensemaking. Attention is a ‘sacred’ — not an economic — space, and to act otherwise risks a further lapse into division and suffering. By a careful reverence for our capacity to pay attention, we can sidestep and solve the excesses of the online world, and find a healing path towards real communication.
See below for edited excerpts of Tristan’s interview.
David Fuller: A key part of sensemaking is understanding how we’re being manipulated. Why do you think that the sensemaking crisis is central? How would you frame it?
Technology has become the sensemaking instrument for three billion people — and I think that’s never been more true in a coronavirus era. Because with many of us stuck at home, we are peering through the glasses, the telescope, of social media to understand what’s happening in the world.
When I was at Google as a design ethicist, I made this presentation back in 2013 saying that never before in history have essentially fifty designers at three tech companies determined what two billion people’s attention is going to be on a daily basis. And that we, as those technology designers, have a moral responsibility in holding the kind of collective consciousness carefully — because you don’t have a choice. I mean, we can’t not hold it. We can’t just take our hand off the steering wheel and just let chaos rule.
And when you use algorithms, automated programmes, automated rules — like whatever gets the most clicks should go to the top of people’s feeds, or whatever gets the most likes or shares — they tend to select, as everybody now knows, I think, for outrage or the extreme. That’s what causes what we call the ‘race to the bottom of the brainstem’ for attention.
David Fuller: Social media has just raised the intensity of everything. Suddenly we’re even at [a point when] our conversations are becoming performative. We’ve got this sense of the hollowing out of the private sphere. Do you find that a useful metaphor?
Aza Raskin, the co-founder of the Center for Humane Technology, has a metaphor of origami. Origami has these pre-folds. And if you fold along the pre-folds, you get a beautiful picture of a constructed object. If you don’t fold on the pre-folds, you get something that’s just kind of messy and doesn’t really work. And I think that when it comes to human nature, there are pre-folds to our needs and values and how we make sense of the world.
On the sensemaking side, there are pre-folds around trusting that, if people around me are saying something is true, I tend to believe that that thing is true. But social media is distorting all of those signals: all of those heuristics, all of those vulnerabilities of the human mind, at every single level.
And when the digital infrastructure eats the physical infrastructure, it takes over the basis from which we’re making sense of the world: the way we’re thinking, the way we’re waking up in the morning, the way kids are developing, especially in a COVID era where kids are at home and they’re using digital technology for many more hours per day. So, if we don’t have a good understanding of what the fit is between technology — sort of like a brain implant into a society — then we’re gonna get that wrong.
And if you’re gonna put a brain implant into a human being, you need FDA approval. When you take social media and put that implant into a society, there is no FDA approval. We’ve never run this experiment before.
David Fuller: I’m seeing more and more that we’re losing all sense of a shared reality. We’re losing any sense of common purpose, which just seems like a fundamental problem.
Yeah. I think the simplest thing for people to get is that polarisation is profitable because these platforms make money from people’s attention. I mean, how much money have you paid for your Facebook account in the last few years? Zero. And yet the company is valued north of six hundred billion dollars. And the reason for that isn’t just that they ‘have your data.’ It’s that they actually need your attention.
The attention economy is a finite playing field. And the win-lose game that’s being played means that it gets more and more competitive: if I don’t get your attention, someone else is right there to get it. But now it’s gotten worse because, to get attention within that sphere, I have to use outrage. So both sides, the left and the right, are using outrage to get people’s attention. And then that just further cleaves the divides.
If you do not have a shared reality in a world of existential threats — where finding consensus on short timelines, like with climate change, is critical and urgent — that’s the self-terminating system right there.
We really need a different kind of media environment that both helps us find consensus and then privileges action. And Taiwan is actually a good example of a digital democracy that works. It’s not that social media is inherently incompatible with democracy. We have to ask the question, ‘What is the version of it that actually works?’
David Fuller: There’s a strong belief, especially on YouTube, that free speech must be an absolute. But is that possible in a world of finite attention?
This is one of the most critical issues of our time, and we’re missing the philosophical distinctions we need. It’s important to realise that free speech was valued philosophically by the Founding Fathers at a time when we had an abundance of attention. But [something] we never thought about is [that] there is a finite number of years to hear everything. The premise that the solution to bad speech is more speech depends on the idea that there was more attention, and that the viewer had the patience to see the counter-narrative.
And who do we trust to adjudicate what should and shouldn’t be amplified or said? One of the ironies here is that social media has had an accelerating effect on the delegitimisation of our sensemaking institutions.
The second thing I want to say about this is the distinction between freedom of speech and freedom to reach. We all have, according to the U.S. Constitution, at least a right to freedom of speech. But we don’t have a God-given right to stadium-sized audiences of sixty thousand people. That’s nowhere in the Constitution.
[Yet] that’s a subtle point of Facebook’s business model, the business model of the attention economy — it’s more successful at getting our attention when they give each of us a narcissistically bigger and bigger audience that we can reach, because then we’re more and more addicted to using those platforms.
And in fact, it’s dangerous to reach hundreds of thousands of people if there’s no ethics or no control or no notion of what is an ethical, decent, responsible way to contribute to the information environment. Any of those rules that have been hard won lessons for people in journalism and media ethics, those go out the window when you suddenly have legions of teenagers with Instagram accounts that reach fifteen thousand people each.
David Fuller: I understand the issue with the Russian narrative. I just see that so many people, when they talk about these problems, are often also making signalling moves to other people within [Jordan Hall’s] Blue Church paradigm or the mainstream media paradigm. And there’s just no communication across these divides.
Which is ironically a side effect of the polarising processes of social media. I think the important thing to realise is that we are in a full force, unconventional narrative warfare environment.
The challenge of what’s happened with social platforms is that they’ve taken over — they are the new infrastructure. So, our physical world, our world of atoms, is all protected under our national boundaries. If Russia or China tried to fly a plane into the United States, the Department of Defence is going to shoot it down and make sure that that never happens. But when the digital eats the physical and you move up away from the physical to the virtual — and Russia is trying to fly a plane into Facebook — they’re not met by the Department of Defence.
Yet we think we’re in some kind of stalemate or a Cold War. But it’s not the case at all. We have no digital borders. While we’ve been so paranoid about our physical borders and building the wall, we’ve left the digital borders wide open and the United States is actually not protecting its infrastructure very well at all.
David Fuller: I wonder if you have any thoughts about what can be done about this spiral? I’d like to throw in an example about the creation of the BBC, which was an arm’s length way for the government to manage new technologies [radio and television]. Because they feared that there was a level of coordination and propaganda that wasn’t possible before.
Absolutely. We make this metaphor link all the time. The question is, how do you get back to a publicly interested, BBC-style public broadcasting model, but more of a public social media? That’s close to some of the work that we’re doing at the Center for Humane Technology, which is trying to move ourselves off bad incentives.
The other problem here is just the problem of runaway predatory capitalism. There are all these side effects from the privatisation of human attention as a commodity as opposed to something sacred. Now, that’s not saying let’s dismantle all of capitalism — that’s just saying, where is the boundary that makes sure we have a sustainable, non-self-terminating civilisation?
David Fuller: You mentioned in the conversation with Tim Ferriss, that we’ve tied business success to capturing human beings. Addiction is basically the model that we’re running on at the moment. Can you just unpack that?
This is a game of power. And when the game of power is happening in the psychological domain, that game is happening over a substrate of human attention. And in that world, addiction is going to be more profitable than a non-addiction-based product.
How do you solve a problem like that? You have either regulation, or an Apple or Google, who are actually the invisible regulators of the attention economy. We need to change what we’re competing for, which is very much like with our system of markets. We don’t want to be competing just for extraction and demand and profits. We want to be competing for social outcomes: things that make our society better, that reify soulfulness and conviviality and the kinds of things that make life worth living.
David Fuller: I find the framing around the sacredness of attention really, really powerful.
It is. And it’s funny you’re mentioning it, because it’s hard for people to value it to that extreme. If we were ever to reclassify it as sacred and not something that we extract, not something to be commodified, not something that we treat as dead slabs of manipulational potential, that means reversing out of a whole bunch of GDP. [Yet] that doesn’t change the fact that attention is sacred, because it’s the foundation of choice.
But if we really need to make new choices and put new choices on life’s menu, then we need to be able to have a basis of attention, a basis of consciousness, that allows us to do something different or new, to think something new that we weren’t thinking before. And we have a system right now with social media that narrows the space of human sensemaking and choice by reinforcing the old biases.
I think of it like the birth of the Environmental Protection Agency or the National Park Service. Can we have an Attention Protection Agency? We need to protect attention and treat it as sacred. And that’s part of the work that we’re trying to do.
Rebel Wisdom is the cutting edge of a new counterculture: www.rebelwisdom.co.uk