Governing in the Age of Surveillance Capitalism
INTRODUCTION: ZERO-DAY EXPLOITS
The world has changed more than we recognise.
The conjunction of pervasive surveillance, platform capitalism and artificial intelligence has pushed the entire world into a sort of ‘post-real’ era.
The mirror of the post-real is ‘fake news’ — which seems to be anything we don’t want to hear.
We live inside systems that edit what we see and hear of the world around us, ‘curating’ the overwhelming flow of human sharing into something that allows us to maintain the pretence of stability.
We have quickly lost sight of the fact that these systems are entirely artificial.
They’re curating us into an imaginary world where everyone agrees with us, sees the same things as us, likes the same things as us, and knows the same truths as us.
That was never true, but now it feels true.
In a world where there are a few billion voices simultaneously clamouring to be heard, we feel an instinctive need to be connected and understood.
Even if that is entirely an illusion.
We need to start here because it is important to recognise that the ground we stand on is, in 2017, almost entirely made up.
That’s made it difficult to reach consensus, because consensus first requires an agreement on what is real.
“Everyone is entitled to their own opinion, but they are not entitled to their own facts.” That’s what US politician Daniel Patrick Moynihan said, forty-five years ago.
Facebook presents a counter-argument: Everyone is now entitled to their own facts.
That’s become the day-to-day reality of our world, and its politics.
It’s not clear how we govern in the post-real.
So that’s what we’ll focus on: this problem is already here, and it’s not going away.
Over the next few years, nearly all questions of governance will emerge from misunderstandings in the post-real political landscape.
We’ll take two tours through that landscape — telling the same story, but in different ways.
We’ll begin with the ideal case.
DAY ONE: GRAVE NEW WORLD
Your alarm clock goes off at 7:13 AM, and you can’t dawdle in bed — otherwise the cup of coffee that’s been brewed to perfection will get too cold.
You shower and dress and as you walk into your kitchen the screen shows you a livestream of the morning news, a pastiche of video coverage perfectly illuminating all of the interesting aspects of your world.
There’s a touch of international news, national news, local news — and a heartwarming human interest story.
It all leaves you feeling very satisfied as you grab your satchel, open the door, and find a vehicle at the curb, waiting for you to climb inside for the quick trip to the office.
The car has no driver, of course, and you’re surrounded by thousands of other driverless vehicles, each weaving through the peak hour traffic in a ballet of machine coordination.
There are no stop signs anymore, nor any stop lights. The cars know what they’re doing and they do it so well, as everyone wakes on time and departs for work on time and it all goes perfectly to plan, every morning, rain or shine.
So good: when you get to the office, the elevator is there, waiting for you, to take you up to your office on the 27th floor!
After the pleasantries about the weekend — ‘did you catch the match on Saturday night?’ — and the normal emails, read aloud, answered by voice, and efficiently dispatched — the first meeting of the work week: a department secretary, looking for some help: Vaccination rates are down among the youth in particular communities — affluent, well-educated and well-connected.
In a burst of nativism, the anti-vaccination movement, which took hold in the early days of social media, has occasional flare-ups.
This one could precipitate a nation-wide outbreak of measles.
If that happened, tens of thousands would fall ill, and hundreds would die — needlessly.
No one wants that.
But the department secretary is at a loss — how do we reach these women (and they’re nearly all women) who have decided to withhold vaccination from their children?
They can’t be lured in by economic means.
Denying benefits means less to them than their right to keep their children protected.
So what do you do?
You know the answer to that, and lay it all out for the secretary.
First, you run an analysis of all of the posts within these communities.
Patterns of public sharing immediately illuminate the members of a network, and particularly highlight the members who are key influencers.
Those key influencers are about to run into a bit of luck, with amazing opportunities opening up to them, keeping them so busy that they naturally lean back from a full-throated contribution to these networks.
This weakens the links within these networks — without inspiring any resistance.
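The influence-mapping step described above can be sketched in a few lines. Everything here is hypothetical — the names, the edge list, and the simple degree count stand in for whatever proprietary graph analysis such an operation would actually run:

```python
from collections import Counter

# Hypothetical interaction data: (author, responder) pairs harvested
# from public sharing within the targeted communities.
shares = [
    ("ana", "bea"), ("ana", "cli"), ("ana", "dee"),
    ("bea", "cli"), ("eve", "ana"), ("eve", "bea"),
]

def key_influencers(edges, top_n=2):
    """Rank accounts by how many interactions touch them (degree centrality)."""
    degree = Counter()
    for author, responder in edges:
        degree[author] += 1
        degree[responder] += 1
    return [name for name, _ in degree.most_common(top_n)]

print(key_influencers(shares))  # ['ana', 'bea'] — the accounts to 'distract'
```

A real analysis would weigh reach, reshare cascades and betweenness rather than raw degree, but the principle is the same: find the few nodes that hold the network together.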
For the rest of the network, intensive emotional profiling.
You point out to the secretary that we can, through an analysis of shared content, know exactly how anyone is feeling on any given day — whether happy or sad, whether optimistic or pessimistic.
And, more than that, you say, we can actually nudge their feelings.
How’s that? asks the secretary.
Well, we can use that emotional content to determine when they’re most open to particular kinds of messaging.
On a day when someone is feeling a bit down, you might use that as an opening to place a bit of paid content into their news feed, something that might cause them to question the safety of their position.
You do it gently, slowly, over a period of weeks and months.
And you do the opposite: when someone is feeling good, you feed them stories that help them associate that feeling with following doctor’s advice.
That compliance is part of feeling good.
They don’t see it consciously, but within two months, the associations are there.
Vaccine fears are quelled, and the feeling of uncertainty associated with having an unvaccinated child is amplified.
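The mood-timed targeting described above can be sketched as a toy filter. The word lists, scoring and threshold are all invented for illustration; a real profiler would use far richer models trained on each user’s history:

```python
# Illustrative word lists — a crude stand-in for a trained sentiment model.
POSITIVE = {"great", "happy", "proud", "love"}
NEGATIVE = {"worried", "tired", "alone", "scared"}

def mood_score(posts):
    """Crude sentiment: positive word hits minus negative word hits."""
    words = " ".join(posts).lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def open_to_fear_appeal(posts, threshold=-1):
    """A 'down' day — score at or below the threshold — is the window
    for placing doubt-inducing paid content."""
    return mood_score(posts) <= threshold

print(open_to_fear_appeal(["feeling tired and a bit alone today"]))  # True
```

The uncomfortable point is how little machinery the technique needs: a mood estimate and a trigger condition are enough to decide when a message lands.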
That’s when you reach out to them directly, with a fantastic offer.
They’re a winner now.
And while they’re feeling like a winner, the doctor’s office messages them a reminder about their scheduled vaccinations.
She agrees. Problem solved.
That sounds ideal, says the secretary. How do we begin?
You see the secretary off with a handshake. Your teams will negotiate the specifics. And you smile — because you did a great job.
You really love your work. Today your work made the world a better place.
The rest of the day is a blur of meetings, bringing staff up to speed on this new client.
Everyone knows what to do: the data scientists get their databases spinning, the ‘emotional engineers’ break out their dashboards and social graph analysis tools, while the message engineers set to work handcrafting a few news stories — heartbreaking accounts of deaths from whooping cough and measles — that will be dropped at just the right moment, into selected news feeds.
It’s all going to plan.
DAY TWO: HE MANAGED BIG BROTHER
Your alarm clock goes off at 6:45 AM. Earlier than needed — probably.
But you can never be sure — so you dash through your shower, dressing and breakfast, grab your bag, and head out to the stop at the end of the street.
And wait — for what seems hours — until a share vehicle arrives.
You only have a fifteen second window to get into the vehicle before it departs.
You’ve already missed it once this week — it never arrives at exactly the same time, and always arrives without warning — so you’ve taken the safe approach.
You can’t really afford any more demerits at work.
Traffic sucks, as always, the constant stop-and-go of vehicle battling vehicle through traffic that always seems to be getting worse, no matter how many restrictions on drivers or share vehicles get imposed.
But at least you make it to your office block with a few minutes to spare, just long enough to queue for a coffee downstairs before heading up to your 29th floor offices.
The barriers admit you, but you see one of your co-workers, outside the barriers, dissolved in tears.
She looks at you, and in a glance, you know what’s happened: Too many demerits.
She’s on suspension, possibly even fired. Maybe she missed her ride too many times this week.
The elevator is quiet on the ride up, as everyone ponders their own precarity.
The morning starts with the obligatory water-cooler chatter about who’s not at their desks this morning, and whether that’s illness — or something more terminal.
Several folks haven’t been carrying their weight, and that sort of thing gets noticed, the demerits build up, and, well, things take their natural course, don’t they?
Then there’s one or two new faces from the temp agency, bursting with pride at being given such an incredible opportunity to move out of the pool of fully replaceable labour into something that feels more permanent. Something with room to grow.
You wonder if they can handle the pace.
You’ve learned not many can — and that reminds you there’s still plenty to do before your first meeting.
A trove of emails from superiors to be actioned, plenty from subordinates looking for direction, and some customers to be managed — all of it a continuous stream of heavy work until the Big Meeting of the day, with a department secretary looking for some help: agitation rates are way up among young adults in affluent, well-educated and well-connected communities.
The anti-growth movement, which took hold in the climate crash of the early 2020s, has slowly gained strength, and it’s beginning to organise successful boycotts against businesses.
They’ve been small ones — but they show the growing potential of the movement.
They could really affect a national economy just finding its footing after the worst slump in half a century. And that, the department secretary intimates, can’t be allowed.
Is there anything you can do for us, the secretary asks.
Absolutely, you answer. The key is disruption — break up the networks.
But it has to be done carefully, because if we’ve learned anything over the last years, it’s that hacking a network to bits is exactly the wrong way to go about things.
They tend to move into the dark web. Makes them that much harder to track.
Here’s where you show your worth: You type a few commands into the keyboard, and the various national datasets appear, social media activity, cross-referenced demographic and census data, plus profiling and mobile tracking…
Within a few minutes, a wide-eyed department secretary stares at a map of Melbourne’s richest suburbs, showing the ‘hot spots’ of agitation.
That’s them identified, you say.
Now we merely make things harder there.
Just little edges. Delays. Infractions. Difficulties. Nothing anyone can point to — not directly.
But enough to start to chip away at both their confidence and their demerits.
We already know that people who get close to the red line with demerits become vastly more compliant.
So we produce conditions that help them fall just far enough that they get a real sense of the abyss.
Some will fall through. That’s always going to happen.
But then, when that happens, they fall into the well-managed arms of the state.
For the rest — after a bit of a scare — an offer.
An opportunity, a first rung on the ladder.
We’ll wait until they’re at their weakest moments of self-doubt — we can track that, in real time — and use that as a trigger.
They’ll take the deal. They always do.
The department secretary smiles, satisfied. No more needs to be said.
The rest of the day is a blur of meetings, bringing staff up to speed.
Everyone knows what to do: the data scientists get their databases spinning, the ‘emotional engineers’ break out their dashboards and social graph analysis tools, while the gamification folks develop a system of nearly invisible but pervasive punishments — that will be dropped on these agitators, at just the right moment.
It had all better go to plan. You can’t afford any more demerits.
DAY THREE: SURVEILLANCE UTOPIANISM
The second of these versions of the future (‘He Managed Big Brother’) belongs squarely in the emerging trend of ‘surveillance authoritarianism’.
Yet it’s a soft authoritarianism.
There’s nothing obviously ugly about it, just the kind of continuous grinding down that keeps a population broadly compliant with the dictates of the state.
That’s one version of the future.
Only it’s not the future, it’s the present.
For the last year it’s been reported that the Chinese government is developing what they’re calling a ‘social credit’ system.
Every citizen will be issued a ‘rating’ that will be drawn from and generated by their participation in civil society.
Every microblogging post to Weibo, every link shared, every website visited, every block walked, every purchase made, every hour worked, all of it will eventually factor into an individual’s social credit.
Most of that infrastructure for surveillance is already well in place.
The rise of smartphones, and the deep collaboration of the ‘big three’ — that’s Baidu, Alibaba and Tencent, over in China — mean that the Chinese government has all of the tools they need to gather data on their citizens continuously, and can adjust their social credit ratings in real time.
While it’s not clear what it will mean to have a high rating, it’s already been made clear that individuals with a low social credit rating will find it hard to find jobs or housing or educational opportunities.
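A toy version of such a scoring scheme makes the mechanism concrete. The behaviours, weights, base score and threshold below are all invented — the actual inputs to China’s system have not been published:

```python
# Invented weights over tracked behaviours — purely illustrative.
WEIGHTS = {"posts_flagged": -10, "bills_paid_on_time": 2, "hours_worked": 1}

def social_credit(activity, base=100):
    """Fold each observed behaviour count into a single running score."""
    return base + sum(WEIGHTS[k] * count for k, count in activity.items())

def services_available(score, floor=90):
    """Below the floor, jobs, housing and schooling quietly close off."""
    return score >= floor

citizen = {"posts_flagged": 2, "bills_paid_on_time": 4, "hours_worked": 38}
score = social_credit(citizen)
print(score, services_available(score))  # 126 True
```

Note that nothing in the scheme is a punishment in the legal sense — it is simply a gate, recalculated continuously, which is precisely what makes it soft authoritarianism.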
Effectively, the entire apparatus of the Chinese state will turn its back on these citizens, until they mend their ways.
Again, this doesn’t feel much like the authoritarianism we’ve seen in Xi’s China, where dissidents get kidnapped, disappear for months to years, then reappear confessing before televised show trial audiences.
And for that reason, it’s deceiving.
Because we can’t see the whip hand, we also can’t see how fear of that invisible whip shapes our actions — or, in this case, the actions of the Chinese public.
Yet in so many ways even this soft authoritarianism is mired in the views of the 20th century, a Stalinist repression of any dissent, which, if history is any guide, inevitably breeds increasingly effective resistance.
Orwell laughed at the futility of that resistance in Nineteen Eighty-Four, and it remains an open question whether the modern state, equipped with the full range of surveillance and psychological monitoring techniques, can evade that fate forever.
But surveillance authoritarianism tempts fate, and that reason alone is enough to avoid it.
The first example (‘Grave New World’) illustrates what I am now calling ‘surveillance utopianism’.
That’s quite close to the world we’re in already: where Alexa can answer every question and Uber service every need for transportation, where sensors summon elevators and where the vast resources of surveillance capitalism — the profiling and machine learning resources of Facebook and Google and Amazon — make the world our oyster.
That world seems quite wonderful, but surveillance utopianism comes with its own cost: The individuals within it are known so completely that they can be manipulated thoroughly.
Again, this is not a hypothetical from a world that does not yet exist.
On the first of May, The Australian reported that Facebook had shown its Australian clients a slide deck that offered the capacity to reach teenagers at their most vulnerable moments.
Through analysis of their social sharing activities, Facebook could let an advertiser know just when a purchase would do the most to boost the confidence of these teenagers.
Facebook denied it, apologised for it — but the report had been authored by the head of Facebook’s Australian operations.
That’s how they operate. Facebook are in the business of emotional monitoring and emotional manipulation.
This profiling is continuous, ubiquitous and pervasive. There’s almost nowhere we can go that is out of view of these systems, nor any activity we can partake in online that is not carefully analysed.
This analysis generates an ever-more-realistic behavioural model for each of us, and that model is then used to select the best, most effective moments to deliver specific messages.
Mostly those are consumer offers, but — as we’re now learning happened with Brexit — they have been used by foreign powers to propagate fake news reports through communities.
This gentle and loving surveillance utopianism operates by narrowing the space of possibility, hemming the individual in by constraining what they see and how they come to see it.
Since so much of what we believe we know about the world comes to us through Facebook — users spend an average of fifty minutes a day looking at their news feeds — it’s an easy and pleasant mechanism of control.
And here we are, caught between the Scylla of surveillance authoritarianism on one hand and the Charybdis of surveillance utopianism on the other. Both are already real. Both have been or are being deployed right now.
And both will be offered to governments as policy tools.
Here’s the thing you learn about the future — as William Gibson observed, it’s already here, it’s just not evenly distributed.
Bits of it are popping up in China, and a few more here in Australia. They show us that this future is going to be everywhere, a part of everything — and that these will be the tools offered up as indispensably useful elements of public policy.
PART FOUR: ONYA
People will be coming to government departments and department secretaries, offering them incredible tools that will bestow incredible powers.
These tools will be incredibly tempting.
They will solve so many problems.
They will help manage the public.
And you will be presented with a choice: does government use those tools?
Does government use pervasive surveillance as an essential instrument of social regulation?
And if so, where do they use it, when, and to what end?
But before that, there’s another question, an important question, the question that we seem never to ask, but which we must ask insistently, before the use of these tools becomes standard operating procedure: should government use these tools at all?
Let’s take a look at three lenses through which these tools — and our choices around them — can be seen.
It’s thought that these tools for mass manipulation work because people are unaware of them.
I don’t know that this is provably true — people seem to continue to use Facebook no matter what they learn about it. So there’s that.
Yet that doesn’t imply a blank cheque for any kind of monitoring or manipulation.
Where people’s activities are being monitored by their governments, their governments face a choice about whether to inform the people of that monitoring.
A lot of that informing and permissioning is hidden away deep in the terms-and-conditions pages where we invariably click “OK”.
Is that informed consent?
Is that transparency?
In the raw legal sense, perhaps, but in the spirit of the law, absolutely not.
These profiling organisations do everything in their power to disguise their own activities in order to hide the amount of power they have already amassed.
They are anti-transparent, and if governments use their services to further their own ends, they are also being anti-transparent.
Lack of transparency is the necessary precondition for surveillance utopianism.
Conversely, transparency — at least with respect to surveillance — is a necessary precondition for surveillance authoritarianism.
Tell people they’re being watched all the time, everywhere, and their behaviour will change. They will self-censor.
All of which is to say that there’s no perfectly right answer here.
Hide the truth and manipulate; share the truth, and manipulate.
The only way one evades becoming a manipulator is by refusing to use these tools.
And is that a realistic option?
There don’t seem to be any easy choices here, simply dilemmas we’ll need to face.
If the future is going to be manipulative — if there’s no easy way to evade using these tools — then at least we must guarantee that those who use them will be able to be held fully accountable for the consequences of their use.
This is really just another type of transparency — the transparency of power.
And power hates being transparent.
It’s so much easier when you can carry on your affairs unobserved and undisturbed.
But in democracies that choice is not on offer.
When the decision is made to use these tools, they must be used within supporting frameworks that provide oversight and accountability in their use.
Without those frameworks, there are no mechanisms to regulate or control their incredible capacities to manipulate the public.
That’s something that will never end well — neither for public servants nor for the government.
Accountability is essential risk management.
But to get to accountability, we need supporting frameworks. And we don’t have many of these.
Various privacy statutes can be used as a basis, and whatever protocols exist for data sharing and data security, but none of this touches on the ‘weaponisation of influence’ that is the basic capacity of these new tools.
We need to have a deep think about what it means to know the public and individual members of the public so well, via their profiles, that they become wholly manipulable.
Public servants have never had that power before, so they’ve never had a need to manage that kind of power.
Until the public service feels confident that it can manage that power well, it cannot offer those powers to the government.
Again, that puts the government at too great a risk.
This is the time to start developing those frameworks.
How does the public service hold itself accountable for the manipulation of people?
And what happens when things go wrong? (Because things will always go wrong.)
This is not a theoretical conversation.
It’s already possible to develop a campaign of manipulation that would help achieve a department’s policy goals.
But what are the rules?
What are the frameworks?
Where is the risk?
Those questions must be answered — and soon.
Because these tools are only growing more capable.
One of the uncomfortable truths of this moment is that almost all civic dialogue now takes place in a space that’s wholly owned by a single company — Facebook.
I don’t presume any malevolent intent on their part; but they are a commercial organisation, and where public interest conflicts with private gain, there are always going to be problems.
Some of the problems we have today arise from Facebook’s role as the default tool for social and civic collaboration.
Given the nature of these tools — designed for surveillance, monitoring, and manipulation — they cannot create trustworthy civic spaces.
People need to feel they can have an open conversation in an environment where they feel free to share, converse, and build consensus.
If we’ve learned anything about Facebook, it is that — by giving everyone exactly what they want to see — it has proven fundamentally corrosive to consensus.
We can ask Facebook to change its ways, but as they’ve built a business model on “increasing user engagement”, that’s not going to be easy for them.
Meanwhile, all of the other tools for community building — which were thriving before Facebook — have atrophied.
We’ve forgotten how to forge communities — both online and in the real world.
This represents both an immediate problem and a unique opportunity for all levels of government.
We need to ask ourselves what kind of civic culture we want, then build the tools to support that culture.
And we can’t do this alone.
That’s not the job of one department or even the whole of government.
That’s an exercise in nation-building that quite literally involves the whole nation.
And it’s not just digital.
It’s about forging the space for conversation — where people who have very different views can work to reach consensus.
That’s a lot easier to do when you’re looking someone in the eye than when you’re tapping away at a keyboard.
All these digital toys are pretty, but they’re insufficient — and sometimes they’re actually harmful.
We’ve told ourselves that good government requires technology.
In reality, good government requires transparency, accountability, and collaboration.
Those three goals are what we will need to keep front of mind amidst a rising Siren’s song that promises capacities for control we’ve never dreamed of before.
Capacities that are already here.
Capacities that are already on offer.
Big thanks to Zeynep Tufekci — whose work on surveillance capitalism has been deeply influential.
More thanks to John Robb, who understands networks better than most.
And finally, thanks to the crew at Civic Hall, who posed some great questions that I’ve reframed here.
I have a lot more to say about Facebook and the post-real in my essay The Last Days of Reality — published in the Summer issue of Meanjin.