Wylie’s War

Marcia Stepanek
7 min read · Aug 1, 2022

--

I remember seeing people who had been targeted with misinformation, and who had been sort of massaged online into believing certain kinds of conspiracies, and how angry these people were and how they started to engage in highly racialized thinking. To see the rage in their eyes and their faces and what that looks like—and to see what a manipulated person starts to look like? For me, it was really eye-opening. …Stories today are not just being used to distort reality, but to create new ones.

Christopher Wylie, social researcher, data scientist and whistleblower

Since I interviewed Cambridge Analytica whistleblower Christopher Wylie on stage at the high-powered Future of Storytelling Summit in New York in 2018, it has become even clearer that what he shared that day didn’t simply expose the disinformation campaign former Trump adviser Steve Bannon conjured up to help Donald Trump win the White House in 2016.

The prediction Wylie made in my interview with him at FoST — that the cultural and political effects of Bannon’s manipulations would continue to weaken and challenge our democracy itself for years to come — has proved prescient.

The 2017 “Unite the Right” rally in Charlottesville, where white supremacists supporting Trump chanted “Jews will not replace us” and then, later, “Blacks will not replace us,” was one of the first televised outcomes of Bannon’s campaign, which had already begun to take on a momentum of its own.

Bannon’s use of social media to sow psychological division, stoke anger and whip up anti-democratic sentiment across the United States was just getting started. Consider the racist violence that has ensued since: mass shootings by white supremacists targeting immigrants, Black and Jewish worshippers, schoolchildren of color, Black grocery shoppers, Black men and boys — most carried out with suffocating chokeholds or with military assault weapons purchased legally, without background checks or queries of intent, by American boys who had just turned 18.

Fast forward to the January 6, 2021, attack on the U.S. Capitol, and the statement Trump was pressed to make hours after the armed mob’s deadly assault began, in which he nonetheless declared his “love” for those who rioted on his behalf.

Wylie, who was Cambridge Analytica’s research director when Bannon started working there, was both a witness to and a partly unwitting co-creator of the digital tools Bannon would eventually use to wage a war on our democracy from within.

What follows is a short, edited transcript of my 90-minute 2018 interview with Wylie, shared again here in the wake of Bannon’s July 22, 2022, conviction for contempt of Congress, which stemmed from investigations of the January 6, 2021, attack on the U.S. Capitol.

MARCIA STEPANEK: In March 2018, Christopher, you blew the whistle on Cambridge Analytica, the political consultancy that worked for the Trump campaign. The company had illegally obtained the Facebook information of 87 million people and used it to build psychological profiles of voters to spread narratives on social media aiming to ignite a culture war, suppress Black voter turnout and exacerbate racist views held by some white voters. Trump has denied this. Do I have this description right?

CHRISTOPHER WYLIE: I’m not exactly sure it’s 87 million people, I don’t know an exact number, but that sounds about right.

(Audience laughter.)

STEPANEK: Okay. You left Cambridge Analytica in 2014, before the systems you designed to help root out terrorists began being used by Bannon to find white Americans who felt marginalized in our democracy. Bannon was, essentially, using these tools to identify people prone to racial and religious bias via social media to help Donald Trump get elected in 2016.

WYLIE: Yeah, my job had been, essentially, to help the U.S. and British militaries identify ISIS sympathizers as part of their intelligence campaigns to root out potential terrorists after 9/11.

STEPANEK: When and how did things then change?

WYLIE: I was Cambridge Analytica’s research director when Steve Bannon began working there, and I watched as Bannon and his group began using data drawn from a number of sources, including Facebook, to target people for disinformation campaigns.

STEPANEK: How did this work exactly?

WYLIE: They were finding people who felt left out of the mainstream and, through Facebook groups and meet-ups, convinced them to stop being quiet about how they felt neglected and ignored. It was, essentially, about getting marginalized people angry so that Bannon and his group could begin to manipulate them, in effect, as a certain segment of the American voter population.

(Bannon and his group) began using data from Facebook and other online sources to target people who were more prone to conspiratorial and racist thinking. They used that data, and social media more broadly, first to identify those people and then to engage them, and really began to craft what, in my view, was an insurgency in the United States.

STEPANEK: You say insurgency, meaning what?

WYLIE: When Steve Bannon got introduced to the company, he realized that a lot of that work to stop terrorism could be inverted — meaning that, rather than trying to mitigate an extremist insurgency in certain parts of the world, he wanted to essentially catalyze one in the United States.

…Originally, when I began working (at Cambridge Analytica) as a researcher, we were looking at this technology for defense purposes, to protect democracy. We wanted to figure out what were the psychological characteristics of those people who would make them more prone and more vulnerable to certain kinds of extremist messaging so we could engage them beforehand, to avoid civil unrest.

This was all based on research that came out of the University of Cambridge which showed how, particularly with Facebook data, you could quite accurately predict a person’s personality profile. If you could understand how a person thinks and feels and engages in the world, and what kinds of biases they have, then you can figure out what’s going to be most effective at engaging them in a particular objective — originally some kind of counter-extremism or mitigation strategy. These would be people who were more prone to conspiratorial thinking or paranoid ideation. Bannon and his group, effectively, were looking for the same kinds of people. But rather than discouraging them from joining ISIS, it now became their work to encourage these people to join the alt-right.

STEPANEK: How was social media used as an accelerant in this?

WYLIE: What actually happened was that Facebook authorized the applications that Cambridge Analytica ended up using to access the Facebook data. The company then engaged professors at Cambridge University to create an application that got put onto Facebook, where people would go and fill out personality inventories, like surveys, about who they are and their attributes. But the way the app worked was that it wouldn’t just harvest the data of the person who responded to the survey. It would go into their profile, look at all of their friends and harvest all of their friends’ data as well.

STEPANEK: So when one person filled out a survey, by default, they effectively consented by proxy for hundreds of other people—simply because they were Facebook friends with them, right? This must have scaled very quickly.

WYLIE: Yeah, [Facebook has] since turned it off and rightfully so, but at the time, you could acquire a lot of data really quickly.

STEPANEK: What do you remember most about that time that still keeps you up at night?

WYLIE: I remember seeing videos of people from focus groups and events that Cambridge Analytica was doing with people who had been targeted and who were sort of massaged online into believing certain kinds of conspiracies. And I remember how angry these people were, and how they started to engage in highly racialized thinking.

To see, like, the rage in their eyes and to see their faces and what that looks like, what a manipulated person starts to look like? For me, it was really eye-opening.

STEPANEK: So how much of this kind of disinformation method created by Bannon is still going on? And will it manifest in more political, racial, religious and economic dissension in the politics to come?

WYLIE: Cambridge Analytica no longer exists, but some of its former employees are currently set to be working on the next Trump campaign. The thing I started working there to do was to help the U.S. military capture potential terrorists after 9/11. But then, after Bannon got involved, I found myself no longer working on the defense of our democracies. I became a whistleblower because the technology, through the role the company took in getting Trump elected and Brexit to succeed in England, was basically being used in a new way: to destabilize democracy.

Basically, the technology I had been working on had been completely inverted to become, in my view, an attack on our democracies. And these techniques didn’t die with the company. They are still being used to sow division and fuel attacks on democracy itself by angry segments of the population who believe the misinformation they were fed years ago, misinformation that continues to be stoked by Trump.

STEPANEK: In your forthcoming book about all of this, Mindf*ck: Cambridge Analytica and the Plot to Break America, you will be detailing these techniques, but what made you quit? I mean, what did you consider the last straw before you left?

WYLIE: One of the reasons I’m writing the book is to serve as a warning, particularly to Americans. We have a completely unregulated digital landscape. There is still almost no oversight. We have been placing blind trust in companies like Facebook to do the honorable and decent thing. …

Even if Cambridge Analytica doesn’t exist anymore, what happens when China becomes the next Cambridge Analytica? What happens when another politician decides to use the same methods Bannon did?
