Photo credit: MattysFlicks via Foter.com / CC BY

Project Brainwash

Facebook, free will, and the alarming new advertising frontier

Kimberly K.
Published in Bullshit.IST
7 min read · Feb 27, 2017


Think about the decisions you made in the last week. How many of them were based entirely upon your own free will? The last soda you drank, burger you ate, or laundry detergent you bought — was it you who made that decision? Was it influenced by any outside forces like advertising or the media? You might think the rationale was entirely your own, but the truth is trickier than that.

Humans like to believe that they form opinions and make choices of their own free will. They also like to believe they’re impervious to outside forces like advertising and media campaigns. However, studies show that people are actually far more suggestible than they’d like to think, especially when advertising appeals to their emotions or preconceived biases.

In advertising we trust

Suggestibility begins with trust. In 2013, consumer research company Nielsen reported that consumer trust in advertising was growing, rather than shrinking. Trust ranged anywhere from 42% for online banner ads to 84% for personal recommendations. And nearly half (48%) of all people trusted search engine advertisements, online videos, and social media ads. If you trust an advertisement, you’re more likely to believe it’s true — that the source is credible. You’re also more likely to be convinced.

But even if someone doesn’t trust advertisements, they’re still influenced by them. Although only 42% of people trust online banner ads, a Harvard study revealed that these still have a “significant impact on search queries, clicks on searches and thus purchases”. Even more interestingly, this effect wasn’t immediate — it took two to four weeks for advertising influence to be noticeable in consumer behaviors. Although the study doesn’t explain this time lag, it’s interesting to consider why this happens. Two to four weeks is long enough to forget seeing an advertisement, but not so long that its influence has faded. Individuals exposed to online advertising may think they’re making an independent choice, but their behaviors have still been altered.

If we’re influenced by even the most trivial online banner ads without necessarily being aware, what happens when advertising knows you better than you know yourself? Can it change your behavior? What about your beliefs?

Profiling the populace

Yesterday, The Observer published an in-depth investigative piece on Robert Mercer, a secretive billionaire computer scientist who funds a variety of political causes. He’s poured over $10M into Breitbart, empowering it to become the platform of the alt-right, and over $45M into other pet political causes, including an organization specifically aimed at painting the mainstream media as untrustworthy.

But, more importantly, he has a financial stake in Cambridge Analytica, a small data analytics company that uses artificial intelligence and psychological profiles to target consumers with extremely subversive advertising.

Cambridge Analytica has “psychological profiles based on 5,000 separate pieces of data on 220 million American voters.” Instead of profiling voters in large swathes by gender or race, they can drill down to the most granular level: the individual. American voters were in Cambridge Analytica’s sights during the Trump campaign, which poured over $150M into Facebook and social media advertising. And Cambridge Analytica doesn’t stop at US politics: many believe it was pivotal in swaying public opinion on Brexit as part of the “Leave” campaign, shorthand for the faction that wished to exit the EU.

“At Cambridge, we were able to form a model to predict the personality of every single adult in the United States of America.” — Alexander Nix, Cambridge Analytica CEO

It’s not the first time Cambridge Analytica has been in the news. In January 2017, the article “The Data That Turned the World Upside Down” took the internet by storm. It was the first time such media and advertising campaigns were so extensively investigated, and for many readers it was a shocking revelation just how much simple Facebook “likes” could reveal:

In 2012, [Cambridge University researcher] Kosinski proved that on the basis of an average of 68 Facebook “likes” by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). But it didn’t stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined. From the data it was even possible to deduce whether someone’s parents were divorced.

Between artificial intelligence, Facebook “likes”, social networks, and internet-wide tracking, Cambridge Analytica’s models “never stop learning and never stop monitoring”. With thousands of data points and hundreds of millions of user profiles, they’re able to scientifically determine what advertisements are most likely to convince you.

And, perhaps most alarmingly, a computer model that once knew nothing about you suddenly knows, well, practically everything.

Big data brainwashing

But how is this possible? Even with all this information, how can an advertisement “know” how to convince you? By combining artificial intelligence with Facebook likes, it’s possible to achieve scarily high accuracy in understanding what makes people tick:

Before long, [Kosinski] was able to evaluate a person better than the average work colleague, merely on the basis of ten Facebook “likes.” Seventy “likes” were enough to outdo what a person’s friends knew, 150 what their parents knew, and 300 “likes” what their partner knew. More “likes” could even surpass what a person thought they knew about themselves.
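To make the mechanics a little less mysterious, here’s a deliberately tiny sketch of how “likes” can become trait predictions. Real models like Kosinski’s learn weights from millions of labeled profiles; this toy version just scores a user against hand-written per-page weights. Every page name and weight below is invented for illustration:

```python
# Toy illustration: predicting a personality trait from Facebook-style "likes".
# In a real system these weights would be learned from millions of labeled
# profiles; here they are invented purely for illustration.

# Hypothetical per-page weights for one trait (say, "openness"):
# a positive weight means liking the page correlates with the trait.
TRAIT_WEIGHTS = {
    "Philosophy Memes": 0.8,
    "Modern Art Daily": 0.6,
    "Travel Hacks": 0.4,
    "Monster Trucks": -0.5,
    "Golf Digest": -0.3,
}

def trait_score(likes):
    """Sum the weights of pages the user liked; unknown pages count as 0."""
    return sum(TRAIT_WEIGHTS.get(page, 0.0) for page in likes)

def predict_high_trait(likes, threshold=0.5):
    """Binary prediction: does the user's score clear the threshold?"""
    return trait_score(likes) > threshold

user_a = ["Philosophy Memes", "Modern Art Daily", "Cat Pictures"]
user_b = ["Monster Trucks", "Golf Digest"]

print(predict_high_trait(user_a))  # True  -- likes point toward the trait
print(predict_high_trait(user_b))  # False -- likes point away from it
```

In practice a model holds learned weights for thousands of pages across dozens of traits, which is why a few hundred likes can outscore even a partner’s intuition.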

Not only this, but such models track a person’s behavior around the internet and learn everything they need to know about personality and habits. If you’ve ever Googled only to later see relevant ads appear on Facebook, imagine something even more sophisticated and you’ll have grasped a tiny piece of the puzzle.

Nielsen’s research from 2013 already suggests that nearly 1 in 2 people trust social media ads. Combine this implicit trust with appeals to political beliefs, emotional hot-button issues, or preconceived biases, and suddenly an advertiser has found a way to gain credibility, trigger a response, and shift your psyche. On Facebook you’re not only a captive audience, but an impressionable, trusting, data-rich one.

Once you’ve been profiled, their model will tailor and test advertisements on you and your social network until it finds exactly the right one to make you comment, “like”, or click. Worse still, “knowing” your friends allows the advertising to spread throughout your social network like a propaganda virus.

Testing different advertisements or even user interfaces is nothing new. In the tech industry, particularly among startups, A/B testing — trialing different versions of the same experience — is a common practice. Multiple versions of the same UI are shipped and data is collected to determine which is most successful: how many people clicked on the button in A vs. B? Which design earned the most money? It’s Darwinistic design — the most successful user interface is the one that survives.
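For readers outside tech, an A/B test really is this simple at its core. The sketch below (click rates invented to simulate user behavior) randomly assigns each visitor to variant A or B, records clicks, and compares click-through rates:

```python
import random

# Minimal A/B test sketch: show two ad/UI variants at random,
# record clicks, and compare click-through rates (CTR).
# The "true" click probabilities are invented and hidden from
# the experimenter -- the test's job is to discover the winner.
random.seed(42)

TRUE_CLICK_RATE = {"A": 0.05, "B": 0.08}

results = {"A": {"shown": 0, "clicks": 0}, "B": {"shown": 0, "clicks": 0}}

for _ in range(10_000):
    variant = random.choice(["A", "B"])          # random assignment
    results[variant]["shown"] += 1
    if random.random() < TRUE_CLICK_RATE[variant]:
        results[variant]["clicks"] += 1

for v, r in results.items():
    print(v, "CTR:", round(r["clicks"] / r["shown"], 4))

winner = max(results, key=lambda v: results[v]["clicks"] / results[v]["shown"])
print("winner:", winner)  # with this much traffic, almost surely "B"
```

Ten thousand impressions is enough for the measured rates to separate cleanly; the “Darwinistic design” in the paragraph above is just this loop run over and over, shipping the survivor each time.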

And the same can be done for advertising. After determining your psychological profile, different advertisements are tested against you until you respond. Your profile is updated accordingly. The model is updated. More advertisements are tested. The cycle continues.

“We have profiled the personality of every adult in the United States of America — 220 million people.” — Alexander Nix, Cambridge Analytica CEO

The advertiser may even be able to observe when you’ve seen an advertisement but haven’t engaged with it — when you’re just an “impression”. This tells them what isn’t working. Once you’ve reacted (or not), that information goes into the machine learning model, which incrementally improves. Now it can better tailor its advertisements to you, people like you, and people who are different from you, too.
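The test-update-repeat cycle described above is, at heart, what the machine-learning literature calls a multi-armed bandit. Here’s a hedged sketch (ad names and response rates invented, and far simpler than anything Cambridge Analytica would run): an epsilon-greedy loop that mostly shows the best-performing ad so far, occasionally explores the others, and counts every unclicked impression against an ad:

```python
import random

random.seed(0)

# Hypothetical ads with hidden "true" click probabilities.
TRUE_RATE = {"fear_ad": 0.02, "hope_ad": 0.06, "anger_ad": 0.10}

shown = {ad: 0 for ad in TRUE_RATE}    # impressions, clicked or not
clicks = {ad: 0 for ad in TRUE_RATE}

def ctr(ad):
    """Observed click-through rate; 0 until the ad has been shown."""
    return clicks[ad] / shown[ad] if shown[ad] else 0.0

EPSILON = 0.1  # 10% of the time, explore a random ad

for _ in range(20_000):
    if random.random() < EPSILON:
        ad = random.choice(list(TRUE_RATE))   # explore an alternative
    else:
        ad = max(TRUE_RATE, key=ctr)          # exploit the best so far
    shown[ad] += 1                            # an "impression" either way
    if random.random() < TRUE_RATE[ad]:
        clicks[ad] += 1                       # engagement updates the model

best = max(TRUE_RATE, key=ctr)
print("converged on:", best)
```

Note that simply being shown an ad updates the model even if you never click; your silence is data too, which is exactly the “impression” signal described above.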

But does personality targeting actually work? Kosinski has new data suggesting so:

[Kosinski] has conducted a series of tests, which will soon be published. The initial results are alarming: The study shows the effectiveness of personality targeting by showing that marketers can attract up to 63 percent more clicks and up to 1,400 more conversions in real-life advertising campaigns on Facebook when matching products and marketing messages to consumers’ personality characteristics.

Professor Jonathan Rust, also from the Psychometric Centre, elaborated on why this is so alarming:

“The danger of not having regulation around the sort of data you can get from Facebook and elsewhere is clear. With this, a computer can actually do psychology, it can predict and potentially control human behaviour. It’s what the scientologists try to do but much more powerful. It’s how you brainwash someone. It’s incredibly dangerous.

It’s no exaggeration to say that minds can be changed. Behaviour can be predicted and controlled. I find it incredibly scary. I really do. Because nobody has really followed through on the possible consequences of all this. People don’t know it’s happening to them. Their attitudes are being changed behind their backs.”

When the computer scientists working on artificial intelligence are describing their own models as “incredibly scary”, that’s when klaxons should begin sounding. We’ve entered a dark and dangerous new age: one where advertising agencies are not simply throwing up dumb, flashy banner advertisements promising “one weird trick to eliminate belly fat”, but where they know us better than we know ourselves.

Their goals? Convincing us — or at least some of us — that the mainstream media is always untrustworthy; telling us not to bother voting or who to vote for instead; insisting we leave the EU, even when we don’t actually know what that means; and whatever other goals or ideologies they wish to push. And they’re playing the long game — shifting public opinion takes time.

“Pretty much every message that Trump put out was data-driven.” — Alexander Nix, Cambridge Analytica CEO

In this new age, advertising isn’t just trying to sell products — it’s selling ideas and ideologies. Moreover, these shadow-puppeteers’ goals aren’t fleeting — they’re trying to fundamentally change us: politically, psychologically, and morally. They’re trying to reach inside our minds and permanently shift our values to align with their goals, and they’re doing it in a way that’s more compelling, subversive, and pervasive than ever before. If that isn’t sinister, I don’t know what is. But perhaps the most pernicious part of all is that we still think we’re the ones in control, even when we aren’t.

Kimberly is a writer, photographer, and former technologist. She’s currently questioning her Facebook “likes”, aka contributions to a personality profile within the shadowy AI advertising machine. The rest of the time you’ll find her traveling or eating.

Leave a tip ❤ | FB | Instagram | Twitter | Plethora-Etc.com

Lead Content Strategist @ ZEN / Technologist & Program Manager / VRARA Blockchain Co-Chair / Formerly @ Microsoft