Kicking Facebook, Part 3

How Facebook drives engagement, captures attention, and collects data, and what they do with it all.

Scott Matter
11 min read · Jan 13, 2018

Facebook — and, to be fair, lots of other companies — uses a number of techniques to trick you into behaviours that are good for them, and neutral or negative for you. (If you’re looking for an excellent, in-depth article about this, featuring former tech insiders who have become “Silicon Valley heretics” in refusing to even use the products they helped to build, this piece by Paul Lewis is solid gold.)

Variable reward — why the news feed is addictive

One of the most common techniques relies on a principle called “variable reward.” It’s the cornerstone of principles taught by Nir Eyal, a prominent “behavioural designer,” entrepreneur, and angel investor. Here’s how he explains the value of using variable reward in digital design:

“As B.F. Skinner discovered over 50 years ago, variable rewards are a powerful inducement to creating compulsions. Today, technology companies are creating new habits by continuously cycling users through the Hook Model — and variable rewards fuel the chain reaction. Understanding what moves us to action allows us to build products that are aligned with users’ interests and gain greater control of our own technology-induced behaviors.”

— Nir Eyal, https://www.nirandfar.com/2012/03/want-to-hook-your-users-drive-them-crazy.html

In Facebook, the most obvious example of this is in the news feed on your phone, which is designed much like a slot machine (also designed to be addictive). You can’t see what’s coming next, but you think it might be something really good, because you’ve seen something really good in the past. So you pull down to refresh or flick your thumb to scroll and wait (sometimes for an eternity of seconds) for the next dopamine hit.
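The slot-machine comparison is literal: a feed that pays off on an unpredictable fraction of refreshes is what behaviourists call a variable-ratio reinforcement schedule. Here's a toy simulation of the idea (the 1-in-5 payoff probability is an arbitrary assumption, not a measured figure):

```python
import random

def pulls_until_reward(p_reward):
    """Refresh the feed until a 'good' post appears.
    Each pull pays off independently with probability p_reward,
    a variable-ratio schedule, like a slot machine."""
    pulls = 1
    while random.random() >= p_reward:
        pulls += 1
    return pulls

random.seed(42)  # fixed seed so the sketch is reproducible
# With a 1-in-5 chance per refresh, the gap between rewards is unpredictable
samples = [pulls_until_reward(0.2) for _ in range(10)]
print(samples)  # the irregular spacing of payoffs is what makes it compulsive
```

Unlike a fixed schedule (a reward every Nth pull), there's no point at which you can tell yourself "nothing good is coming" and stop.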

Notifications work in a similar way. Every time you open that app, you can bet there’s a little red badge with a number in it, telling you there’s something new to look at. You keep coming back, looking for some new stuff. You keep scrolling, and scrolling, and scrolling, and scrolling until you’ve missed your bus stop and have to backtrack a few kilometers to get home. (Full disclosure, this happened to me a few weeks ago!)

This is not your choice. You are being manipulated, by design, into scrolling through the news feed, whether it’s good for you or not.

Why? The simple answer is that the more you scroll, the more ads you will be shown. I did a quick experiment just now. I counted the first 100 things I saw in the news feed to see how many were posts from real people or organisations I follow, and how many were ads. The result: 86 real posts, 14 ads. That means 1 in 7 things in my feed was paid for by someone trying to reach me — or people like me. Imagine if every seventh interaction you had with someone in real life was interrupted by an ad.
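The arithmetic behind that "1 in 7", using the counts from the experiment above:

```python
real_posts = 86
ads = 14
total = real_posts + ads  # 100 feed items counted

ad_share = ads / total        # 0.14, i.e. 14% of items were ads
items_per_ad = total / ads    # about 7.1, so roughly 1 ad in every 7 items
print(f"{ad_share:.0%} ads, about 1 in {items_per_ad:.0f} items")
```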

Social validation — why sharing and reacting are addictive

Facebook has also honed its ability to prey on our narcissism and need for social validation. As a social species, we have an inherent need to be liked and accepted as members of a group. This works through a feedback mechanism — we do or say something, other people react to it, we interpret the reactions to learn if what we’ve said is acceptable and makes us look good in the eyes of others. It generally happens at a subconscious level, and it leads to all kinds of things like compliance, conformity, and internalization.

Source: “Alias Noa”, Wikimedia Commons

What Facebook has done is made it incredibly easy to communicate behaviours and responses over time and distance. In the process, Facebook has made itself the broker of social validation, which draws more people into sharing more of their lives on the platform. The mechanism here is the like button and reactions, and it adds social validation to the principle of variable reward.

Think of a time — any time — you’ve posted something on Facebook. Maybe it was a photo of your new haircut, maybe it was an article you read and agreed with. You’ll probably recall the slight, subtle feeling of anxiety, the rush of anticipation, and the wave of relief as the first reactions came in. Maybe that was followed by a hint of disappointment when a few minutes or hours went by and not as many people reacted as you expected (and hoped).

When you post something on Facebook you don’t know whether or how many responses you’re going to get — variable reward is in play here too. But you do know, even if it’s subconscious, that people’s reactions (or lack of reactions) will tell you whether what you’re doing is worthwhile or whether you’re wasting your time in life, whether people like you as a human being or whether no one cares about you so you might as well just … . It works in the same way for Instagram, as Adam Alter explains in this video.

The uncertainty and social validation keep us coming back to Facebook to post and react to more and more content. It feels good, we crave it, it’s instantaneous and easy. But it’s junk. We get the dopamine hit, but we don’t get the satisfaction signals that help us with impulse control and allow us to moderate our own behaviour. Designing digital solutions for that problem is not in the interests of Facebook, because it would limit your engagement and their supply of attention to sell.

Encouraged/forced connection — building networks to collect your data

In addition to manipulative design that uses psychological vulnerabilities to keep us hooked, Facebook uses the power of technology to pull us deeper into its ever-growing web. Over the years, the company has either built new services or bought other companies that have built them, to get more complete access to our digital lives.

One example of this is sharing and uploading contacts between digital accounts. In the early days, you used your email address to create and verify a Facebook account, and it asked you to share your contact list. The helpful service would then find people you knew on Facebook and suggest you connect with them or send them invitations to join if they weren’t already there.

The Facebook app seamlessly integrates with the contacts app on your iPhone, if you’ve given it permission. If you haven’t given it permission, every time you add a new contact to your phone you’ll be asked to change your settings and allow contact sharing.

Every. Single. Time. I add a contact to my phone.

The design trick here is to make integration the default setting, and to pester users to restore the function if they turn it off. It’s a piece of software, it will never get tired of making the same request — but you might wear down and give in.

For Facebook users, contact integration is a handy feature — it’s really easy to find people you know on Facebook if Facebook already knows who you know in real life, and obviously you’ll want to be Facebook friends with anyone in your phone contacts list, because, well, it’s Facebook and practically everyone in the world is there.

The benefit here for Facebook is that it grows its user base and likely increases the number of connections in its network. The more people Facebook can connect you with, the more opportunities they’ll have to keep you interested and coming back to Facebook. It also gives Facebook ways to integrate with other datasets out there in the world, in order to identify and track you around the web and in the real world.

Massive databases of people’s identities and contact information are just one of the benefits of Facebook building other apps like Messenger, buying Instagram and WhatsApp, and integrating with Twitter. It also works through “social logins” — using your Facebook account to create and log in to other accounts around the internet. When you integrate apps and share contacts, you are implicitly giving Facebook (and other companies) permission to watch you do the things you do online. This is ethically dubious, at best, and a massive violation of your privacy and autonomy at worst.

Where in the world is… location tracking

One of the wonders of modern technology is that with a simple, ubiquitous handheld device, we can navigate the world and broadcast our location to anyone. Facebook uses location technology to make suggestions about what to do where you are, and it lets us “check in” to show off the great things we’re doing with our time. You get the reward of some social validation when people react with approval and envy, and maybe even a discount on your drinks or meal from the owner of the business where you’ve checked in.

Unsolicited, automated, location-based information sharing.

On a recent trip to Tasmania, I noticed that Facebook helpfully offered to show me where my friends had checked in on their trips to Tasmania. All I had done was to be tagged by my partner in her check-in at a cafe in Hobart, and Facebook knew where I was and what information to show me. While this is fun and maybe even useful, it’s also completely creepy, not least because of what else Facebook is doing with all the data they take from us.

What is Facebook doing with all the data and technology?

Facebook has been incredibly successful at hooking a huge proportion of the human population and capturing unprecedented amounts of our attention. This has made it one of the biggest, most successful companies of all time, with absurd revenues derived almost completely from advertising sales (nearly $30 billion in 2016 and growing).

Looking at recent numbers, in the USA and Canada, Facebook brings in nearly $20 per user per quarter by selling advertisers our attention and using our personal data to do it.

The value Facebook provides to advertisers is not just the number of people it can reach. The real value is in the ability to use the incredible amount of data the company collects on every one of its users to carefully target ads to the most receptive audiences.

This has some benefits for everyone — it’s basically a waste of everyone’s time and money to show ads for pregnancy tests to people who can’t get pregnant, so it works for everyone if Facebook can make sure that only women likely to be considering or trying to conceive are shown ads for home pregnancy tests. If you’re a woman of a certain age, you may have noticed pregnancy test ads following you around the internet, in part because Facebook (and Google) tracks your behaviours and serves you ads all over the place.
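As a rough illustration of how attribute-based targeting narrows an audience, here's a hypothetical sketch; the profile fields, rule structure, and data are all invented, not Facebook's actual ad schema:

```python
# Hypothetical user profiles (field names and values invented for illustration)
users = [
    {"id": 1, "age": 29, "gender": "female", "interests": {"parenting", "yoga"}},
    {"id": 2, "age": 61, "gender": "male", "interests": {"golf"}},
    {"id": 3, "age": 34, "gender": "female", "interests": {"running"}},
]

def matches(user, rule):
    """Return True if a profile satisfies every clause of a targeting rule."""
    return (rule["min_age"] <= user["age"] <= rule["max_age"]
            and user["gender"] == rule["gender"]
            and bool(user["interests"] & rule["any_interest"]))

# A made-up rule in the spirit of the pregnancy-test example
pregnancy_test_rule = {
    "min_age": 20, "max_age": 45,
    "gender": "female",
    "any_interest": {"parenting", "fertility"},
}

audience = [u["id"] for u in users if matches(u, pregnancy_test_rule)]
print(audience)  # [1] — only the profile satisfying every clause is targeted
```

The same machinery that narrows an audience to "likely to conceive" can just as easily narrow it along any other attribute, which is what makes the discriminatory-targeting cases below possible.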

This power can also be used for real harm. As recently as November 2017, Facebook continued to allow advertisers to purchase targeted ads that exclude ethnic or racial groups. ProPublica ran an experiment and investigation in which they were able to break US Federal anti-discrimination laws by targeting housing ads to exclude “groups like African-Americans, Jews, and Spanish speakers.” Not only is this illegal, but Facebook was made aware of the problem and hadn’t done anything to change it after a full year.

Similarly, “bad actors” can pay to use Facebook’s data and targeting services to carefully distribute practically any kind of information they want to pretty much any group of Facebook users they want.

This is the power and the technology behind the “election meddling” that likely contributed to Donald Trump becoming President of the United States, to British voters choosing to leave the European Union, and which is likely to be used again in the future to disrupt and influence democratic processes wherever it is convenient for the people who want to do it. This is all laid out in an excellent, very long story by Roger McNamee, an early Facebook investor and former mentor to Mark Zuckerberg.

Facebook offers more than just targeted advertising. Combining big data, analytics, and location services, it also offers advertisers the ability to measure how many people visited their stores or bought something from them after seeing an ad for their company’s products. This works both online, by tracking people’s behaviour in and outside of Facebook, and in real life.

Facebook offers advertisers tools to tailor advertising to people with specific characteristics and to direct them to the closest brick-and-mortar store to buy the product shown in the ad, which they hope will increase the number of people who buy something after being prompted by an ad.

If that’s not enough, Facebook also uses location services to measure how many people actually visit a store after seeing an ad, to prove that the ads are effective and make them worth more money to advertisers.

Not enough for you yet? How about the “offline conversions API” that “allows businesses to match transaction data from their customer database or point-of-sale system to adverts reporting, helping them better understand the effectiveness of their adverts in real time.”

Yes, that means that Facebook will use the data it has about you, including your physical location at specific times, and match it with a company’s records of what was bought, when, and by whom, to tell the company what return its advertising has generated.
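Facebook hasn't published how that matching works internally, but offline conversion matching of this kind is typically done by comparing hashed contact details, so that raw identifiers are never exchanged. A minimal sketch of the idea, with all records and addresses invented for illustration:

```python
import hashlib

def normalize_and_hash(email):
    """Hash a normalized identifier so two parties can match records
    without exchanging raw contact details."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical advertiser point-of-sale records
pos_records = [
    {"email": "Alice@example.com ", "amount": 42.00},
    {"email": "bob@example.com", "amount": 15.50},
]

# Hypothetical set of hashed identifiers of users who were shown the ad
ad_exposures = {normalize_and_hash("alice@example.com")}

# Revenue from purchases that match an ad exposure
matched_revenue = sum(
    r["amount"] for r in pos_records
    if normalize_and_hash(r["email"]) in ad_exposures
)
print(matched_revenue)  # 42.0 — the sale attributed to the ad
```

Hashing makes the exchange look privacy-preserving, but the end result is the same: your purchase in a physical store is linked back to an ad you were shown online.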

And the future looks to be getting creepier. Late last year, reports surfaced that Facebook had filed a patent for facial recognition technology that may allow in-store sales people to access data about the individuals walking through their doors, and “gauge customers’ emotions and brand choice by leveraging their Facebook profiles through crowd-scanning technology.” All this, ostensibly, in order to provide a hyper-personalised experience.

We all have different tolerances for creepiness, and some people may be more comfortable than I am with advertising, privacy violations, personal security risks, and the prospect of fictitiously familiar salespeople knowing things about me before I’ve opened my mouth. But this is a dark future we’re heading towards.

This is dystopia. Facebook’s dopamine engine is pleasuring us into total surveillance and shattering our autonomy and self-control.

I’m not suggesting that we’re heading for a world where Facebook is the Matrix or Skynet (well, not yet). But we are already living in a world where Facebook and other massive tech companies have insinuated themselves into the fabric of our society in countless ways. Not only do they manage the flows of information between people in minute detail, they are extracting the data from those interactions to insert and target advertising throughout our lives, and to shape our behaviours without our consent or knowledge.

Imagine a world where your telephone conversation about what movie to see with your friends is interrupted by telemarketers with a sales pitch for the latest summer blockbuster. Or a world where your dinner is paused on the fork between plate and mouth to serve you an ad for a dessert you don’t have in the house at the moment but can buy at the supermarket down the road. That’s where we’re heading, because Facebook is well on its way to becoming the internet itself, capturing as many moments of our attention as possible to turn into revenue from advertising sales.

The more time you spend and the more data you let Facebook take from you, the less control you will have over your decisions about what to buy, what to do, who to spend your time with, and who to vote for. It needs to stop.

This is part 3 of a 4-part series.

Read Part 4: What can we do? Quit Facebook, reclaim your autonomy, and rebuild the social fabric.

Read the whole shebang, in one, nearly 5,000 word essay: Kicking Facebook


Scott Matter

Anthropologist (PhD, McGill 2011), strategic + service designer, small axe. Fascinated by complexity, collaboration, and change.