Everything You Need To Know About How Facebook Modifies Behavior and Why It Matters

In Just Under 7 Minutes

matt m
6 min read · Oct 11, 2019
Photo by Glen Carrie on Unsplash

“Companies earn their profits by exploiting their environment. Mining and oil companies exploit the physical environment; social media companies exploit the social environment. This is particularly nefarious because social media companies influence how people think and behave without them even being aware of it.”

~ George Soros

When users like you, me, and 2.3 billion others comment, post, update, like, share, click, and create, Facebook collects, stores, and algorithmically processes that into a product that makes billions.

Access to that product, the User Data, is sold to customers who wish to influence our future behavior in some way, be that political behavior, consumer behavior, sexual behavior, or something else.

Facebook isn’t the only one. In his book Ten Arguments For Deleting Your Social Media Accounts Right Now, Jaron Lanier refers to such companies as “BUMMER” companies — Behavior of Users Modified and Made into an Empire for Rent — the main ones being Facebook, Twitter, and Google.

Clumsy acronyms aside, each shares a business model of offering free services in exchange for the ability to modify user behavior, collect the results, package them up, and sell them to third-party customers.

So how does it work and why might it be a problem?

How Facebook Creates & Sells Its Products

Although it’s quite hard to penetrate the inner workings of Facebook itself, the researchers at ShareLab did a powerful series on how it might work, based on publicly available Facebook patents.

The information in the following three steps is based on that research.

#1: Collecting User Data

Facebook collects all behaviors and activities (liking, posting, updating, sharing pics, commenting, creating pages), all account and profile information (passwords, account status, family, friends, birthdays, etc.), and laptop and mobile phone info (time zones, device IDs, GPS location).

It’s collected through Facebook itself, through mobile phone permissions (permission to use GPS, the microphone, etc.), through Facebook-owned companies (WhatsApp, Instagram, etc.), through Facebook partners (companies with a special relationship to Facebook), and through cookies that track your movements around the web.
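To make the scope of that collection a bit more concrete, here is a minimal sketch in Python of what one normalized “behavioral signal” record might look like. The field names are my own illustrative guesses, not Facebook’s actual schema.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional, Tuple

    @dataclass
    class CollectedEvent:
        """One normalized behavioral signal. All field names are illustrative only."""
        user_id: str     # account identifier
        source: str      # "facebook", "instagram", "partner_site", "tracking_cookie"
        action: str      # "like", "post", "share", "page_view", ...
        target: str      # what the action was performed on
        device_id: str   # hardware/app identifier
        gps: Optional[Tuple[float, float]] = None   # (lat, lon) if location permission was granted
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    # Example: a page view captured on a third-party site via a tracking cookie
    event = CollectedEvent(
        user_id="u_123",
        source="tracking_cookie",
        action="page_view",
        target="news-site.example/article-42",
        device_id="d_987",
        gps=(40.7, -74.0),
    )
    print(event)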

Shoshana Zuboff, in her book The Age of Surveillance Capitalism, calls this the “extraction architecture”.

#2: Storing and Processing User Data

Once the data is collected, a Social Graph is created. This meta-structure visualizes how all the nodes and edges connect: how specific users, pictures, and pages connect with other specific users, pictures, and pages. Was it through setting up the page or liking it? Through being in the picture or being outraged by it?

In short, it’s a “global mapping of everybody and how they’re related.”

Social Graph: “global mapping of everybody and how they’re related.”
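As a toy illustration of what “nodes and edges” means here (my own sketch in Python, not anything taken from the patents), a social graph is just entities like users, pages, and photos joined by typed edges such as “liked,” “created,” or “tagged in”:

    from collections import defaultdict

    class SocialGraph:
        """Toy social graph: nodes connected by typed, directed edges."""

        def __init__(self):
            # node -> list of (edge_type, other_node)
            self.edges = defaultdict(list)

        def connect(self, source, edge_type, target):
            self.edges[source].append((edge_type, target))

        def neighbors(self, node):
            return self.edges[node]

    graph = SocialGraph()
    graph.connect("user:alice", "created", "page:cooking_club")
    graph.connect("user:bob", "liked", "page:cooking_club")
    graph.connect("user:bob", "tagged_in", "photo:1234")

    # "How is Bob related to everything else?"
    print(graph.neighbors("user:bob"))
    # [('liked', 'page:cooking_club'), ('tagged_in', 'photo:1234')]

The real thing maps billions of nodes and many more edges, but the principle is the same: every action adds or strengthens an edge.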

#3: Selling Targeted Ads

This mapping makes it easy to see User Interests based upon past behavior and match that with a suggested list of Ad Concepts to create ads targeted to specific users.

These ads can be broken up into three categories:

  1. Basic ads that are based upon location, age, gender, and language.
  2. Detailed ads that are based upon demographics, interests, behaviors, and likes.
  3. Connective ads that are based on connections to other Facebook pages, events, and apps.
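A hypothetical, heavily simplified version of that matching might look like the sketch below. The field names and criteria are mine, chosen only to show how the three categories progressively narrow an audience.

    def match_audience(users, basic=None, detailed=None, connections=None):
        """Return IDs of users who pass basic (location/age/gender/language),
        detailed (interests/behaviors), and connection (liked pages) criteria."""
        basic = basic or {}
        detailed = detailed or set()
        connections = connections or set()
        audience = []
        for user in users:
            if any(user.get(key) != value for key, value in basic.items()):
                continue  # fails basic targeting
            if not detailed <= set(user.get("interests", [])):
                continue  # fails detailed targeting
            if not connections <= set(user.get("liked_pages", [])):
                continue  # fails connection targeting
            audience.append(user["id"])
        return audience

    users = [
        {"id": "u1", "location": "Berlin", "language": "en",
         "interests": ["running", "cooking"], "liked_pages": ["cooking_club"]},
        {"id": "u2", "location": "Berlin", "language": "en",
         "interests": ["gaming"], "liked_pages": []},
    ]

    print(match_audience(users,
                         basic={"location": "Berlin", "language": "en"},
                         detailed={"cooking"},
                         connections={"cooking_club"}))
    # ['u1']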

How Facebook Improves Its Products

In order to increase the amount it can charge for “Access to User Data,” Facebook continually improves each of these three steps.

It improves collection by modifying user behavior so people choose to stay on Facebook and Facebook-owned sites and engage with the design more (click more, post more, share more, etc.).

It improves storing and processing by creating algorithms that are better able to find patterns in data to develop two things:

  • more accurate representations of how users behaved in the past
  • more accurate predictions of how users might behave in the future.

It improves targeted ads by nudging people toward being more algorithmically predictable. If Facebook can increase the likelihood of you going on a diet from 80% to 90%, then they can charge that much more.
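To make that “more predictable means more valuable” point concrete, here is a toy illustration, entirely my own, with made-up features and weights rather than anything Facebook publishes: a tiny logistic model that scores how likely a user is to act on a diet-related ad given a few engagement signals.

    import math

    def predict_probability(features, weights, bias):
        """Toy logistic model: P(action) = sigmoid(w . x + b)."""
        score = bias + sum(weights[name] * value for name, value in features.items())
        return 1.0 / (1.0 + math.exp(-score))

    # Made-up engagement signals and weights, for illustration only
    weights = {"diet_posts_liked": 0.8,
               "fitness_pages_followed": 0.5,
               "late_night_scrolling_hours": 0.3}
    user = {"diet_posts_liked": 3,
            "fitness_pages_followed": 2,
            "late_night_scrolling_hours": 1.5}

    print(round(predict_probability(user, weights, bias=-2.0), 2))  # ~0.86

The more data is collected, and the more behavior is nudged toward the pattern the model expects, the closer that estimate gets to certainty, and the more an ad aimed at it is worth.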

How Facebook Modifies Behavior Through Persuasive Technology

Professor B.J. Fogg at Stanford University coined the term “captology” way back in 1996. It stands for “Computers as Persuasive Technologies.” The basic ideas were laid out in his book Persuasive Technology: Using Computers To Change What We Think and Do.

Since then, his influence has grown considerably.

In his new book Zucked: Waking Up to the Facebook Catastrophe, Roger McNamee describes Fogg’s influence as follows:

“His insight was that computing devices allow programmers to combine psychology and persuasion concepts from the early twentieth century, like propaganda, with techniques from slot machines, like variable rewards, and tie them to the human social need for approval and validation in ways that few users can resist.”

(A brief note: if you don’t have time to read McNamee’s book, I suggest checking out Sam Harris’s podcast or Douglas Rushkoff’s podcast with McNamee.)

Tristan Harris, the founder of the Center for Humane Technology and a former student of Fogg’s, wrote a great piece on Medium about 10 ways that technology hijacks your mind.

Which is to say, how persuasive technology design principles modify behavior to get users to spend more time doing more things on site.

What follows is based on the two sources of techniques in McNamee’s quote (psychology and slot machines), plus some of the hijacks that Harris outlines.

#1: Advice from Psychology

  • Control the Menu and Choices: (“You can have any color car you want, so long as it’s black”). Choice Architecture.
  • Social Approval: Photo Tagging. The Like Button. Notifications.
  • Social Reciprocity: I tag/friend you, you tag/friend me so that the total network grows.
  • Instant Interruption vs. Respectful Delivery: Facebook Messenger shows when a user has ‘seen’ a message, which sets social reciprocity pressures into motion.
  • Inconvenient Choices: Some services and apps are deliberately difficult to cancel.
Photo by Benoit Dare on Unsplash

#2: Advice from Slot Machines

  • Intermittent Variable Rewards: as with slot machines, “unpredictable, variable rewards stimulate behavior addiction” because they tap into the part of us that loves to figure out patterns. We can’t figure out the pattern yet, but we will keep playing until we do (a toy sketch of such a reward schedule follows this list).
  • FOMO: Fear of Missing Out on something important keeps you coming back.
  • Auto-play and Endless Feeds: the inability to scroll to the bottom of a feed “eliminates cues to stop.” (A brief idea: why is it called a “News Feed” and not a “News Digest”? The name itself implies compulsive eating.)
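To see what an intermittent variable reward schedule looks like in code (a deliberately crude toy of my own, not any real notification system), here is a loop where each app-open sometimes pays out and sometimes doesn’t, with a payoff of unpredictable size:

    import random

    def open_app(reward_chance=0.3):
        """One 'pull of the lever': sometimes a reward arrives, sometimes nothing,
        and the size of the reward is unpredictable (a variable-ratio schedule)."""
        if random.random() < reward_chance:
            return random.choice(["1 new like", "3 new likes",
                                  "a new comment", "you were tagged in a photo"])
        return None

    for pull in range(10):
        reward = open_app()
        print(f"check #{pull + 1}: {reward or 'nothing new'}")

Most checks pay nothing; occasionally one pays a lot. That unpredictable pattern is exactly what keeps people pulling the lever.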

Behavior modification isn’t necessarily a problem in and of itself. The problems arise when behavior modification through persuasive technology is combined with the economic imperatives that Facebook and other BUMMER companies operate under.

As Jaron Lanier writes:

“The problem is relentless, robotic, ultimately meaningless behavior modification in the service of unseen manipulators and uncaring algorithms.”

Ok, Our Behavior Is Modified. So What?

The behavior modification described above is sort of amoral. It simply makes sense for a company that runs on an advertising business model. However, precisely because of this, and regardless of the personal intentions of employees, it produces a tremendous amount of collateral damage and side effects.

This includes the problem that many of the algorithms seem to be optimized for outrage, which makes them more likely to turn users into jerks than into pro-social friends (we see this everywhere online). Then there’s the problem of fake news, and of fake social pressure created by bots swaying real public opinion.

But for now, I’d like to focus on the problem of the future: what kind of space it occupies in our lives, and what behavior modification might do to it.

Media theorist Douglas Rushkoff is far more articulate on this point, so I’ll leave it with him. After talking about how algorithms can predict with about 80% accuracy how likely we are to go on a diet, purchase something, or consider a change in sexual orientation, he writes:

“We are using algorithms to eliminate that 20 percent: the anomalous behaviors that keep people unpredictable, weird, and diverse. And the less variation among us — the less variety of strategies and tactics — the less resilient and sustainable we are as a species. Survivability aside, we’re also less interesting, less colorful, and less human. Our irregular edges are being filed off.”

Until next time, keep those edges from being filed off.
