When I met Mark Zuckerberg back in 2008, Facebook was only four years old and far from the behemoth it would become in the years that followed. Mark was visiting Berlin to scope out options for entering the German market, then dominated by a local player by the name of studiVZ.
Two years earlier, Facebook had launched its News Feed feature to an outcry over privacy concerns, dubbed the “most inglorious launch moment in history” by its chief of product Chris Cox. Users hated the new feature, which combined all updates from their friends into a single stream instead of making them visit each friend’s Facebook page individually to stay in the loop. Yet it would become the de facto standard of every social platform that came after it and fundamentally change the way we use the internet.
One year after that, Facebook launched its Beacon feature to a similarly disastrous user response. Beacon enabled partner sites to share data on user activities, such as purchases on Amazon, with Facebook and to publish this information on users’ profiles. Users hated it. While not officially discontinued at the time we met in 2008, it was obvious that Beacon was dead in the water while News Feed had turned into a runaway success, even though both product launches had been met with a near-identical response from users.
Had Facebook listened to what its users were saying, it would have pulled the plug on both News Feed and Beacon. I wanted to know from Mark how he made the decision to plough through the user resentment over News Feed while discontinuing Beacon.
Actions speak louder than words
His answer is one of the key lessons every innovator should take to heart: “Don’t just listen to what users say they do, but look at the data of what they are actually doing.” It’s something most first-time founders get wrong, and it prevents them from truly innovating and scaling, because users’ imagination is limited by what they already know. To come up with anything original, you have to closely observe users’ actions (instead of their words). Then you have to use your imagination to come up with something new that might help users in ways they would never have thought of before. To do that, you have to be prepared to be wrong. This observation turned into Facebook’s key mantra: “Move fast and break things!” — and measure and fine-tune the outcome afterwards.
“You also have to unlearn the habit of listening to everything your users tell you because that will drive you crazy and can destroy your business. […] you have to be selective in the user feedback that you take into account and incorporate.” Tim Ferriss in 10 Commandments for Startup Success
By not listening to what users were saying and looking at the numbers instead, Facebook had identified that News Feed made engagement go through the roof, while Beacon didn’t move the needle at all.
Unintended consequences of engagement
This focus on engagement, moving fast and breaking things all the time and adjusting course after the fact, has made Facebook the breakout success it is today. If the last two years are any indication, though, this single-minded focus may be turning into Facebook’s downfall as well.
The reason is very simple: There are a lot of things we get very engaged with that are not in our own best interest and therefore not sustainable over time:
- Filter bubble: Cognitive science tells us that we prefer not to be confronted with views that oppose our own. Algorithms geared towards user engagement therefore increasingly showed us only things in the News Feed that accorded with our world view, as expressed by our clicks and Likes. The result was filter bubbles that shielded us from opposing opinions and further narrowed our world view.
- Algorithmic radicalization: The more extreme content gets, the more attention it draws. Algorithms geared towards maximizing engagement therefore not only drove us down rabbit holes of increasingly extreme and conspiratorial content, but also gave bad actors an economic incentive to create and supply this type of content, no matter how true or untrue, fostering toxic tribalism and division.
“Social media has been the ‘fertilizer’ of democratic collapse in the Philippines, which should be a cautionary tale for the United States.” Maria Ressa
- Addiction: It’s well understood by now that habits are formed by offering a reward for an action that follows a certain trigger. If the reward varies, and therefore entails some element of surprise, and if some form of investment is involved, the habit formed gets even stronger and can become an outright addiction. This mechanism is the cornerstone of the success of slot machines as well as drugs — and it is carefully engineered for stickiness (read: engagement/addiction) by the most successful apps and services: A push notification on my phone triggers me to look at the screen, which presents me with a little red bubble on the app that has something new for me. The action I take is tapping the app icon, which rewards me with a Like or a message from a friend. Building up my network of friends, filling my profile with all kinds of information about me and publishing the latest photo of my cat represents an investment of time in the app, increasing the likelihood of my coming back. This loop often becomes so internalized that over time no external trigger is required for me to take action. Experiments by psychologists at California State University Dominguez Hills have shown the effect technology has on our anxiety levels: When you put down your phone, your brain signals your adrenal gland to produce a burst of the hormone cortisol, which has an evolutionary purpose: it triggers a fight-or-flight response to danger. It makes you anxious, because there might be a message, a comment or a Like from a friend waiting for you. Your goal is to get rid of that anxiety, so you check in (action) at least every 15 minutes even without an external trigger, and receive a release of dopamine (reward) as a pleasurable experience.
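The feedback dynamic behind the first two points above can be illustrated with a toy simulation (a deliberately simplified sketch, not Facebook’s actual ranking algorithm): a ranker that orders items purely by past clicks will, round after round, converge on whatever the user already engages with — the filter bubble emerging from nothing but engagement optimization.

```python
import random
from collections import Counter

def run_feed(rounds=200, feed_size=10, seed=42):
    """Toy model: rank candidate items purely by accumulated clicks.

    Topics and click probabilities are invented for illustration:
    the user clicks agreeable content far more often than opposing
    content, and the ranker knows nothing but the click counts.
    """
    random.seed(seed)
    topics = ["agrees", "neutral", "opposes"]
    click_prob = {"agrees": 0.6, "neutral": 0.3, "opposes": 0.1}
    clicks = Counter({t: 1 for t in topics})  # smoothed engagement counts
    shown = Counter()

    for _ in range(rounds):
        # Draw a random pool of candidates, then rank by past engagement.
        candidates = [random.choice(topics) for _ in range(30)]
        candidates.sort(key=lambda t: clicks[t], reverse=True)
        for topic in candidates[:feed_size]:
            shown[topic] += 1
            if random.random() < click_prob[topic]:
                clicks[topic] += 1  # feedback loop: clicks boost future rank

    return shown

shown = run_feed()
# The feed ends up dominated by content the user already agrees with.
assert shown["agrees"] > shown["opposes"]
```

The point of the sketch is that no one programmed a “filter bubble” explicitly: narrowing is simply what greedy engagement ranking does once user behavior feeds back into the ranking signal.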
Former Google engineer Tristan Harris calls smartphones weapons of mass manipulation, and ex-Facebook president Sean Parker has criticized the platform for exploiting this human vulnerability — which turned him into a billionaire along the way.
“[There are] unintended consequences of a network when it grows to a billion or 2 billion people and … it literally changes your relationship with society, with each other … It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains. You’re exploiting a vulnerability in human psychology. The inventors, creators — it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people — understood this consciously. And we did it anyway.” Sean Parker
The unintended consequences of exploiting this vulnerability are well documented by now: The apps that are most addictive, and therefore used most intensively every day, are also the ones associated with the greatest unhappiness after use; Facebook and Instagram rank high on that list.
Ample research concludes that this is not just correlation but actual causation: it is not merely that unhappier people use these kinds of apps more, but that the apps themselves have a negative effect on their users’ happiness.
“All of our minds can be hijacked. Our choices are not as free as we think they are.” Tristan Harris
“Imagine that a psychologist embarks on a study of happiness among drug users. He polls them and finds that they declare, every single one of them, that they are only happy when they shoot up. Would the psychologist publish a paper declaring that heroin is the key to happiness?” Yuval Noah Harari in Sapiens: A Brief History of Humankind
“The only business of business is business” vs. “Making the world a better place”
Facebook is by no means the only culprit here, actively leveraging or passively benefitting from all these angles to drive engagement and user growth. The ad-driven business model sets clear incentives for creating more engagement by all means — which turns into more ad dollars along the way. Facebook’s long history of scandals, apology after apology followed by business as usual without any actual change in behavior, however, leaves the company particularly exposed to the current tech backlash.
“The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.” Andrew “Boz” Bosworth, Facebook VP
Proclaiming to be a purpose-driven company with the goal of “Making the world a better place” while optimizing solely for engagement (i.e. profits), with the opposite effect, is deceptive at best. When pursuing engagement without compromise, you end up compromising on ethical choices.
A recent article in the New York Times titled “Delay, Deny, Deflect” demonstrates that Facebook went much further than that: When suspicious Russian activity during the 2016 US presidential election raised alarms inside the company, Facebook delayed, denied and deflected the revelations. Mark Zuckerberg called the notion that fake news on Facebook might have influenced the outcome of the election “a pretty crazy idea”, pointing to a very small volume of activity of 3,000 ads connected to Russian agents. Only after continued pressure did Facebook revise its public statement twice, finally acknowledging that close to 126 million people had seen the Russian posts. Mark Zuckerberg apologized for ridiculing fears over Facebook’s effects on the election and promised to do better in the future. Revelations about the misuse of user data by Cambridge Analytica, about Facebook’s role in the Rohingya genocide in Myanmar, and about its becoming a tool for oppression in the countries of the Arab Spring and the Philippines, and for inciting violence in many more, demonstrate that the company hasn’t lived up to that promise.

The fact that Facebook employed a Republican opposition-research firm to discredit activist protesters, in part by linking them to the liberal financier George Soros, and tapped its business relationships to lobby a Jewish civil rights group to cast some criticism of the company as anti-Semitic, as documented in the New York Times piece, raises the question of whether Facebook actually intends to do better or just tries to sweep the problems under the rug. Its response to these allegations, true to the article’s title (“Delay, Deny, Deflect”), followed by an acknowledgment in a press release conveniently timed for Thanksgiving Eve for minimum exposure, clearly indicates the latter.
Momentum in reverse
I’ve seen people complain about Facebook and proclaim its imminent demise for many years now, only to see them go back to Facebook to post about it. I’ve heard many times that everyone in my home country, Germany, is so concerned about their privacy, without actually changing their privacy settings, because convenience always wins out. A seemingly endless cascade of privacy scandals has proven Facebook’s secret for success right over and over again: let people talk; we look at the engagement numbers, and they point upwards no matter what they say. Even after Russian election meddling on Facebook and the Cambridge Analytica scandal, which provided access to millions of users’ data without their consent, Facebook’s usage increased.
“They are businesspeople, they are good businesspeople. They just represent a set of business practices, principles and ethics, and policies that I don’t necessarily agree with.” Brian Acton, Cofounder WhatsApp, on Facebook’s management
I do believe it’s a different story this time around, and that this very secret of Facebook’s growth, unencumbered by any moral burden weighing the growth story down, is broken for good. When the executives are dishonest and reward those loyal to them, there are going to be consequences. Falling out over Facebook’s disregard for the privacy promises it made, the founders of Facebook’s major acquisitions, WhatsApp and Instagram, have left the company — in Brian Acton’s case even forfeiting $850 million in earn-outs just to get out as fast as possible. Acton even called out the company on his way out with #DeleteFacebook on Twitter.
Internal surveys show that employee morale at Facebook is plummeting along with its share price: According to a report in the Wall Street Journal, just over half of Facebook employees (52 percent) said they were optimistic about the future of the social networking platform — down 32 percentage points from last year. Only 53 percent of Facebook employees said the company was making the world better, 19 percentage points lower than last year. With high-paying alternatives just around the corner, money is not the major motivator for the best and brightest if they can’t get behind your vision, morals and purpose. This is reflected in anecdotal evidence from university graduates, who report that the stigma of working for Facebook has begun to outweigh the financial benefits.
With user growth flat or even negative in the markets that account for 72 percent of Facebook’s ad revenue (the US, Canada and Europe), the changing climate finally seems to be manifesting itself in the engagement numbers as well. 25 percent of US users report having uninstalled the Facebook app from their phones; among users aged 18–29, a full 44 percent have done so.
Actions speak louder than words (full circle)
I’m absolutely certain that Mark didn’t set out to break things as badly as Facebook ended up doing, and that he actually had good intentions. As the road to hell is paved with good intentions, however, he needs to be measured by actual outcomes, just as the success of an innovation — as Mark Zuckerberg certainly knows — is measured not by what users say they do but by what they actually end up doing. The response by him and Facebook’s top management to the turmoil of the last two years, as documented by the New York Times, is appalling and by no means better than the conduct of the old-world Enrons that Silicon Valley was supposed to replace.