“Facebook is fucked”: closing the barn door after the elections were stolen

Jon Pincus
Published in A Change Is Coming
9 min read · Mar 17, 2018

[Image: Facebook’s logo with a red circle and slash around it.]

The New York Times’ How Trump Consultants Exploited the Facebook Data of Millions, along with the Guardian’s coverage including ‘I made Steve Bannon’s psychological warfare tool’ (an amazing profile of whistleblower Chris Wylie), explores in detail how SCL/Cambridge Analytica — funded by Robert Mercer, with Steve Bannon as a board member — harvested private information from the Facebook profiles of more than 50 million users without their permission, and then used it to help the Trump and Brexit campaigns.

These allegations aren’t new. Cambridge Analytica, the shady data firm that might be a key Trump-Russia link, explained (from last October in Vox) has a good overview of CA’s role in the US election, and The great British Brexit robbery (from last May in the Guardian) looks at the UK. Until now, though, Facebook’s line has always been “nothing to see here, move along”. Just last month, their UK Director of Policy assured a Select Committee, “They may have lots of data but it will not be Facebook user data. It may be data about people who are on Facebook that they have gathered themselves, but it is not data that we have provided.”

That changed just a few hours before the New York Times article appeared, when Facebook admitted that things haven’t actually been copacetic after all. From their post:

In 2015, we learned that a psychology professor at the University of Cambridge named Dr. Aleksandr Kogan lied to us and violated our Platform Policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica, a firm that does political, government and military work around the globe. He also passed that data to Christopher Wylie of Eunoia Technologies, Inc….

It wasn’t just the people who downloaded the app who were affected; the app also got data from their friends whose privacy settings allowed it (which was most of them, thanks to Facebook’s long-documented history of making information available by default and not making it easy to change). As a result, SCL/CA got information on over 50 million people.

It’s all okay, though. Facebook makes app writers promise not to misuse the information Facebook gives them, and not to share it with other people. So what could possibly go wrong?

Well, for one thing some naughty app developers might break their promises and develop malicious apps. I know, I know, hard to imagine! Apparently Facebook wasn’t particularly concerned about this, because they didn’t build in any safeguards at all.

Move fast and break things!

Unsurprisingly, some app developers did develop malicious apps — like the aforementioned Dr. Kogan and Joseph Chancellor, founders of a company called Global Science Research (GSR). [Kogan also worked for St. Petersburg University.] Their app claimed to be for an academic research study, but they were actually passing the data to Cambridge Analytica. In late 2015, after the Guardian reported that Cambridge Analytica was using private Facebook data on the Cruz campaign, Facebook promised to “carefully investigate this situation.” Facebook also hired Chancellor, and he apparently still works there.

In August 2016 Facebook told Cambridge Analytica that they were in violation of the terms of service and should delete the data. Since Cambridge Analytica had already shown that they’d break the rules, you’d think that Facebook would have checked to see if they actually did delete it, but according to Chris Wylie, “literally all I had to do was tick a box and sign it and send it back, and that was it.” Once again, what could possibly go wrong?

So now, Facebook is shocked, shocked to receive reports that (gasp) “contrary to the certifications we were given, not all data was deleted.” But it wasn’t a data breach! And it wasn’t their fault!!!!!!

The claim that this is a data breach is completely false. Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information …

Really? I really doubt that most of the people downloading the app, after being told it was a research project, were told the information would be used by a corporation to influence election results. I’m 100% sure that their 50,000,000 friends didn’t even have the opportunity to consent. As Zeynep Tufekci says in Facebook’s Surveillance Machine, “This wasn’t informed consent. This was the exploitation of user data and user trust.” It’s almost like Facebook’s blaming the victims and trying to turn the focus away from their own culpability.

Similarly,

We are moving aggressively to determine the accuracy of these claims. If true, this is another unacceptable violation of trust and the commitments they made.

Who could have predicted that people who violated trust and the commitments they made would do so again? It reminds me of the classic line from Animal House: “You fucked up. You trusted us.”

Anyhow, Facebook assures us that they are “committed to vigorously enforcing our policies to protect people’s information”. So they’re going to close the barn door.

We are suspending SCL/Cambridge Analytica, Wylie and Kogan from Facebook, pending further information.

That’ll show ’em!

Good question. I mean, Facebook’s terms of service don’t permit it, so doing that would be an unacceptable violation of trust and the commitments Cambridge Analytica made. That hasn’t stopped them in the past, but who knows, maybe third time’s a charm? Then again, maybe not … Wylie, who should know, says “rules don’t matter to them”.*

There’s a lot of debate about just how effective Cambridge Analytica’s technology actually is; there’s certainly an element of snake oil in the way they talk about it. But as Emily Gorcenski says, the power is in the data. If the Facebook engineers who worked closely with CA and the Trump campaign were even minimally competent, the models built from this psychographic data (combined with all the other sources they have access to, like consumer histories, lifestyle information, census returns, and historical voting records) are likely to have given them a significant advantage in Facebook ad targeting — which in turn translates into a huge cost advantage. And it’s easy to see how this data could help a campaign take advantage of the research Facebook published on how to use Facebook to manipulate emotions and influence voter turnout; remember how the Trump campaign boasted about its voter suppression campaigns before the election?

For close elections like Brexit and the US Presidential election, this could have been a decisive edge.**

Mark Zuckerberg ended his “reflective” post-election post in November 2016 by saying “In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.” Move fast and break democracy!

Yeah really. Zuckerberg obviously didn’t understand how callous that sounded to immigrants who are being deported and fear being kidnapped on the streets, Muslims who are banned from visiting the US, trans people who have lost their right to medical care, the targets of the upsurge in neo-Nazi and alt-right violence, and all the other people who are directly impacted by the result of the election. Privileged much?

Then again, I can see why Zuckerberg would say that. Remember back in 2007 when he was apologizing for Facebook violating people’s privacy with Beacon and talked about how they had learned that people need to control what they share? There have been a lot more privacy violations, and apologies,*** since then … and yet, he’s now worth about sixty bazillion dollars, and until recently was acting like he was thinking of running for President. So yeah, people believing Facebook and trusting it with their data has certainly led to better results for Zuckerberg.

At least until now.

Even before these latest headlines, it was clear that Facebook was facing some challenges. They’ve just had their first-ever decline in users in the US. Users are spending less time on the site. [Sure, Zuck can claim this as an intentional result of their news feed changes to make it a better experience, but the data shows the decline starting before the changes so he’s pretty obviously trying to spin.] Problems with regulators (like the anti-trust investigation in Germany and the privacy ruling against them in Ireland) and lawmakers in the US, UK, and EU (including a new 3% tax on revenues), plus the new GDPR privacy regulations, add to the pressure.

Now they’re also facing a spectacular and high-profile example of Facebook’s general greed, hubris, and business model based on exploiting their users. As law professor Frank Pasquale says, “The lid is being opened on the black box of Facebook’s data practices, and the picture is not pretty.” Facebook’s initial ham-fisted response (threatening to sue the Guardian for publishing, blaming the users for letting themselves get exploited, and quibbling over whether or not the word “breach” applies****) only made things worse. When Zuckerberg finally released his first statement a few days later, he didn’t even apologize. Brilliant.

So now Facebook’s problems are escalating quickly. The Massachusetts Attorney General has already kicked off an investigation. Facebook may have violated their consent decree with the FTC, which would trigger major fines. Amy Klobuchar and Adam Schiff were the first to call for Congressional hearings, but Republicans quickly joined them. Mark Warner’s calling for regulation. Even Marco Rubio’s talking about companies who “grow so fast, and get so much good press, they get up high on themselves, that they start to think perhaps they’re above the rules that apply to everybody else.” There’s a parliamentary inquiry in the UK. Their stock plunged 9%, leading to a suit from investors. Users are suing as well. People are deactivating their accounts. Privacy activists are re-energized. Facebook employees fear their golden years are over.

And that’s all just in the first few days. This story isn’t going away — as Kara Swisher says, it’s going to be a long and winding road.***** With all the attention on Trump/Russia and Brexit, there’s going to be plenty of publicity on this — and plenty of politicians, journalists, lawyers, media companies, and potential competitors eager to jump on the bandwagon of criticizing Facebook. As Facebook’s COO Sheryl Sandberg says:

We have a responsibility to protect your data — and if we can’t, then we don’t deserve to serve you.

So let’s hope Marcy Wheeler’s right, and Facebook is well and truly fucked.

Originally written March 17
Updated March 18–25, adding new links, moving info to footnotes to improve flow, and other minor edits.

* A couple more good examples of rules not mattering to Cambridge Analytica: they’re also under scrutiny in the UK for things like pitching an ‘illegal offer targeting overseas donors’ to Leave.EU, and broke US campaign law by having foreign nationals work on political campaigns.

** Some of the other ways Facebook helped Trump:

And don’t even get me started about long-time Facebook investor and board member Peter Thiel.

*** Liz Gannes did a nice history of Facebook apologies back in 2011 for the Wall Street Journal; Harry McCracken just did an updated one after the CA mess.

**** From a liability perspective, Facebook has good reasons for this not to be considered a breach: if it is, they’ve probably broken the law in many of the 48 states that have some kind of breach notification law, and EU regulations could expose them to a hefty fine. And from a security perspective, it’s a valid point that the term “breach” is usually used for a hack. Still, it’s not a good look.

***** Kara Swisher and Kurt Wagner’s interview with Zuckerberg in Recode is really outstanding. “Why is Facebook unable to anticipate these obvious problems?” “We try?” “Can you get data back?” “We’ll try hard?” There are also good interviews in Wired and the Times, but start and end with Kara.
