Facebook, Cambridge Analytica, and Other Weakest Links

The many failures by many players throughout the app ecosystem

Annotote is a better way to get informed and inform others

Of course, the age of abundant information begs, borrows, and steals your attention. Reading blogs, news, and research has always been an inefficient user experience, like finding needles in haystacks. But now Annotote is the antidote. Check it out: don’t waste time or attention; get straight to the point.

What follows isn’t exonerating Facebook of its responsibilities and culpability with regard to the ongoing Cambridge Analytica scandal, but it is suggesting that there were failures in multiple, contingent, crucial (albeit implicit) backstops. Others, like Google and Apple, also play a role in the governance of the app ecosystem, and understanding not just the plumbing therein, but also the roles and responsibilities, is fundamental to shoring up the system. Bear with me…

Structural assurance and assurers

The heart of the issue here is Facebook’s inability to structurally secure user data once it was in the hands of the 3rd party developers building apps on its platform. Once a 3rd party app had Facebook user data, that 3rd party could do technically whatever it pleased with the data. Granted, nefarious use of this data violated Facebook’s Terms of Service, starting with Kogan’s redistribution of the data to a separate, commercial entity, Cambridge Analytica, but nothing structural provided assurance against such malfeasance. (The contractual assurance leveraged by Facebook’s Social Graph API depends on reactive, ex post facto discovery of wrongdoing, whereas structural assurance like tokenization, encryption, and anonymization is actively preventative.)
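To make that contractual-vs-structural distinction concrete, here is a minimal sketch of what structural assurance via tokenization might look like. (This is an illustration, not Facebook’s actual implementation; the platform secret and record shapes are hypothetical.) The platform hands 3rd parties an irreversible, keyed token in place of a raw user ID, so misuse is prevented by construction rather than merely prohibited by the Terms of Service:

```python
import hmac
import hashlib

# Hypothetical platform-side secret; a real deployment would keep this in an HSM/KMS.
PLATFORM_SECRET = b"platform-only-secret"

def tokenize_user_id(user_id: str) -> str:
    """Replace a raw user ID with an irreversible, platform-keyed token.

    A 3rd party holding only tokens can still join the records it was
    given, but cannot recover the underlying identity or correlate the
    data with anything outside the platform.
    """
    return hmac.new(PLATFORM_SECRET, user_id.encode(), hashlib.sha256).hexdigest()

# What a contractual-only regime hands over: raw, reusable identifiers.
raw_record = {"user_id": "fb:1234567890", "likes": ["politics", "cats"]}

# What a structurally assured regime hands over: tokens in place of identities.
safe_record = {
    "user_id": tokenize_user_id(raw_record["user_id"]),
    "likes": raw_record["likes"],
}

print(safe_record["user_id"])  # an opaque 64-char hex token, not the raw ID
```

Note that this is only one layer of structural assurance; it prevents identity leakage, not misuse of the attribute data itself, which is where encryption, access scoping, and anonymization of the payload come in.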

Now, some of these 3rd party apps building on Facebook’s platform and tapping into Facebook’s data were traditional, standalone apps (e.g. Airbnb). Others were apps-within-an-app, a la Tencent’s mini-programs, living and operating within Facebook proper (e.g. Professor Kogan’s personality quiz app that was the vehicle for Cambridge Analytica’s ploy). Whether we’re talking about an independent app or one incorporated within Facebook core:

Doesn’t Apple have a role in policing the apps on its iOS platform? (Similarly, Google via Android?)
Is Facebook’s relationship with Kogan’s app, “thisismydigitallife”, substantially different from Apple’s with Facebook?

For what it’s worth, Google’s responsibility here (via Google Play) is perhaps of even greater concern than Apple’s (via the App Store). But since we’re already seeing the shake-out from Android’s hands-off approach, I’m going to dwell on Apple in particular, because no spotlight has been shined in their direction thus far.

Apple has a heavy hand in regulating its iOS App Store, with the explicit intention of rooting out these kinds of vulnerabilities. For example, as an app on Apple’s platform, Facebook can tap into my address book via my iOS Contacts; and by virtue of granting Facebook that native access, Apple dictates Facebook’s permissions for its use of my data. Using App Store inclusion as its stick, Apple literally audits app code and maintains app standards, kind of like a de facto regulator-of-first-and-last-resort, providing users a form of that aforementioned “structural assurance”.

Here’s a bit about Apple’s App Store approval process (emphasis mine):

“After [2009], apps were rejected that did not provide more robust user experience beyond simply using location data for advertising. Other requirements, such as using undocumented APIs, have always led to rejection… guidelines are put in place to prevent problems with pornography, violence, legal issues, user experience, and other more specific guidelines in apps. Apple checks each app against these guidelines before approving it for sale and inclusion on the App Store.”

The code that enabled this user data overreach, Facebook’s “Social Graph API”, was an API (as its name would suggest), which means it fell under the purview of App Store guidelines, per the excerpt above. Furthermore, since Facebook proper wasn’t rejected by the App Store, it’s safe to assume it was a documented API, which means it was also laid bare for App Store review.

So, sure, Facebook should have had the intuition to see how bad actors could have gamed Social Graph API… but so should have Apple! This isn’t a legal matter for either company, but this is a matter of moral equivalence and calibrating public outrage:

Why was Facebook’s violation of its implicit obligation to its users (a failure of foresight) so egregious, while Apple’s warrants a free pass?

This is the power of the narrative.

Benefit of the doubt

Now, I know we should all temper this with a healthy dose of reality. For example: ‘curation and regulation are hard’; ‘Apple disclaims responsibility for 3rd party app malfeasance’; or ‘Apple could never kick Facebook out of the App Store’. (That last one really sounds like regulatory capture, but I digress.)

Listen, I’m sympathetic. Apple assumed the difficult — but necessary — task of curating its app ecosystem. That’s an impossible undertaking that nobody could perform flawlessly. Stuff is going to fall through the cracks, and Apple’s going to get blamed for it. Such is the plight of the supermassive content aggregator, although the spoils are immense.

In this way, Apple has been the linchpin of the app economy. Its generally benevolent dictatorship, coupled with the system’s balance between radical demand-side economics and mutually assured destruction, has helped big tech earn the latitude to self-regulate. That sounds like the paragon of capitalism: an incentive structure that governs itself via market-based solutions. And the benefits of that autonomy, both gross and net, have been substantial.

Apple really does a remarkable job handling this remarkable chore. Without Apple’s oversight, any developer could get you to download any two-bit app, then steal far more valuable user data from your iPhone than Kogan abetted here.

Regardless, Apple failed to enforce its own guidelines upon a very visible API launch for a very major app (i.e. Facebook) on its very own App Store platform.

But, you can’t be half pregnant

Again, this isn’t intended to minimize Facebook’s failures or amplify Apple’s. It’s intended to square what happened with why it happened, such that it might never happen again. In theory, we’d expect every app to make every effort, from first principles, to maximize cybersecurity, privacy, user protections, etc. In reality, some don’t or won’t. Some are intentionally nefarious; others unintentionally so. By virtue of being a massive, complex black box, Facebook was unintentionally susceptible, perhaps even recklessly so. Yet the same can be said of Apple.

So, sentiment is up in arms about this Cambridge Analytica scandal, and the #deleteFacebook retribution is a means of catharsis for those who think it’s just deserts for a rotten culture. That’s fine. But Facebook had already patched this vulnerability when it shut down the API in 2015, and, sure as lightning doesn’t strike twice, they’ll always remember to incorporate the requisite structural assurance in the future.

The same cannot be said of Apple at this point. To be sure, whatever Facebook data had gushed into 3rd party hands remains in those hands today, but at least Facebook stopped the flow. How much data from other App Store products is still leaking?

I don’t want a pound of flesh from Apple, but the power of public sentiment should acknowledge Cupertino’s deficiency here too (Facebook is not the only weak link in the chain), so that everyone remembers to incorporate structural assurance in the future. Otherwise, the system will be no more robust tomorrow than it was yesterday.

This brings me to a point I discussed in “The user data arbitrage and the right regulatory remedy”. Big tech has to choose (or be made to choose): Are you explicitly or implicitly responsible for user data? You can’t be half pregnant. With great power comes great responsibility. Great power without responsibility creates an inextricable arbitrage, which features vulnerabilities like the ones discussed herein.

Without big tech assuming explicit accountability, the onus falls on consumers. This isn’t regulating an automated content curation algorithm like Facebook’s News Feed, nor the manual curation decisions of a newspaper’s editor. Both of those are highly subjective art forms that, while opaque, literally embody the nominal products consumers sign up for. No, rather, this is consumer protection, with preexisting laws and standards already in force to govern privacy. As usual, Tommy Boy said it best:

“I can get a good look at a T-bone by sticking my head up a bull’s a$$, but I’d rather take a butcher’s word for it.”

When it comes to such enforcement, regulation usually leverages the points of significant aggregation along the supply chain, for obvious reasons. In contrast to Facebook — one of many apps in the industry — Apple and Google are centralized clearing houses for almost every app out there. Both are uniquely suited to enact consumer protections, given their hard-earned, well-deserved, privileged positions at this distribution bottleneck.

This would not be newly anointing Apple as an omnipotent moral arbiter, because they’ve already assumed the role! Normally, we’d have to worry about the second-order effects of implementing such regulation (the unintended consequences), but I’d rather codify Apple’s role therein than let them continue to operate with such impunity.

We can debate App Store criteria all we want, but the fact is that Apple already governs to the letter of its own law, a law that is already written down. I’d rather formalize that than leave it flapping in the wind. What use is the App Store review process if Apple can wash its hands of its own standards whenever something goes wrong!?

Annotote is just a better way to read: Highlights by you and for you — on all the blogs, news, and research you need…