Image credit: Cari McGee / http://www.carimcgee.com/

Let’s Audit Facebook

Is Facebook living up to its privacy promises? Let’s test Real Deletion first

Phil Wolff
Published in Digital Justice
5 min read · Jul 23, 2013


Sometimes you must kill your digital self.

Maybe just a bit of your online life, where you said something stupid or hurtful or wrong. Maybe a photo of yours that crossed someone’s privacy line. Or you must go dark, to protect your family from criminal reprisals, to comply with a court order, to start over.

So you want to delete your bits. And failure causes harm.

Facebook permits you to delete a picture, a post, a relationship, or even your whole account. And maybe, when you hit the buttons or click the links or tap the spot, Facebook does, in fact, delete your bits.

Or maybe not. Maybe your data lives on in databases used for backups. Or in caches used to speed up data delivery. Or in test suites used to validate new code.

I don’t know. Do you?

How can you tell? Where’s the proof?

Even engineers at Facebook might not be able to tell if all copies of my data were found, let alone wiped. Facebook’s tech infrastructure is vast, complex, imperfectly documented, and changes hourly.

So how do we trust Facebook? How can we trust any service to completely delete our data?

Independent audit.

Доверяй, но проверяй.
[Trust, but verify]
– Russian proverb

Bring in outsiders to follow the flow of data, tracing where it goes and where it is stored. Test how well Facebook lives up to the promises of its user interface and Terms of Service. And tell users what they find.

Auditors could report gaps in coverage. Users could decide how much those gaps matter. Facebook could prioritize the work to fill those gaps.

The system would get better.

And users could trust Facebook’s promises more.

Facebook would only do this a few times before choosing to make their systems easier to audit. This would shrink the cost, time, and effort per audit. It might even become part of the code check-in process. Routine, fast, easy, comprehensive.
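
To make that concrete, here is a minimal sketch of what such a routine check might look like, assuming a toy stub service standing in for Facebook’s systems. Nothing here reflects a real Facebook API; it only shows the shape of the test an auditor (or a check-in pipeline) could run: create a marker record, delete it, then verify no copy survives in any store the auditor can reach.

```python
# Hypothetical sketch of a deletion audit run at code check-in time.
# StubService and its methods are invented stand-ins, not a real API;
# a real audit would probe every storage system involved.

import uuid


class StubService:
    """Stand-in for the service under audit: primary store, cache, and backup."""

    def __init__(self):
        self.primary, self.cache, self.backup = {}, {}, {}

    def create_post(self, text):
        post_id = str(uuid.uuid4())
        for store in (self.primary, self.cache, self.backup):
            store[post_id] = text
        return post_id

    def delete_post(self, post_id):
        # A correct implementation removes every copy, not just the primary one.
        for store in (self.primary, self.cache, self.backup):
            store.pop(post_id, None)

    def find_copies(self, post_id):
        return [
            name
            for name, store in (
                ("primary", self.primary),
                ("cache", self.cache),
                ("backup", self.backup),
            )
            if post_id in store
        ]


def audit_real_deletion(service):
    """Create a marker record, delete it, then verify no copy survives anywhere."""
    post_id = service.create_post("audit marker")
    service.delete_post(post_id)
    leftovers = service.find_copies(post_id)
    assert not leftovers, f"deletion incomplete, copies remain in: {leftovers}"


if __name__ == "__main__":
    audit_real_deletion(StubService())
    print("Real Deletion check passed")
```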

And our personal data would go to that bit bucket in the sky, just like Facebook promises.

Facebook isn’t the only service with this problem, of course.

Millions of apps, millions of companies, millions of web sites hold personal data. And we don’t know if they do what they say with our data.

I have over a thousand mobile apps. Most talk to servers. They know when I’ve played games, where I’ve been, who I call, and everything else I might treat as confidential.

And most of those software publishers are tiny. They don’t look like Facebook. They are small businesses.

So we distrust them for different reasons. Facebook’s challenge is being big and complex, with nobody able to know everything about how its systems work. A small company may fail because its products are rough and tumble under the covers. Small fry fail, sometimes selling their systems, software, and data to people we don’t know and shouldn’t trust.

So outside audit means something different for small companies. It means testing that deletion really is deletion, of course. It also means checking that the business is healthy and that its plans to bind successors are compelling.

There’s great comfort in knowing you can trust apps.

But these apps share data. With each other. With data aggregators. You rarely know to whom they pass your data, let alone what the recipients do with it.

Is Real Deletion across the personal data supply chain an impossible problem?

So let’s imagine Facebook makes Real Deletion a standard feature, a core promise.

Sadly, Facebook has already shared your data with dozens or thousands of companies. Some company or government holds part of your Facebook profile, combining it with other data into a vivid, inaccurate portrait of your life and character.

Can Facebook pass on its commitment to delete your data to its paying data customers? To its independent app developers?

Facebook could.

Facebook can extend the obligation to delete on demand to its partners. They could write it into their contracts. They could build it into their APIs.

And then Facebook would have to trust their partners to live up to these commitments.

Or not.

Facebook could add open, independent audit to those same contracts and those same APIs. Then Facebook could trust that when a user deletes a picture, its partners really delete it too.
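
As a sketch of what that API obligation might look like, imagine a hypothetical endpoint each partner implements: Facebook forwards the user’s deletion request, the partner removes its copy, and it returns a signed, timestamped receipt that an independent auditor could later verify. Every field name, the shared key, and the data store here are invented for illustration, not a real contract or API.

```python
# Hypothetical partner-side handler for a forwarded deletion request.
# Names, fields, and the signing key are illustrative only.

import hashlib
import hmac
import json
import time

SHARED_AUDIT_SECRET = b"partner-audit-key"  # assumed per-contract signing key

partner_store = {  # stand-in for the partner's copy of shared data
    "user:42/photo:7": {"url": "https://example.com/p/7.jpg"},
}


def handle_deletion_request(request: dict) -> dict:
    """Delete the named object and return a signed, timestamped receipt."""
    object_id = request["object_id"]
    existed = partner_store.pop(object_id, None) is not None

    receipt = {
        "object_id": object_id,
        "deleted": True,
        "previously_held": existed,
        "deleted_at": int(time.time()),
    }
    payload = json.dumps(receipt, sort_keys=True).encode()
    receipt["signature"] = hmac.new(
        SHARED_AUDIT_SECRET, payload, hashlib.sha256
    ).hexdigest()
    return receipt


if __name__ == "__main__":
    # Facebook, or an auditor replaying the request, would send something like this:
    print(handle_deletion_request({"object_id": "user:42/photo:7"}))
```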

And we’d trust Facebook’s partners more.

And we’d trust Facebook more.

Real Deletion is the No Brown M&Ms Rider

Some day Facebook could show you a fresh, independent audit verifying that they really delete your data, that their paying customers also delete your data, and that their partners are audited too.

If you wanted Van Halen to play your venue, their contract rider required a bowl of M&Ms with the brown ones removed. The clause was buried in the engineering specifications as a quick way to check how closely the venue was paying attention to detail.

Real Deletion is the No Brown M&Ms rider. Real Deletion shows that one user “right” can be defined and verified throughout an ecosystem.

And maybe we can extend this capacity to other rights.

Like notifications of data breach. So we can protect ourselves.

Like notification of data loss or corruption. So we can quickly recover.

Like a graceful exit when features, products, or accounts are suspended or ended. So we don’t damage our relationships and have a shot at continuity.

Like the ability to export a copy of all our data, Google Takeout or MyNSA style. So we can move on.

Like disclosing which legal jurisdictions our data flows through and is stored in. So we can behave legally around the world.

Image: Chrono-Shredder by Suzanna Hertrich via Wicker Blog

Do you remember deleting that thing last year? The one that could cost you your job, your health care, your family, your reputation?

Is it gone?

Really gone?


Phil Wolff
Digital Justice

Strategist, Sensemaker, Team Builder, Product guy. Identity of Things strategy (IDoT) @WiderTeam. +360.441.2522 http://linkedin.com/in/philwolff @evanwolf