An aggressive interpretation of GDPR may be the only way to fix Facebook

For anyone in the tech community who watched the Facebook Senate hearings, it was clear: politicians in the US have no idea how to challenge Facebook when it comes to privacy. Does the EU stand a better chance?

Sean Rioux
Artwar

--

If you do business in the EU, you should already have more than a working knowledge of GDPR. With this sweeping legislation set to come into effect this year (May 25), these policies and their pervasive power will reshape the digital landscape in affected countries.

Put broadly, the General Data Protection Regulation applies to entities within EU states and is meant to ensure the individual’s right to control personal data. Non-compliance comes with strict penalties, and rather than being a reactive policy, it requires a proactive “data protection by design and by default” approach.

Facebook has already publicly stated it will not consider implementing the full requirements of GDPR outside of the EU, but what if it did? While Facebook, as a platform, has already taken responsibility for certain provisions (the right to erasure and data portability are accounted for), perhaps a deeper acceptance of the policy’s broader intent might just be enough to quell the backlash and bring accountability to the platform.

If we look at the problems facing Facebook (which certainly go beyond simply privacy and ownership of data) and we apply GDPR, perhaps we see a framework for a business model which, while potentially less profitable, may be much more sustainable and better for its users. Let’s look at some of the key stipulations of GDPR and how they might apply to Facebook.

Consent

All activities whereby personal data is processed require consent. This consent must be explicit and not implied, meaning you must explain exactly how the data will be used. Facebook affirms that when a user consents to its terms of use, or consents to an application’s use of personal data (for example, to access your profile information), this is sufficient to establish consent. The issue with Facebook’s current approach is that it uses a generic set of terms for each access permission and each application.

When I confirm that an application can use my profile data, I make an assumption about why it is doing so. For example, when I set up an account with, say, Tinder, it is my name, my profile image, and my gender that are carried over to the app. While the specific data transferred may be explicit, the key criterion here is that both applications, as agents processing my data, must be explicit about how that data will be used. Will Tinder use this data for research regarding its user base? Will Facebook use this data to market products to me in the future?

GDPR sets the tone here. Consent must be:

  • Unambiguous
  • Freely given
  • Specific
  • Informed

It isn’t enough simply to define the access permissions and imply consent for future use based on the specifics of what data is shared. Any intent to use the data, in any way, must be explicit. It doesn’t matter what’s in the terms of use for the application or my data access. Facebook has built the framework to provide access to the data, even asynchronously through its API. Is it not therefore responsible for ensuring unambiguous, freely given, specific, and informed consent to share and process the data?
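As a rough illustration of the difference, here is a sketch in TypeScript (with invented type and field names, nothing to do with Facebook’s actual API) contrasting today’s blanket permission grant with a purpose-bound consent record of the kind GDPR points toward:

```typescript
// Hypothetical types, not Facebook's actual API. A blanket grant names only
// the fields shared, while a GDPR-style grant binds each field to an
// explicit, refusable purpose.

interface BlanketGrant {
  appId: string;
  fields: string[];          // e.g. ["name", "profile_picture", "gender"]
  acceptedTermsUrl: string;  // consent implied by accepting generic terms
}

interface PurposeBoundGrant {
  appId: string;
  grants: {
    field: string;           // the specific datum shared
    purpose: string;         // the specific use, stated up front
    consentGiven: boolean;   // freely given: each purpose can be refused
    givenAt: Date;           // informed and auditable
  }[];
}

// The Tinder sign-up from above, restated as explicit, purpose-bound consent.
const tinderGrant: PurposeBoundGrant = {
  appId: "tinder",
  grants: [
    { field: "name", purpose: "display on my dating profile", consentGiven: true, givenAt: new Date() },
    { field: "gender", purpose: "aggregate research on the user base", consentGiven: false, givenAt: new Date() },
  ],
};
```

Under a model like this, refusing the research purpose would not block the sign-up itself; that is what “freely given” implies.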

The semantics of this go much deeper (for example, “freely given” explicitly requires that the user be given a choice to refuse the terms, without detriment), but even at its most surface level, it can be argued that Facebook is complicit in how it allows applications to access your data.

If Facebook were required to adopt this definition of consent, it would become much more accountable regarding personal data. Privacy settings would be the default, at even the most basic level. It would be required to vet the ways data is used, stored, and processed across systems. In effect, the application ecosystem that led to the current Cambridge Analytica scandal might have been avoided.

Right of Access

Users have the right to access any personal data about themselves, and to know how it is being used and processed. Again, Facebook (and Mark Zuckerberg in his recent Senate appearance) would affirm it does this. You can look at the applications which use your data and see how it is being used. You can even export the data Facebook has stored about you (see data portability in a later section).

Perhaps this is enough. One key point here is worth noting: you have the right to see how your information is used and processed. Processing is a loaded term in digital, as data processing can of course mean a lot of different things. Data is passed to a function. The smallest snippet of JavaScript may alter data in some way, effectively processing it. Much of this processing is of limited concern to an end user. That is, until we start to consider things like Facebook’s feed.
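To make “processing” concrete, here is a trivial, made-up TypeScript snippet; nothing here is Facebook code, but under GDPR’s broad definition even this one line is processing personal data:

```typescript
// A made-up example: deriving an age bracket from a birth year.
// Even this one-liner is "processing" of personal data in the GDPR sense,
// because it takes a personal datum and produces a new fact about the person.
const ageBracket = (birthYear: number): string =>
  new Date().getFullYear() - birthYear < 30 ? "18-29" : "30+";

ageBracket(1990); // yields a marketing segment I never supplied myself
```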

In media circles there is a persistent discussion about the specifics of the Facebook feed. How does it decide which posts to show, and in what order? In the most simple terms, it’s an algorithm. As in the film The Social Network, where an equation is scrawled in whiteboard marker on a dorm room window, Facebook uses such algorithms to determine what to show you, who you might know, who you are. In these equations, you are a variable, and this is how your data is used.
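Facebook does not publish its feed ranking code, so the following TypeScript sketch is only a toy, loosely inspired by the old, publicly discussed “EdgeRank” idea of affinity, content weight, and time decay, and emphatically not the real algorithm. It does, however, show what it means for me to be a variable:

```typescript
// Toy ranking sketch, NOT Facebook's real algorithm. Loosely modelled on the
// publicly discussed "EdgeRank" idea: affinity x content weight x time decay.
interface Post {
  authorAffinity: number; // how often I interact with the author (derived from my data)
  typeWeight: number;     // e.g. video weighted higher than plain text
  ageHours: number;       // how old the post is
}

const score = (p: Post): number =>
  p.authorAffinity * p.typeWeight * Math.exp(-p.ageHours / 24);

// The feed is my data, processed: posts sorted by that score.
const rankFeed = (posts: Post[]): Post[] =>
  [...posts].sort((a, b) => score(b) - score(a));
```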

We may understand that, but GDPR forces us to question the how, as in, “How is personal data being processed?”, in this case to determine what information, or programming, I’m being fed. Tech companies tend to be secretive about their algorithms, and with good reason: companies which better understood how user data was processed might be able to game that processing to improve the “rankings” of their content versus others (see SEO).

So how strictly should we interpret this requirement of GDPR? Frankly, I see very little choice but to interpret it quite literally. Facebook should be required to be transparent, fully transparent, about how my data is being used and processed. Under GDPR it should be the right of a European citizen to see not only what is stored, but to peer into that source code, or at least have its effect communicated meaningfully. What about intellectual property, you say? When democracy is at risk, are we really willing to squabble over snippets of code?

In implementation, this could be quite simple. When I’m being advertised to, show me the demographic data used, directly on the ad. Show me why I am being targeted. Like a warning on a cigarette package, informing me about a decision that might be bad for me serves a greater good. If this kind of thinking has advertisers scared, then good. Make a better product that people actually want, and they will be happy to see the reasons you targeted them. Not because you desired to manipulate them, but because you were the right fit based on the data available and explicitly shared (authenticity goes both ways).
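As a sketch of what that disclosure could look like (the shape and field names are invented for illustration, not any real ad API), every ad impression could ship with a machine-readable record of exactly which of my attributes were used, and whether I declared them or they were inferred by processing:

```typescript
// Hypothetical ad-disclosure payload; the field names are invented for illustration.
interface AdDisclosure {
  advertiser: string;
  targetingCriteria: {
    attribute: string;                // which piece of my data was used
    value: string;                    // the value the advertiser targeted
    source: "declared" | "inferred";  // stated by me, or derived by processing?
  }[];
}

// What might be shown (or exposed via an API) alongside a single ad impression.
const exampleDisclosure: AdDisclosure = {
  advertiser: "Example Outdoor Co.",
  targetingCriteria: [
    { attribute: "age_range", value: "25-34", source: "declared" },
    { attribute: "interest", value: "hiking", source: "inferred" },
  ],
};
```

Whether this appears as a label on the ad or as an API response matters less than the fact that the record exists and is accessible to me.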

Right to Erasure

I have gone through the process of deleting my Facebook account completely, and yes, it allows this. Unfortunately, this policy doesn’t extend to all of the other, apparently insidious, agents, collectors, and processors who through implied consent have scraped, pulled, or otherwise catalogued my data through Facebook.

Now is a good time to make a key point. GDPR may be most effective not because it ensures companies meet certain “feature requirements” for privacy; it’s perhaps bigger than that. Taken together, the various policies form a web of accountability. The right to be forgotten, taken alone, implies that companies must provide me with a feature which allows me to request that my data be removed. But on the web, this data is pervasive, spread across platforms, and easily duplicated.

This right, if most effectively interpreted, should perhaps be extended beyond Facebook to every platform that uses Facebook data. When bad actors are able to deny a user’s right to erasure, after leveraging Facebook’s built-in functionality to obtain that data with non-explicit permissions, should Facebook not be held accountable?

Perhaps this is a pipe dream, or an overreach of various tangled policies. Perhaps it is simply not technically feasible. But it speaks to what might be necessary as the web evolves and technologies like blockchain become more pervasive. Is immutability a feature or a bug?

Data Portability

Again, this is something Facebook offers, at least superficially. A user is able to export their data for transfer in a structured, commonly used electronic format. Without going on a rant about the history of personalization standards on the web, perhaps the zip file with text-formatted copies of my chat history and all the assorted JPG files I have uploaded constitutes data portability.
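A genuinely portable export would arguably look less like a zip of loose text files and more like a single structured, machine-readable document. Here is a rough TypeScript sketch of what that might be; the shape is invented and is not Facebook’s actual download format:

```typescript
import { writeFileSync } from "fs";

// Sketch of a structured, machine-readable export. The shape is invented for
// illustration; it is not Facebook's actual download format.
interface ProfileExport {
  profile: { name: string; email: string };
  posts: { postedAt: string; text: string }[];
  messages: { thread: string; sentAt: string; text: string }[];
  media: { fileName: string; uploadedAt: string; caption?: string }[];
}

const myExport: ProfileExport = {
  profile: { name: "Example User", email: "user@example.com" },
  posts: [{ postedAt: "2018-04-10T12:00:00Z", text: "Great hike this weekend" }],
  messages: [],
  media: [],
};

// JSON is the obvious "structured, commonly used electronic format".
writeFileSync("export.json", JSON.stringify(myExport, null, 2));
```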

But again we come up against how far GDPR could actually reach. Exporting “my” data is perhaps a bit more complicated. There is data I have explicitly posted, but there is also data which is generated from my data. This is, again, my data processed (something which, under GDPR, Facebook must be explicit about). Should I not be able to export the results of that processing as well? What would this even look like?

If you have ever done any kind of content analysis on a technical level, you might be familiar with methods like keyword extraction or sentiment analysis. Basically, content (for example, a status update I post to my wall) can be analyzed programmatically for better machine processing. This can be used to establish more robust models of my interests, my biases, or my opinions (above and beyond the things I have “liked” in the most boolean way). This processing has tangible outcomes (data): a graph of my interests based on keyword usage in posts, for example.
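To make the derived-data point concrete, here is a hedged TypeScript sketch: a naive keyword count over my own posts that produces an “interest profile” which appears nowhere in my raw export. A real pipeline would use proper keyword extraction and sentiment analysis; this toy version only illustrates that processing my content produces new data about me:

```typescript
// Toy "interest profile" derived from raw posts. A real system would use
// proper keyword extraction and sentiment analysis; this naive word count
// just shows that processing my content produces new data about me.
const posts = [
  "Great hike up the mountain this weekend",
  "Another weekend, another mountain hike",
  "Thinking about new hiking boots",
];

const stopWords = new Set(["the", "this", "about", "another", "up", "new"]);

const interestProfile = (texts: string[]): Map<string, number> => {
  const counts = new Map<string, number>();
  for (const text of texts) {
    for (const word of text.toLowerCase().split(/\W+/)) {
      if (word && !stopWords.has(word)) {
        counts.set(word, (counts.get(word) ?? 0) + 1);
      }
    }
  }
  return counts;
};

console.log(interestProfile(posts));
// e.g. Map { "hike" => 2, "mountain" => 2, "weekend" => 2, "hiking" => 1, "boots" => 1, ... }
```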

In my humble opinion, this is also my data. It is generated from me, and so it is at least partially mine; and since I have the right to see how my data is processed, doesn’t it stand to reason that I should be able to export this data, or at least have access to it?

These kinds of utilities have been the holy grail of data marketing: looking at the correlations between demographic data, interests, keywords, even sentiment. If you could know everything about a person, more even than their own mother does, how could you influence their decisions? If this data in its processed form is so valuable at scale to advertisers, can it not be assumed that, at least under GDPR, I should have a right to access it for myself?

Senator…

GDPR is an admirable model for better regulation of digital privacy, but frankly, it may not be enough. And that is despite the fact that what I’ve outlined above honestly barely scratches the surface of the regulation (which includes things like breach notifications and fundamental design principles). My hope is that in its interpretation, EU regulators take a stand and aggressively challenge companies like Facebook, Google, and Amazon to innovate in more socially responsible ways.

But in the US, the tepid response from Mark Zuckerberg to the equally tepid inquiry we’ve seen in the Senate hearings proves that these leaders are woefully outmatched, under-equipped, and fundamentally lacking the ability (or perhaps the intelligence) to fathom the basics of how digital information works (let alone how privacy might). Without more effective leadership, perhaps we must simply resolve to sacrifice any right to privacy, or right to our data, under the auspices of free services and pervasive communications platforms like Facebook shaping the future of social interaction.

In my opinion, we need to step back. Perhaps we need to slow down. Perhaps we need aggressive, stringent regulation, even more so than GDPR. Free market be damned. In a world where democracy is at stake, where global stability may be at risk, is it really worth defending a company which was started in a dorm room?

As a Canadian in the tech industry, the idea that GDPR-like regulation might come to Canada honestly scares me somewhat, and our clients should be scared too. We are woefully unprepared. But looking at the outcome of the US election? Or Brexit? I’m ready for the costs, opportunities, and challenges that come with “data protection by design and by default”.

Is Zuckerberg?
