Human Rights in the Digital Age

In a Data-Driven Economy, Human Rights are Catching Up

We’ve all heard it by now. All sorts of corporations, from the infamous tech giants to your neighborhood e-commerce retailer, are getting their hands on our personal data.

I remember when I realized just how creepy this was: a friend of mine suggested I look at the “Advertising Preferences” page associated with my Facebook account, where the site lists the information it uses to personalize my “ad experience.” Some of the things Facebook knew about me surprised me: my racial identity, my political affiliation, and the fact that I had briefly considered a PS4 deal during 2017’s Black Friday were all associated with my Facebook Profile.

By “Facebook Profile,” I’m of course not talking about the one I’ve cultivated with the occasional profile picture change, but the one, unbeknownst to me, that Facebook crafted from my online behavior. This information is monetized for record profits, and the market dominance of hyper-specific marketing has shown that data, not oil, is the raw material of the digital age. To get a sense of how entrenched this model is, take a look at the five most valuable companies on the Fortune 500 list: four of them live and breathe this business model. Free services for us, free data for them. The one company of the five that doesn’t is Apple, which still tracks certain data, just not to the crazy extent of the others.

In light of all of this, why should we care? Although we solemnly nod our heads in agreement that this seems creepy, perhaps even dystopian, we don’t do anything more than that. In the realm of social media, for instance, though national polling shows that most Americans do not trust social media companies to protect their personal data, usage remains steadfast. Zuckerberg himself reported no direct drop in Facebook usage after the Cambridge Analytica scandal of 2018, as egregious as that was. How we feel and act toward the weirdness of data-gathering companies reminds me of my already-defunct 2019 New Year’s resolution: I know I should go to the gym, but my actions tell the true story. After all, what matters to advertisers and investors is whether people use a platform, not whether its users harbor philosophical enmity toward it.

I’m not going to waste my time trying to convince you that you should care that companies use your data, nor am I going to make a case for the moral responsibility companies have not to collect your data. I believe that it is a company’s right to collect data so long as it was agreed upon by the user at the start of her usage. I also believe that given the choice, I would still rather sell my data (and soul and first-born) to companies in order to get Google Maps and Instagram free of charge. As Hasan Minhaj says: “I am more lazy than I am woke.”

However, what I would like to bring to light is this: in a data-driven economy, the intersection of politics, economics, and technology has given rise to a unique set of human rights that could not have existed more than two decades ago. These rights, centered on personal data, are important for us to understand now, since they will be the cornerstone of how we see ourselves in an increasingly digital society for decades to come.

Recently, legislatures around the world have been toying with the idea of informational sovereignty. In the context of human rights, the right to informational sovereignty concerns the degree of control people have over the data that pertains directly to them. It centers on the idea that people should be able to control, or even erase, their personal data, even if that data was created and stored by a third party with their permission.

Legislation that gives consumers entitlements regarding their personal data is not new; many laws governing personal data already exist in particular industries, such as the United States’ Fair Credit Reporting Act of 1970, which gave consumers the right to access and correct the information in their credit files. However, it wasn’t until 2018 that we saw sweeping legislation, applying to all industries, that treated informational sovereignty as a basic human right. Here, I’m talking about the European Union’s implementation of the General Data Protection Regulation in May 2018.

The EU’s GDPR strengthens consumer rights regarding data

Shortened to GDPR, the regulation lays out the legal groundwork specifying how the Right to Privacy (from the EU’s Charter of Fundamental Rights) applies in the digital age, and in practice strengthens consumer rights across the board. Its scope is massive: the regulation applies to the handling of the data of European citizens, European residents, and anyone else who happens to be in Europe, regardless of where the company processing that data is based. The penalties for failing to meet these requirements are extreme: violating the GDPR can cost a company up to 20 million euros (roughly 23 million USD) or 4% of its global annual revenue, whichever is higher. Here’s an overview of some of the legislation’s core tenets.

  • EU citizens/residents have the right to know how their data is being processed
  • EU citizens/residents have the right to obtain a portable copy of their data
  • EU citizens/residents have the right to have their data completely erased from a platform
  • Companies may not process “sensitive” personal data, including race, political affiliation, trade union membership, and biometric data, without explicit consent
  • Companies must report a data breach within 72 hours of becoming aware of it, so that those affected can be notified

For the first time in history, the GDPR presents the world with a set of human rights that clearly define what consumers can demand regarding the data that companies store about them. And although the regulation has drawn critics who argue that tipping the scales too far in favor of consumer rights will hurt a data-driven economy, a better understanding of what led up to the legislation will allow us to appreciate just what the GDPR is responding to.

Firstly, there’s the right to privacy. Articles 7 and 8 of the Charter of Fundamental Rights of the European Union assert the rights to privacy and to the protection of personal data. With algorithmic data processing increasingly prominent, there is an urgency to upgrade these rights to specifically address the digital landscape. While it is easy to agree that the “right to privacy” is a sound human right to be championed, enforcing such a right is impossible without contextualizing it in terms of real-world implications.

For example, take the European Union’s invalidation of the Data Retention Directive in 2014. The Directive was established in 2006, in response to the terrorist attacks in Madrid (2004) and London (2005), as a means to retain records of European phone and email communications for up to two years. Despite being defended as an effective means of counterterrorism, the Directive was eventually struck down for its interference with privacy rights, months after Edward Snowden revealed numerous global surveillance programs to the world. Without clearly stated laws regarding how citizens’ data relates to their right to privacy, it took political turbulence and public uproar before the Data Retention Directive was invalidated.

Protests in the wake of the Snowden revelations

Secondly, many national legislatures, particularly in Western nations, have long set a precedent for legal protections of reputation. Defamation, while typically a civil rather than a criminal matter, is treated as an unjust infringement on others’ rights. The ability to control one’s own story is a cultural cornerstone of the United States, whose historical origin is rooted in immigrants wanting to start anew. Long before data-collection business models came to characterize the global economy, then, there was already a deeply ingrained stance against third parties defining who we are. For example, in 2014, the European Court of Justice ruled in favor of Mario Costeja González against Google, when he argued that Google linking his name to long-since-resolved social security debts encroached on his “right to be forgotten,” a right related to the right to privacy.

Third is the rising frequency of cyberattacks. This is hardly surprising; data-breach headlines have saturated the news. In 2018 alone, we saw large-scale data breaches at an array of high-profile organizations, including Google (52.5 million users), Quora (100 million), and Marriott (half a billion!). In response, European governments felt an urgency to protect their citizens’ data. Furthermore, recent intelligence from multiple governments has confirmed that Russia-backed hackers have used personal data to target certain individuals in efforts to influence political elections, most notably in the hacking of the Democratic National Committee during the 2016 U.S. elections. Heightened cybersecurity is perhaps a government’s only option for protecting the validity of its democracy without directly provoking Russia (or whoever else meddles in elections).

These conversations around our data may seem tedious, but they deserve our attention. Although the data-gathering business models of the internet are well established, the laws that balance these economic forces with human rights are only just catching up.

Diego Encarnacion is a Design Researcher and User Experience Designer at IBM.
