The Right to Privacy and Data Policies
Information, in the wrong hands, can be damaging to one’s livelihood. But in an age where personal data is being gathered at unprecedented rates, what can we really do?
Many data protection laws are underpinned by human rights law — specifically, Article 8 of the European Convention on Human Rights. Within the European Union, this is doubly reinforced by the EU Charter of Fundamental Rights, whose Article 8 explicitly protects personal data. The Europe-wide 2018 General Data Protection Regulation (GDPR) — which replaced the 1995 Data Protection Directive and, in the UK, is implemented through the Data Protection Act 2018, superseding the 1998 Act — upholds these rights regarding the access, use, and disposal of personal data. Thanks to these strong legal frameworks, Europe currently leads the world in data protection.
One of these rights is the right to privacy; while closely related to data protection rights, the two are distinct. Whenever personal data is used or processed, the right to personal data protection — which is broader than the right to respect for private life — comes into play: any processing of personal data is subject to protection, irrespective of how little it may impact one’s privacy. The right to privacy, by contrast, covers any situation where the private life of an individual has been compromised (e.g., intimate, sensitive, or confidential information; information that could prejudice the public’s perception of them; or aspects of professional life and public behaviour). This right is vital because the use and storage of personal data is meant to serve the people it is obtained from, not the entities using or storing it.
In European law, this is referred to as “the right to respect for private life”. It was first recognised as a fundamental human right under the 1948 Universal Declaration of Human Rights, and is today upheld by the European Court of Human Rights. Under Article 8 of the European Convention on Human Rights (ECHR), people have the right to respect for their private and family life, home, and correspondence. Interference with this right is generally prohibited, except where the interference is in accordance with a nation’s law, is necessary in a democratic society, and pursues a legitimate public interest. (You may recall our previous article about the global use of COVID-19 track-and-trace apps, and the concerns surrounding breached privacy laws.)
In the past, human rights law in Europe has protected many individuals from having their private data mishandled or misused. Because the right to privacy is a fundamental, protected right, the European Court of Human Rights has been able to intervene when it is violated. Examples include a case in which a woman’s medical data was collected under ambiguous laws; rulings requiring safeguards against any nation’s abuse of surveillance powers while recognising the need to counter terrorism; judgments holding nations accountable for their use of widespread mobile phone interception devices (including prohibiting the phone-tapping of lawyers and the recording of prison visiting-room meetings); and reviews of cases in which employers monitored their employees’ computer use.
Unlike the UK and many other European nations, where citizens are guaranteed the right to privacy as a fundamental human right, countries like the United States do not guarantee such rights by law. (Information on data protection at the end of the post-Brexit transition period may be located here.) As mentioned earlier, this lawful right to privacy has implications for strict data protection; when the right to privacy is not enshrined in — and upheld by — the law, many people are put at risk for a variety of reasons. With the exception of healthcare and financial data, the United States has no federal regulation of how data can be obtained and used; every state and private corporation is self-policing and can, to an extent, set its own privacy policies. In fact, commercial and surveillance practices have increasingly encroached on the privacy of individuals belonging to marginalised communities — including Black, Asian, Indigenous, and Minority Ethnic people, LGBTQ+ individuals, persons with disabilities, immigrants, and refugees — in both the physical and digital spheres. The collection and use of data on marginalised groups, often gathered through ads and social media platforms, has led to increased surveillance of these communities, further enabling racial profiling, discrimination, and violence against them.
In 2016, ProPublica found that Facebook’s platform had allowed advertisers to exclude Black, Hispanic, and other groups (referred to as “ethnic affinities”) from seeing housing ads. Two years later, ProPublica again found that Facebook allowed employers to selectively advertise job vacancies to users of specific genders. Alarmingly, this is not uncommon. Gender- and race-based discrimination occurs in the targeting of housing ads just as it does in job listings, even when advertisers do not explicitly target certain demographics: they can instead discriminate using parameters highly correlated with race and ethnicity, such as inferred interests or geographic location. Violent hate groups also use online platforms to target specific racial and religious communities; violent hate speech on Twitter and similar platforms has been shown to predict real-life racial and religious violence.
With online platforms like Facebook increasingly seeking to collect and monetise people’s personal data, other tech giants have spoken out against such practices. Microsoft CEO Satya Nadella and Apple CEO Tim Cook have praised GDPR as a great start to ensuring data protection and human privacy. They argue that more companies — and countries like the United States — need to follow in Europe’s footsteps and uphold data policies that protect the most vulnerable populations. Tim Cook has even gone so far as to state that personal information is being “weaponized against us with military efficiency”.
Where does this leave us, then? We can acknowledge that the right to privacy is a fundamental right, and that it should be a necessary component of any data policy. While there are strict laws governing the collection and use of our data, the law lags behind technology, and it is our collective responsibility to use data responsibly. Ensuring Privacy by Design (or Data Protection by Design) when deciding how to collect, process, and store customer — and, indeed, employee — data is a key way organisations can meet this responsibility and be accountable for their actions.
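As a loose illustration (not a legal standard or a complete compliance solution), two common Privacy-by-Design practices, pseudonymisation and data minimisation, can be sketched in a few lines of Python. All names, fields, and the key-handling scheme here are hypothetical:

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this would live in a key vault,
# separate from the datastore, so pseudonyms cannot be reversed in place.
SECRET_KEY = b"rotate-me-regularly"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email) with a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def minimise(record: dict, needed_fields: set) -> dict:
    """Data minimisation: keep only the fields required for the stated purpose."""
    return {k: v for k, v in record.items() if k in needed_fields}

# Incoming raw record (illustrative only).
raw = {
    "email": "jane@example.com",
    "postcode": "SW1A 1AA",
    "browsing_history": ["..."],  # not needed for billing, so it is dropped
    "amount_due": 42.50,
}

# Store only what billing needs, keyed by a pseudonym instead of the email.
stored = minimise(raw, {"amount_due"})
stored["customer_ref"] = pseudonymise(raw["email"])
```

The design point is that the stored record contains no direct identifiers and no surplus data: re-identification requires the separately held key, and anything not needed for the stated purpose never reaches storage at all.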