Why Google Was Hit With a 57M GDPR Fine and What It Means For the Future of Data Privacy
On January 21st, Google was fined €50 million (about $57 million) for breaching Europe’s stringent new data privacy regulation, the GDPR. The French data protection agency, the Commission Nationale de l’Informatique et des Libertés (CNIL), said in a statement that the company was fined over deficiencies in the way user data was processed when serving personalized ads.
The investigation was first opened in May of last year, when CNIL received two complaints from the associations NOYB and La Quadrature du Net (LQDN). Both associations were mandated by thousands of their members to file complaints against the US search giant for processing their personal data without a valid legal basis. On further investigation, the French regulator also found Google in breach of additional GDPR rules, including a lack of transparency and inadequate information being made available to users.
Why This is Good News For the Future of Privacy
This is the first time a substantial fine has been levied under GDPR, which is a huge win for data privacy advocacy. GDPR is a reminder to companies that user data should ultimately be owned by the user and it is the responsibility of every organization to protect, secure, and respect that information. Privacy was once an ethical decision.
With the introduction of GDPR, the enforcement of privacy policies is about more than integrity, confidentiality, and respect — it’s a pragmatic business decision. Failure has a substantial and measurable cost. Data ownership has largely shaped the digital economy as we know it.
“Personal data is the new oil of the internet and the new currency of the digital world.” — Meglena Kuneva, European Consumer Commissioner, 2009.
From browsing habits to data generated by connected devices in the Internet of Things, data is helping businesses understand their customers and, most importantly, what they want. However, it is crucial that this data is treated in the right way, and this is where GDPR comes in. Designed with the interests of the general public at heart, the regulation helps to build confidence and trust between users and the organizations they share their data with.
When Personalized Ads Get Too Personal
As Google tracks user behavior, it gathers information that is processed to target personalized ads. According to CNIL, highly sensitive and intimate data was processed for targeting purposes, including data related to substance abuse, male impotence, sexually transmitted diseases, extreme political views, and mental health.
Let’s look at this a different way. While it is acceptable for a public library to label a book section with the words “mental health”, it would not be acceptable for the library to label a person who visits that section with those words. In the digital world, however, these labels track what you browse, read, listen to, or watch, and they can stay with you for a long time.
It’s important to state that harnessing user data to create personalized ads is not a bad practice in itself. Indeed, some personal data is required to ensure that ads stay relevant to the users they are targeting, and when processed in a compliant and ethical way, this method of targeted marketing works very well for consumers and organizations alike. Unfortunately, in the case of Google, personal data was used without valid consent and in a way that went far beyond the information required to create relevant ads.
What Does No Valid Consent Mean?
Under GDPR rules, valid consent means that a user actively and unambiguously agrees to have their data processed for each specific purpose, such as personalized ads or speech recognition. The investigation found that when users created a new Google account, the option to consent to personalized ads was pre-ticked, a clear violation of the GDPR’s consent rules. The regulator reiterated that it was Google’s “utmost responsibility to comply with the obligations on the matter.” Google is currently preparing an appeal against the ruling.
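As a rough illustration, a compliant sign-up flow never pre-ticks consent: the stored default is “not granted”, and a grant is recorded only from an explicit user action. A minimal Python sketch (the `ConsentRecord` type and purpose names are hypothetical, not Google’s actual implementation):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical purpose-specific consent record."""
    user_id: str
    purpose: str                       # e.g. "personalized_ads"
    granted: bool = False              # default is always "no consent"
    granted_at: Optional[datetime] = None

def record_consent(user_id: str, purpose: str,
                   box_ticked_by_user: bool) -> ConsentRecord:
    """Record consent only from an explicit user action, never a default."""
    return ConsentRecord(
        user_id=user_id,
        purpose=purpose,
        granted=box_ticked_by_user,
        granted_at=datetime.now(timezone.utc) if box_ticked_by_user else None,
    )

# A freshly created account has no consent for any purpose:
fresh = ConsentRecord(user_id="u123", purpose="personalized_ads")
assert fresh.granted is False
```

The key design choice is that consent is stored per purpose and per user, so it can be audited and withdrawn just as specifically as it was given.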
GDPR: It’s Not Just a European Thing
It’s important to remember that GDPR applies not only to companies located within the European Union (EU), but also to any organization that offers goods or services to, or monitors the behavior of, EU data subjects. Any business, wherever it is located in the world, is regulated by GDPR if it processes the personal data of people in the EU, and all penalties and fines apply, too.
The rest of the world is starting to take note by putting procedures in place to ensure GDPR compliance. In July 2018, the EU successfully concluded talks with Japan on reciprocal adequacy. This means that the two territories have agreed to recognize and trust each other’s data protection systems and allow data to flow safely between them. This mutual adequacy agreement will create the largest area of safe data transfers in the world.
What This Means For The Future of Privacy
GDPR can dramatically impact companies of all shapes and sizes. While you may not be subject to multi-million dollar fines, the financial and reputational damage of a privacy scandal or data breach can be difficult to recover from. As the world becomes increasingly driven by data, it is essential that organizations collect, store and process user data in a way that not only aligns with the law, but also with customer expectations.
To avoid the same fate as Google, organizations must ensure that user data is carefully collected, well organized, and securely stored, and that it is processed only when consent has been explicitly given in line with GDPR’s consent rules. Full details can be found in the official text of the regulation.
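In practice, that last requirement can be enforced with a simple gate in front of every processing step: no explicit, purpose-specific grant means no processing. A hedged sketch, with illustrative purpose names:

```python
def can_process(consents: dict, purpose: str) -> bool:
    """Allow processing only with an explicit, purpose-specific grant.

    A missing record is treated as "no", never as implied consent.
    """
    return consents.get(purpose, False)

# Consent granted explicitly for one purpose does not carry over to others:
user_consents = {"personalized_ads": True}
assert can_process(user_consents, "personalized_ads") is True
assert can_process(user_consents, "speech_recognition") is False
```

Routing every data-processing call through a gate like this keeps the consent check in one place, which makes compliance easier to audit.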
We at Passbase are building the tools to help businesses complete strong user identification, without needing to worry about storing, processing, or protecting sensitive personal identifiable information. We are strong believers in privacy-by-design and hope the tools we provide can empower businesses to create a more privacy-centric future and give people back control of their digital identities.