“This case forces Uber to do a lot more than ever before”, says data protection expert René Mahieu

Charles Foucault-Dumas
Published in PersonalData.IO · 11 min read · Mar 30, 2021

On 11 March, the Amsterdam court* delivered its verdict in the trial between four VTC drivers and the Uber platform. Banned from the application for “fraud”, the drivers contested this charge and asked the court to annul the deactivation of their accounts, which they said had been carried out by an algorithm and not by human beings. While this ruling is a half-victory (and therefore a half-defeat) for both parties, it provided a first real test of Article 22 of the EU’s General Data Protection Regulation (GDPR), which is supposed to protect individuals from unfair automated decisions. An economist and philosopher, René Mahieu is writing his doctoral thesis on these issues at the Research Group on Law, Science, Technology and Society of the Vrije Universiteit Brussel (VUB). He explained to us the consequences of this court decision for platform workers and beyond.

René Mahieu, doctoral researcher at the Research Group on Law, Science, Technology and Society of the Vrije Universiteit Brussel (VUB)

*The Dutch court has jurisdiction because Uber has its European headquarters in the Netherlands.

At its trial in Amsterdam earlier this month, Uber wanted to establish that the workers were abusing the GDPR. On the contrary, the court found that “drivers taking collective action to seek access to their data is not an abuse of data protection rights.” What could be the consequences of such a decision for the relationship between platform workers and platforms?

There is a tradition of collective action in the Netherlands. Courts generally reject companies’ argument that a collective request is an abuse of rights, and accept the requests. There was one particularly emblematic case during the sub-prime crisis: with the help of a popular consumer protection TV show, thousands of clients of the bank Dexia asked for transcripts of recorded conversations with the financial advisers who had sold them risky products. The Dutch Supreme Court decided that access had to be provided, and the clients used the files they obtained in a class action lawsuit against the bank.

“Standing collectively for your rights works within the GDPR.”

This Uber case is similar, if not identical. With this verdict, the court states that standing collectively for your rights works within the GDPR. The consequence of this trial will be twofold: we will see many more of these collective actions, and companies now know that the abuse-of-rights argument does not work.

So why does the media coverage of this trial (like in TechCrunch) point to a good result for Uber?

First, the decision seems good for Uber because the Amsterdam court put a high (I would say unreasonably high) burden on the drivers to specify exactly which data they want to receive from Uber. As Jill Toh, from the University of Amsterdam, says in the TechCrunch article you refer to, it is difficult for people who request access to be very specific, because they do not know exactly which data is being processed. That is why they often refer to the quite general categories of data mentioned in the company’s privacy policy. Yet it is exactly the purpose of the transparency principle in general, and of the right of access in particular, that people should be able to get a far more precise understanding of which data is being processed and for what precise purposes.

But not all courts are as restrictive as the Amsterdam court was here. The Vienna High Court, for example, recently ruled that Facebook may not restrict access to a selection of personal data that it deems of interest to the user: Facebook must provide all personal data if the user requests it.

The other apparently positive result for Uber concerns automated decision-making. In this trial, the four drivers complained that their accounts had been suspended suddenly and automatically by Uber’s algorithm. Uber told them it was because they had committed fraud. According to Article 15.1(h) of the EU General Data Protection Regulation (GDPR), you have a right to an explanation when a company makes an automated decision about you. “Why am I not allowed to work anymore?”, the drivers asked Uber via the court.

“You have a right to an explanation when a company makes an automated decision about you.”

Article 22.1 of the GDPR states that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” Uber claimed that when the system detects something that may be fraud, the driver is automatically suspended, but the platform’s lawyer added that if the driver calls Uber, Uber will reactivate the account. So is this a “legal effect” or a “similarly significant” effect? The court thought it was not.

If there is a second instance of suspected fraud, however, Uber asserted that two people at the company decide whether the driver will be suspended. The company therefore argued that the suspension is not automatic, since it is decided by human beings. The court accepted that argument, because the account deactivation is then not “based solely on automated processing”, as Article 22 of the GDPR puts it.

It makes sense. But do we know if this human procedure really exists? In a court of law, is it up to the drivers to prove that it is false or up to Uber to prove that it is true?

Neither. If Uber says it works like this, the drivers have to argue that it is plausible that it is false. And the drivers didn’t; I don’t know why. Since they didn’t, the court considered that it works as Uber said. You know, Uber may really do as it described. In that case, it’s fine: they don’t robo-fire people.

Anyway, this aspect of the case offered two positive elements to the drivers. First, Uber does have to tell a driver why he or she was fired. Paragraph 4.29 of the verdict stipulates that “Uber must grant [driver 2] and [driver 4] access to their personal data by virtue of Article 15 of the GDPR, provided that these data formed the basis for the decision to deactivate their accounts, in such a way that they are able to verify the correctness and lawfulness of the processing of their personal data.” With this ruling, Uber can no longer say “we didn’t know we had to do it”. They will have to justify the deactivation if the driver asks for it.

“A driver’s interest in knowing why he or she was fired is more important than Uber’s interest in protecting its secrecy.”

Secondly, when Uber said that giving drivers these specifics was a threat to its business, the court rejected the objection. Uber argued that explaining to a driver why he was fired would give insights into the parameters of how its fraud detection system works, which would allow anyone to circumvent it. The court judged that Uber did not explain in enough detail why and how this could happen, and dismissed the argument. This means that a driver’s interest in knowing why he or she was fired outweighs Uber’s interest in protecting its secrecy.

I’m not surprised. Companies often lose with this kind of argument. Courts generally consider the rights of people to be more important than vaguely defined arguments that companies often bring in favour of protecting their interests.

If it’s up to the drivers to show that it’s plausible that it is false, how can they do that without access to the data?

This is a very difficult point, in this case but in many other cases as well. Here, Uber gave a very detailed description of its procedure. It had to. In doing so, it offered an opportunity to drivers: if Uber has these two people making the deactivation decision, it probably keeps a human resources file or record about its drivers. And such a file is personal data. So the drivers can say “give me the opinion of those two people about me”, since Article 15 of the GDPR gives anyone access to his or her personal data.

Did the drivers ask for it?

It seems they didn’t submit that specific request, maybe because they didn’t know these two people were involved. Now that they know, they might ask for it, either as an appeal or as a new access request.

Would the drivers obtain that, knowing that the ruling rejected their access to manual notes, tags and reports about them? And isn’t this decision surprising when, since the end of 2017, the Nowak ruling has allowed a student to see the comments written about him or her by the marker of an exam paper? Are the situations of a VTC driver and a student not similar in this respect under the law?

It is indeed surprising that the court completely rejected access to “internal notes” that are part of the drivers’ profile, as well as to tags such as “inappropriate behavior”. On this matter, two rulings by the European Court of Justice (ECJ), the highest court in Europe, contradict each other: in the 2014 YS and Others case, the ECJ applied a rather restrictive scope to the right of access, while in Nowak, which you cite, the scope was much wider. The Dutch court in this Uber case leans much more on the first ECJ decision than on the second. If we follow the latest ECJ decision, which would be appropriate in this case, this kind of internal note should probably be considered personal data in most cases.

A specific problem in Dutch case law should be highlighted here: in the famous Dexia case discussed earlier, the Dutch Supreme Court ruled that access to internal notes containing the personal thoughts of employees does not have to be provided if they are not part of a filing system (paragraph 3.14 of the decision). What I think the Court meant is that notes taken by employees in a notebook during a conversation with a colleague, which are not subsequently added to the personal file of a client (or, in this case, a driver), do not fall under the right of access.

But if these notes are entered directly into a digital system, and especially if they are part of the “file” of the person requesting access, they should fall under the right of access. Sadly, several courts in the Netherlands have misread the Supreme Court’s decision to mean that internal notes as a general category do not fall under the right of access. Luckily, the Council of State (the court of last instance in public law) recently ruled in very clear terms that this common interpretation is wrong (paragraph 7.1 of the ruling), and that employees’ personal notes are not categorically exempt from access.

Uber cites customer privacy as a reason for not providing more data to drivers. How is this related?

Article 15.4 of the GDPR says that your right to access your own data “shall not adversely affect the rights and freedoms of others.” As a comment from a passenger is also the passenger’s personal data, Uber said that giving it to the driver is detrimental for the privacy of its clients.

Are there technical and/or legal means to satisfy everyone?

It is actually a good argument. But Uber can give a rating to the driver without disclosing the identity of the passenger or the exact date, time and location at which it was given. In this case, the court ordered Uber to share the ratings with the drivers in anonymised form. Uber will likely do so, as it has already done in the past for some drivers.
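To illustrate the kind of data minimisation this order implies, here is a hypothetical sketch: a rating record is shared after dropping the passenger’s identifier and coarsening the timestamp to a month. The field names and record shape are invented for the example, not Uber’s actual data model.

```python
from datetime import datetime

def anonymise_rating(record):
    """Return a share-safe copy of a passenger rating: the passenger's
    identity is dropped and the timestamp is coarsened to a month, so
    the driver cannot link the rating back to one specific trip."""
    given_at = datetime.fromisoformat(record["given_at"])
    return {
        "stars": record["stars"],
        "period": given_at.strftime("%Y-%m"),  # month only, no exact time
    }

# Example with an invented record
rating = {"passenger_id": "p-48151", "stars": 2, "given_at": "2021-03-11T14:32:00"}
print(anonymise_rating(rating))  # {'stars': 2, 'period': '2021-03'}
```

As the next exchange notes, this only removes direct identifiers; the free-text comment itself can still reveal who wrote it, which is why a balancing test remains necessary.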

The content of the comment sometimes gives a clear indication of the author…

True, it’s a tricky question. There can always be a case that creates debate. If it happens, Uber has to carry out a balancing test between the driver’s interest in getting access and the passenger’s interest in privacy. If Uber decides not to share a comment with a driver, the driver can ask a court to arbitrate.

Your colleague Mireille Hildebrandt, a lawyer and philosopher, commented on the ruling on Twitter that “the Court does not understand the crucial importance of automated profiling based on driver behaviours, and does not understand core transparency requirements in the GDPR.” Our society is in trouble if the courts do not understand the world we live in. How can this be solved?

In general, I agree that judges often do not know a lot about the platform economy, the digital economy, digital rights, artificial intelligence, automated decision-making… These are very complex topics. And that is also why we have national data protection authorities: the GDPR says every European country must have such an authority, precisely because the regulators knew these topics are extremely complex.

Those data protection authorities are supposed to have highly specialised staff to deal effectively with these complex issues. If you have a problem related to your personal data, you can go to court, but you can also go to your national data protection authority, which has the power to impose fines. The problem (and I wrote an article about it) is that in many countries, including the Netherlands, data protection authorities are completely overworked. They don’t have enough money. They don’t have enough people. Giving them the means to fulfil their missions would be a great step towards solving the issue you mention.

What are the natural next steps for local drivers, for example the 3,000 drivers working for Uber here in Switzerland?

This decision of the Amsterdam District Court gives a lot of hope to Uber drivers. I think it is logical that they continue to use their rights collectively. But the drivers need to find help: legal support to analyse this ruling in detail.

“Drivers should ask Uber what in its algorithm explains why, at the beginning, they get plenty of rides and then, suddenly, far fewer.”

Now they know that if they get blocked, they have the right to know why. If they are allowed to request access to the reviews, they could investigate whether racism explains bad ratings. This could then be used to push Uber to do something, like no longer considering these ratings relevant.

Some drivers told me that when you start as an Uber driver, you get a lot of rides, but then, after a month or so, all of a sudden you get far fewer. Drivers think this is an automated decision. And I think it does have a “legal effect” or “similarly significantly affects” them, and so falls under Article 22 of the GDPR. That is something Uber should be asked about. Drivers should ask Uber what in its algorithm explains why, at the beginning, they get plenty of rides and then, suddenly, far fewer.

Beyond Uber, do you think this case will inspire other platform workers to get together to defend their rights?

Yes, I do. We discussed how this case is both happy and sad for the drivers. But even with this ambivalent result, it forces Uber to do a lot more than ever before. Uber is big and has a lot of money to spend. Drivers, on their own, are not so strong. That’s why I think it makes a lot of sense to collectivise, to get together and make your argument as a group of workers, as a group of employees. This goes for Uber drivers and platform workers, but also for consumers.

“This new Uber trial shows that the collective use of data protection law works and changes the power dynamics.”

You know, individuals are normally in a weak position. They are not specialists; they are just consumers or workers facing a multibillion-dollar company that operates globally… Naturally, they feel they have no chance of winning. But by using the rights in data protection law, and by doing so collectively, as here with Uber or in the Dexia case I told you about earlier, they really do get empowered; they get more strength than they would have had otherwise. What these rulings show is that this strategy works. It changes the power dynamics.

This interview was conducted by Charles Foucault-Dumas, an author and journalist specialising in the digital world.

PersonalData.IO is a nonprofit focused on making data protection rights individually actionable and collectively useful. If you are an Uber driver who would like to get a copy of their data, please follow our guide.

If you would like to regain control over data from other sources then please join the conversation in the PersonalData.io forum or follow us on Twitter.

Read related stories on the PersonalData.io blog.


Writing your story 🖋 #Biographies #Fiction #Communication Ex-Director of the Empowerment Foundation | Co-Creator of L’Usine Digitale ℹ️ www.foucault-dumas.ch