Uber and the Limits of Privacy Law

Chris Parsons
Apr 26, 2017 · 5 min read


Uber: pro e contro della start-up avversata dai tassisti by Automobile Italia (CC BY 2.0) at https://flic.kr/p/SiEv74

When was the last time that you thought long and hard about the information companies are collecting, sharing, and selling about you? Maybe you thought about it after reading that some company had suffered a data breach or questionably used your data, and then put the worries out of your mind.

What you may not know is that most contemporary Western nation-states have established data protection and privacy legislation over the past several decades. A core element of these laws is data access rights: the right for individuals to compel companies to disclose what information the companies have collected, stored, and shared about them.

In Canada, federal commercial privacy legislation lets Canadian citizens and residents request their personal information. They can use an online application to make those requests to telecommunications companies, online dating companies, or fitness wearable companies. Or they can make requests themselves to specific companies on their own.

So, what happens when you make a request to a ride sharing company? A company like Uber?

It might surprise you, but they tend to provide a lot of information about you, fairly quickly, and in surprisingly digestible formats. You can see when you used a ride sharing application to book a ride, the coordinates of the pickup, where you were dropped off, and so forth.
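To make the shape of such a disclosure concrete, the sketch below parses a small, entirely hypothetical trip-history export. The column names and values are invented for illustration; the actual schema of any company's data access response will differ.

```python
import csv
import io

# Hypothetical example: the exact schema of a ride sharing data export
# varies by company. This sketch assumes a simple CSV of trip records
# with timestamps, pickup/drop-off coordinates, and fares.
SAMPLE_EXPORT = """\
requested_at,pickup_lat,pickup_lng,dropoff_lat,dropoff_lng,fare
2017-03-01T08:15:00Z,43.6532,-79.3832,43.6629,-79.3957,9.50
2017-03-04T18:40:00Z,43.6629,-79.3957,43.6532,-79.3832,10.25
"""

def summarize_trips(export_text):
    """Parse a trip-history export and return basic facts about it."""
    trips = list(csv.DictReader(io.StringIO(export_text)))
    return {
        "trip_count": len(trips),
        "total_fare": sum(float(t["fare"]) for t in trips),
        "first_pickup": (float(trips[0]["pickup_lat"]),
                         float(trips[0]["pickup_lng"])),
    }

summary = summarize_trips(SAMPLE_EXPORT)
print(summary)
```

Even this toy record illustrates the point: a handful of rows is enough to reconstruct where someone was picked up, where they went, and when.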

But you don’t necessarily get all of the information that ride sharing companies collect about you. In the case of Uber, the company was recently found to be fingerprinting the phones its application was installed on. There is some reason to believe this was done for anti-fraud purposes but, regardless, collecting that information arguably constitutes collecting personal information. Canadian privacy legislation defines personal information as “information about an identifiable individual,” and decisions by the Commissioner have found that if there is even an instant where machine identifiers are linked with identifiable subscriber data, those machine identifiers also constitute personal information. Given that Uber was collecting the fingerprints while the application was installed, it likely was linking those fingerprints with subscriber data, even if only momentarily before separating the identifiers from the other data.
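For readers unfamiliar with the technique, a device fingerprint is typically built by combining relatively stable hardware and software attributes into a single identifier. The sketch below is a hypothetical illustration only; the attribute names are invented, and it does not reflect Uber's actual implementation.

```python
import hashlib

# Hypothetical illustration of device fingerprinting: hashing a set of
# relatively stable device attributes into one identifier. Real
# implementations use different (and often many more) signals.
def device_fingerprint(attributes):
    """Hash a dict of device attributes into a stable identifier."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

device = {
    "model": "iPhone 6s",
    "os_version": "10.3.1",
    "screen": "750x1334",
    "locale": "en_CA",
}

fp = device_fingerprint(device)
# The same attributes always yield the same fingerprint, which is why
# a fingerprinted device can remain identifiable even after the app is
# deleted and reinstalled, or the account data is nominally separated.
assert fp == device_fingerprint(dict(device))
print(fp[:16])
```

The privacy concern follows directly from the determinism of the hash: if the fingerprint is ever linked to an identifiable account, anything later tied to that fingerprint is, in effect, tied to the person.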

So if Uber had a legal duty to inform individuals about the personal information that it collected, and failed to do so, what is the recourse? Either the Federal Office of the Privacy Commissioner of Canada could launch an investigation or someone who requested their personal information from Uber could file a formal complaint with the Office. That complaint would, pretty simply, argue that Uber had failed to meet its legal obligations by not disclosing the tracking information.

‘Enforcement’ tends to be limited to moral suasion when applied by the federal privacy commissioner.

But even if Uber were found to have violated Canadian law, there isn’t much recourse for affected individuals. The Canadian federal commissioner cannot levy fines. And Uber might decide that it doesn’t want to implement any recommendations that the Privacy Commissioner provided: in Canada, to enforce an order, a company has to be taken to court. Even when companies like Facebook have received recommendations, they have selectively implemented them and ignored those that would impact their business model. So ‘enforcement’ tends to be limited to moral suasion when applied by the federal privacy commissioner.[1]

But the limits of enforcement strike at only part of the problem. What is worse is that we only know about Uber’s deceptive practices because of journalism; the company was not forthcoming and did not proactively disclose this information well in advance of fingerprinting devices. Other companies can read that signal and conclude that they can probably engage in questionable, or even unlawful, practices with a low expectation of being caught or punished.

Fong argues that application stores — such as Google’s and Apple’s respective App stores — could include comprehensive data access rights as part of the contracts that app developers agree to with the app store owners.

In a recent article published by a summer fellow for the Citizen Lab, Adrian Fong argued that enforcing data protection and privacy laws against individual private companies is likely untenable. Too few companies will be able to figure out how to handle data access requests, fewer will be inclined to respond to them, and fewer still will understand whether they are obligated to respond to such requests in the first place. Instead, Fong argues that application stores, such as Google’s and Apple’s respective app stores, could include comprehensive data access rights in the contracts that app developers agree to with the app store owners. Failure to comply with the data access rights aspect of a contract could lead to an app being removed from the store. Were Google and Apple to seriously implement such a practice, their ability to remove bad actors, such as Uber, from app stores could lead to a modification of business practices.

Ultimately, however, I’m not certain that the ‘solution’ to Uber is better privacy law. It’s probably not even just better regulation. Rather, ‘solving’ for companies like Uber demands changing how engineers and businesspeople are educated and trained, and modifying the grounds on which they’re rewarded and punished for their actions. Greater emphasis on ethical practices and the politics of code needs to be ingrained in their respective curricula, just as arts and humanities students should be exposed in more depth to the hard sciences. And engineers, generally, need to learn that they’re not just solving hard problems such as preventing fraudulent rides: they’re also embedding power structures in the code they develop, and those structures can’t simply run roughshod over the laws that democratic publics have established to govern private behaviours. Or, at least, if companies run afoul of the law, be it national data protection law or contract law, there should be serious consequences. Doing otherwise will simply incentivize companies to act unethically on the basis that there are few, or no, consequences for behaving like a bad actor.

Footnotes:

[1]: Some of Canada’s provincial commissioners do have order making powers.
