Move fast and break (other people’s) things: The Uber fatality and Facebook data breach

Jaime Rodriguez-Ramos
5 min read · Mar 22, 2018


The Uber autonomous car fatality and the Facebook data breach are signposts of Big Tech hubris. Big Tech companies do not take into account broader society's resources, interests, and rules. They move fast in their race for super-profitable monopolies, even if they break other people's lives or privacy in the process. Big Tech companies should learn from these mistakes and change their attitude if they want to avoid regulation or antitrust action.

At first glance, the Uber autonomous car fatality and the Facebook data breach do not have much in common. On closer examination, however, both are indicative of the hubris and arrogance that make Big Tech companies think they are above working with the broader economy or following regulations. Breaking things is fine as long as you move fast enough to win the race and it is other people who get hurt by the breaking. Let's examine each incident in turn.

Uber autonomous car fatality

March 20th, 2018 may go down in history as the first time an autonomous robot killed a person without human intervention. It was not the Terminator or Agent Smith; the footage looks like that of many other traffic accidents. A badly illuminated road at night, a figure that appears too suddenly for the car to stop, and it is all over in less than a couple of seconds.

First, my condolences to the loved ones of the pedestrian. Traffic accidents are a sad and sudden way to lose someone; a single second can make the difference between life and death. According to the WHO, there are more than a million traffic-related deaths per year, roughly one every 25 seconds, a large share of them in the developing world. Even in the US, traffic injuries rank among the top one or two accidental causes of death for almost every demographic.

The causes of the accident are clear and, from the traditional perspective, seem to exonerate Uber. Looking at the video of the accident, it seems it would have been hard for any human driver to avoid. The safety driver in the Uber saw the pedestrian only about one second before impact and could do nothing. The pedestrian was crossing at night, in the middle of a busy road with no illumination, more than 100 m from the closest pedestrian crosswalk. A human driver would probably have ended up with the same outcome and, based on the police chief's statement, would likely not have been considered at fault.

However, from another perspective, the accident is inexplicable and inexcusable. Given technology's power to act precisely on all available information, it seems negligent to let this type of accident happen. The investigation will ultimately determine whether the systems failed to pick up the signal or there was a deeper failure. It is clear, however, that there are simple ways in which the pedestrian could have been detected. Couldn't Uber be informed of which smartphones are nearby? The pedestrian was most probably carrying a smartphone, and her carrier could have relayed that information to the Uber car, at least making it slow down because of the undetected obstacle. Hasn't a glitch like this happened before to one of the other autonomous car companies? It probably has, but each company is developing the technology in secret to try to outmaneuver the others, so best practices are not shared and lives are put in danger.
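To make the carrier-relay suggestion concrete, here is a minimal, purely hypothetical sketch in Python. Nothing in it reflects how Uber's stack or any carrier integration actually works; the function names, data feed, and thresholds are my own illustrative assumptions. The key design choice is that an unconfirmed external hint can only ever make the car more cautious, never less.

```python
# Hypothetical sketch: treat an externally reported smartphone position
# (e.g., relayed by a carrier) as a low-confidence obstacle hint and slow
# down when the car's own sensors have not confirmed anything nearby.
from dataclasses import dataclass
from math import hypot


@dataclass
class Position:
    x: float  # metres, in the car's local frame
    y: float


def choose_speed(current_speed_mps: float,
                 sensor_tracks: list,
                 phone_hints: list,
                 caution_radius_m: float = 30.0) -> float:
    """Return a target speed. All names and thresholds are illustrative."""
    def near(p: Position) -> bool:
        return hypot(p.x, p.y) < caution_radius_m

    confirmed = any(near(p) for p in sensor_tracks)
    hinted = any(near(p) for p in phone_hints)

    if confirmed:
        return 0.0                           # sensors see something: brake
    if hinted:
        return min(current_speed_mps, 5.0)   # unconfirmed hint: crawl
    return current_speed_mps                 # nothing nearby: keep going


# Example: the sensors see nothing, but a phone is reported ~20 m ahead.
print(choose_speed(17.0, sensor_tracks=[], phone_hints=[Position(20.0, 1.5)]))
```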

Big Tech companies are competing amongst themselves in a race to make autonomous vehicles possible. The goal is laudable, as it could drastically reduce the more than one million deaths a year and the untold suffering that traffic accidents cause through death and injury. However, the race is being run for profits, so there is little collaboration with other players that could help, such as telcos, or with the other participants in the race. Given that human lives and safety are at stake, the race should be run with society in mind. It would take a change of attitude: from Big Techs focused on winning and creating super-profitable monopolies, to Big Techs focused on winning against traffic deaths and drudgery to create a better world. The current attitude will be self-defeating in the long run, while a win-win attitude would be rewarded by regulators and the public at large.

The Facebook data breach

Mark Zuckerberg is famous for having a poster in the Facebook offices that says "Move Fast and Break Things". That is an embodiment of the hacker and agile culture, and it is commendable when you pay for the breaking with your own money. However, when you move fast and break things with your users' data and trust, "Facebook made mistakes" sounds like an understatement. Especially when that data breach was allegedly used to change the result of a US election (admittedly not in a way that Zuckerberg or most Facebook executives would have wanted) and was kept secret, and kept happening, for almost four years.

First, let's understand what the data breach was. Facebook app developers could download and access detailed information about their users and those users' friends. As usual, the app required user permission to access that information. That permission was construed to mean that all the information could be downloaded and mined without limit and outside the context in which it was originally granted. The app at the center of the scandal, a seemingly innocuous personality test called "thisisyourdigitallife" built by a researcher working with Cambridge Analytica, promised to read out your personality if you gave it access to all your data. The data it harvested from users and their friends was then used to fuel Cambridge Analytica's data-mining campaigns.
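To make the mechanism concrete, here is a schematic Python sketch of the data flow described above. The endpoint, version number, and field names are illustrative assumptions, not the exact Facebook API of the era (which differed in detail and has long since been locked down); the point is that one user's consent opened a path to their friends' data as well.

```python
# Schematic sketch of the old-style data flow: a token granted by ONE
# consenting app user is used to pull data about that user AND their friends.
# Endpoint, version, and field names are illustrative, not the real API.
import requests

GRAPH_URL = "https://graph.facebook.com/v1.0"      # placeholder for the old API
ACCESS_TOKEN = "token-granted-by-the-quiz-taker"   # obtained via the app's login flow


def fetch(path: str, **params) -> dict:
    """GET a Graph-style resource with the user's token attached."""
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{GRAPH_URL}/{path}", params=params)
    resp.raise_for_status()
    return resp.json()


# 1. Data the quiz taker knowingly shared with the app.
me = fetch("me", fields="id,name,likes")

# 2. Under the old permission model, the same token could also enumerate the
#    user's friends -- people who never installed the app -- and request
#    profile fields about them. That multiplier is how a few hundred thousand
#    quiz takers could yield data on tens of millions of profiles.
friends = fetch("me/friends", fields="id,name,likes")

print(len(friends.get("data", [])), "friend profiles harvested")
```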

The data breach was really no breach at all, as Facebook willingly handed the information wholesale to its developers. Facebook took no responsibility afterward for what the developers did with the information and did not audit or police it in any way. Cambridge Analytica has come out into the open because of its signature role in the US election, but there are probably hundreds of other developers that were mining personal information from Facebook for less high-profile direct marketing and sales efforts.

Facebook only apologized once the pressure mounted, and its proposed solutions amount to clamping down on developers. Maybe Facebook is not legally at fault and has all the necessary legal permissions from its users. However, the episode highlights that the current system gives firms like Google and Facebook unregulated power over how they handle information. Their business models are predicated on capturing untold amounts of data about their users and then turning around and selling it to advertisers or whoever is willing to pay. Should they be able to accumulate information without regulation or responsibility? Do we want information that can turn an election to be sold indiscriminately and unaccountably to the highest bidder?

Facebook is another example of a Big Tech that needs to mend its ways. It needs to take its users' interests to heart, not just the advertising dollars. Information should only be sold with user consent and with the utmost care. Users should be informed about the economic transactions happening with their data, and maybe get a cut of them. Data portability should be available, ensuring that the data belongs to the user and that the user can migrate to a different service without losing his or her history. Number portability was necessary to create a competitive telecom marketplace, and user data portability will be necessary to limit the internet monopolies.


Jaime Rodriguez-Ramos

Impact of exponential technologies on society and business.