A Minor Accident in Tempe is Uber’s Biggest Missed Opportunity to Date
In business schools around the country — at least in the ones that follow the case method — the Johnson & Johnson (J&J) Tylenol Poisoning case is a lesson in how transparency, self-awareness, and creative management can turn a crisis into an opportunity to lead. By failing to grasp that case’s parallels to its recent Tempe accident, Uber missed a golden opportunity to openly share data about the collision, to right its ethically listing ship, and to seize the lead in autonomous car safety from its competitors.
Johnson & Johnson’s Safety Crisis
If you are not familiar with the history: in September 1982, someone tampered with over-the-counter Tylenol bottles by putting cyanide in the two-piece capsules, resealing the packaging, and placing the bottles back on store shelves in the Chicago area. Seven people died from the poisoning. Chicago police drove through neighborhoods using loudspeakers to warn citizens not to take Tylenol, and national news anchors urged viewers across the country to do the same. At the time, J&J held a 35% share of the market, with an estimated 31 million bottles in circulation in American homes, valued at over $100 million. Almost overnight, that share dropped to 8%, and J&J faced a crisis of confidence in one of its signature products. The crisis went to the heart of the company’s medicinal mission: people take medicine for their health, yet J&J’s signature pharmaceutical product and brand were becoming associated with poison and death on the nightly news.
Personal Responsibility, Crisis Response, & Active Leadership
Johnson & Johnson’s response to the crisis is why the event is taught in business schools. Within a week of the first death, J&J recalled all Tylenol products nationwide. It established relationships with law enforcement and regulators — including the Chicago Police Department, the FBI, and the Food & Drug Administration — to help localize the source, rule out a factory cause, prevent further tampering, and search for the culprit. J&J provided lot numbers, access, and interviews in the search for the source. The poisoning was quickly determined not to be J&J’s fault, but people had stopped taking Tylenol because they were naturally worried about their safety. Customers needed a reason to return to the brand other than “someone else was to blame.”
For customers who had already purchased Tylenol, the company offered to exchange capsules for certified-safe solid tablets. In less than two months, J&J reissued Tylenol in triple-sealed tamper-proof packaging — a first for over-the-counter medications — assuring customers that its product security was guaranteed from factory to medicine cabinet. J&J made sure the public knew that this type of poisoning could never again happen to Tylenol, or any J&J product. Tampering eventually became a federal crime, the FDA passed more stringent safety regulations, and capsules were phased out by all drug makers in favor of sealed caplets and solid tablets.
Johnson & Johnson received wide praise in the media for its response, and its market share rebounded within a year. J&J did what it could to address safety vulnerabilities in its supply chain, worked openly with regulators and law enforcement, and reintroduced its product in a way that not only improved Tylenol’s safety but highlighted safety as a selling point. The business managers of tomorrow are supposed to keep these lessons close to heart when they graduate.
Uber’s Wayward Moral Compass
Had Uber CEO Travis Kalanick gone to business school, it is not clear that the lessons of the Tylenol case would have sunk in. Under his leadership, Uber has made numerous high-profile mistakes, including ‘greyballing’ (concerted evasion of regulators), sexual harassment & discrimination, denigrating Uber drivers on video, misleading the public about red-light running in San Francisco, openly defying municipal and state licensors, insensitively setting surge pricing during a social protest, and even psychologically manipulating its drivers — to name a few. Calling these ‘business mistakes’ is generous; most are character-revealing, deliberate choices made in a hard-charging existential race for a trillion-dollar market.
A business education also would not have solved Uber’s bro culture; if anything, it might have accentuated it. The roots of unethical behavior are often established well before higher education, but chasing venture capital D.O.U.S.s (Dollars Of Unusual Size) can steer even the most ethically inclined away from the right thing to do. Treating women as less than equal will remain Uber’s worst character failure to date, and may ultimately be its undoing. (See the op-ed in Sunday’s NYTimes on jerks and the startups they ruin.) But character and culture issues aside, Uber’s response to the Tempe crash raises questions about whether it even understands basic business strategy, and the safety mission central to its new model.
The press now portrays Kalanick as if he’s riding his $69B unicorn shirtless, like some kind of Silicon Valley Vladimir Putin. But he doesn’t run Uber by himself: Uber’s board of directors comprises bright, successful, media-savvy people, and its officers hold their share of business degrees. Uber employs top in-house legal counsel who, if consulted, may be either ineffective at dissuading borderline behavior or actively enabling and defending it. Even as the hits keep coming (this week, psychologically manipulating drivers), Uber has widely proclaimed its intent to curb its anti-social instincts and change its “hard-charging attitude” going forward. So how did Uber collectively miss its biggest business opportunity yet in last week’s accident in Tempe, Arizona?
A Tempest in Tempe
On March 24th, one of Uber’s highly automated Volvos collided with another car in an intersection in Tempe, Arizona, causing the Uber to hit a light standard, flip onto its side, and strike other vehicles farther down the road. Thankfully, injuries to all parties were described as minor, but for Uber, the viral optics of its tech-topped Volvo on its side were disconcerting, to say the least.
Before the day was out, Tempe police quickly and publicly laid legal blame on the human driver turning left across three lanes (actually four, if you count the left-turn lane) of oncoming traffic. She even received a citation for the vehicle code infraction. Because the Uber had the legal right of way and did not exceed the posted speed limit, the actions of the Uber driver — and even the identity of that driver (human or robot) — became irrelevant for legal purposes. As far as the Tempe police were concerned, the case was closed. The press unanimously placed fault on the turning human driver; Wired magazine even said the incident was proof that we need autonomous vehicles for safety.
What was Uber’s response? Stay mum. On the night of the collision, it confirmed only that there were no passengers in the back seat of the car and that it had removed all of its self-driving vehicles from the road to conduct an in-house investigation. That was all. Some questioned why Uber pulled all of its self-driving Volvos from California, Arizona, and Pennsylvania if it bore no fault for the collision. Either it believed it had done nothing wrong, or it knew it had a safety issue to address.
The police and press immediately judged it to be the former, and Uber appeared not to want anyone to think otherwise. Uber returned its San Francisco cars to the road 48 hours later, and its other cars shortly thereafter, making no statement about whether it had identified or corrected a safety issue in its programming.
This was the worst possible response Uber could have taken.
Questions Raised, Uber’s Lack of Transparency
A week after the accident, the final police report was released, and questions about the proximate causes of the collision — and, consequently, about Uber’s business decisions — are growing.
- Did the Uber accelerate, as an independent witness said, to beat the changing yellow light?
- Why did the Uber enter an intersection blindly at 38 miles per hour in Lane 3 when it knew traffic was completely stopped in Lanes 1 & 2?
- Was the Uber actually in autonomous mode?
- What could the Uber car and driver actually see, and how much reaction time did he have? Was there a handover delay?
The unbiased, objective answers to every one of these questions — and others — are within the care, custody, and control of Uber alone. Had the accident occurred in California, Uber would have had to disclose the accident data. In Arizona? “What I’ve shared to date is all we are sharing on our findings,” according to an Uber spokesperson.
Now it appears as if Uber has something to hide; namely, potential shared fault for a minor car accident — at worst. So why hide the ball? The coverup is always worse than the crime. Why make this minor Tempe accident the latest in a long line of uncooperative, anti-social actions, when Uber could apply the lessons of the J&J Tylenol Poisoning case and take the lead from 35 other companies on transparency, accountability, government cooperation, and autonomous vehicle safety?
Uber’s Strategic Failure: It Is No Longer Just a Software Company
Uber’s CEO correctly recognized that autonomy is existential for Uber. (Any company that perfects autonomy can put Uber out of business by undercutting its labor cost.) Unfortunately, Uber still thinks of itself as a software company, a platform for transportation providers. What Kalanick and Uber fail to understand is that autonomy puts them squarely in the health & safety business, just like Johnson & Johnson — and Uber must act accordingly.
People take Tylenol for a headache and minor aches and pains. But if doing so risks death by poisoning, there are plenty of other over-the-counter medications that solve the same problem, including ibuprofen and aspirin. When J&J was in danger of losing the public’s trust through no fault of its own, it gave the public a reason to return to Tylenol: it decided to make Tylenol the safest.
Similarly, people take Ubers for minor reasons: to ease the headache of getting from A to B conveniently. But if Uber AVs might produce a fatal crash whenever a human driver or pedestrian does something technically illegal (which happens constantly), people will take other modes, including human-driven Lyfts or public transportation. In survey after survey, people are wary of self-driving cars precisely because of concerns about safety. Unfortunately, neither Uber, nor Tesla, nor any other AV manufacturer has attempted to meet — let alone exceed — its customers’ expectations and concerns about safety.
When companies like Uber and Tesla hide their accident data from their own customers and from the people with whom they collide, they demonstrate that they don’t actually believe in safety first; they believe in development first — at all costs, including safety. Every company that has thus far had the opportunity to seize the lead on safety has passed: Tesla blamed Josh Brown for his own death while he relied on Autopilot, and Tesla patted itself on the back when NHTSA did too. Perhaps Uber thinks Tesla came out ahead on that accident, because Uber seems to be following the same PR playbook. Both fail to recognize that at this stage of the autonomy race, nothing matters more to future customers than their health and safety.
What Uber Should Have Done (& Can Still Do)
Using the lessons from the Tylenol Poisoning case, Uber should have done the following:
- First, pull all the cars. If one has a problem, they all do. Such is the nature of cars driven by software. Johnson & Johnson pulled all of its Tylenol products nationwide, even though authorities quickly localized poisonings to the Chicago area. The fear of copycats was real, and just because the only deaths had occurred in Chicago did not mean the rest of the country was safe. Identify the problem and don’t release the cars until they are all reprogrammed and can respond to the same conditions safely. In this case, close the delta (speed difference) between the Uber and adjacent traffic. Do not allow the car to enter blind intersections at high speed.
- Share the accident data. An accident is a public event, on a public street, attended to by public officials, police, fire, ambulance, and other first responders. The public has a compelling interest in making sure drivers are safe on the road — particularly when the driver is a piece of software someone wrote on their own, for profit, without any testing data to prove its safety. There is zero benefit to anyone in withholding objective accident data from authorities and involved parties. The only reason to hide objective data is to hide Uber’s own contributory negligence. Even without Uber’s cooperation, the truth will come out eventually. Someone will leak the video. The engineers will speak. The witnesses will testify. Traffic cameras near the intersection may have picked up the event. Someone’s dash-cam may have recorded it. Everyone’s cell phones can probably confirm speeds and directions of parties and nearby vehicles. Uber should publicly offer the data to competitors so they can respond safely in similar conditions. Allowing other manufacturers to harm or kill when Uber can use the lessons learned to help them prevent it would be unconscionable.
- Own up to its share of responsibility for this minor collision. If the self-driving program sent the car blindly into an intersection at 38 mph, that is unreasonable; a reasonable driver would slow upon recognizing that all other lanes were blocked. The first way to seize the lead on safety is contrition: admit that although Uber was found not legally at fault, it believes it could have taken greater defensive action prior to the collision and will actively reprogram the cars to account for such conditions. There is no better way to demonstrate leadership than to accept partial responsibility for a minor accident when the press and public hold you blameless. Challenge other manufacturers to do the same.
- Apologize to the woman blamed and settle any civil liabilities publicly and amicably with her. She acknowledged what she did, and her legal violation is undeniable. Right now Uber has allowed her to be thrown under the bus alone — perhaps undeservedly. If Uber shares blame, it’s 50/50 at most. Take the minor liability and PR hit, because hiding the ball is so much worse. As things stand, the left-turning driver could retain counsel who might sue Uber and serve discovery to obtain the accident data; Uber could be legally compelled to turn over damning evidence. Uber can prevent that from happening by simply doing the right thing.
- Fix dangerous infrastructure. The accident happened in part because the intersection is poorly designed. An unprotected left against three lanes of oncoming traffic (four, actually, if you include the left-turn lane) is a trap: left-turning cars effectively have to wait until the light turns red before they can go, and even then they have to be certain oncoming cars will not run the red. The intersection should allow left turns on green arrows only. Uber is in the very best position to use its millions of miles of data to identify potential trap intersections like this one, and other hazardous infrastructure. Announce a street safety program to work with municipalities and states to identify and correct dangerous intersections and conditions, create separated bike lanes, and install clear signage and new striping. Find potholes and report them in real time so they can be repaired. Be an ally to cities, pedestrians, cyclists, and human drivers.
- Take the lead in suggesting new safety legislation. For example, if accidents like Tempe’s are speed-related, lobby for speed reductions in residential and dense commercial districts. In places where traffic routinely exceeds posted speeds, don’t try to keep up: help authorities keep speeds down. You are in the safety business now, and speed is a factor in a large share of traffic deaths. Slower speeds are safer for autonomous cars as well as human drivers. Speed enforcement cameras are already common on Arizona freeways, and they work. Share data when you observe other parties in accidents or behaving dangerously. Be a hero: catch bad guys.
- Give customers a reason to come back to Uber, other than “it was the other driver’s fault.” Announce a new safety pledge, promising to share data from every collision with the public and not to allow any car on the road that hasn’t been upgraded to address the most recent failure or omission. Be the one to challenge other autonomous car manufacturers not to repeat the same mistakes Ubers make (and make only once, hopefully). Passengers and pedestrians don’t care about legal fault — they don’t want to be in an accident in the first place.
Driving is a highly social interaction; we take for granted the number of times we non-verbally communicate with other drivers and save them from their own mistakes. Thanks to anticipation, self-preservation instincts, and empathy, it usually takes more than one simultaneous error in the same physical space for two cars to actually collide. In this case, there were likely three causes: the turning driver’s error, the unreasonable speed of a pre-programmed car entering a blind intersection, and the dangerous trap of an unprotected left against three lanes of oncoming traffic. Instead of denying its contribution, Uber should own it — and take responsibility for the infrastructure cause as well. Uber is in the safety business now, whether it knows it or not. It must give riders a reason to ride in its cars other than “someone else was at fault.”
The public and press are captivated by the prospect of self-driving cars, turning even minor car accidents into international news. Everyone wants to know if driverless cars are safe and worthy of trusting with our lives (and our children’s & parents’ lives). And though every AV manufacturer is touting safety as its number one priority and selling point, none of them seem to appreciate that they, like Johnson & Johnson, are now in the health & safety business. Autonomous cars are the capsules we climb into, rather than put in our mouths. Uber appears not to grasp the parallels to the Tylenol case from 35 years ago. Hiding from responsibility is the worst action manufacturers can take — and perhaps the hardest lesson for software companies to learn.
The Tempe collision highlights perfectly the difference between legality & safety, between legal & moral responsibility, between transparency & evasion. Uber’s response was a failure on all counts. Thirty-five years from now, the Uber-Tempe case might be taught as a companion to the Tylenol Poisoning case. The only question is whether it will be taught as a second example of steady corporate crisis management, or as a cautionary tale about the consequences of doing the exact opposite.