Illustration: R. A. Di Ieso

If A Self-Driving Car Gets Into An Accident, Who Is To Blame?

This could be the end of liability as we know it

Vocativ
6 min read · Jul 22, 2016

By Alexandra Ossola

In February of this year, a self-driving car crashed into the side of a bus in Mountain View, California. The incident was a landmark: It was the first time an autonomous vehicle had caused a crash, rather than merely being involved in one.

The Google-made vehicle was driving along El Camino Real when it sensed sandbags positioned around a storm drain and swerved into another lane to avoid them. Within seconds, the vehicle crashed into the side of a public transit bus. The details beyond that are, thankfully, banal: both vehicles were moving slowly, and there weren’t any injuries.

Though self-driving cars are engineered to avoid some of the reckless mistakes attributed to human error, it’s inevitable that accidents will happen. Now, before autonomous vehicles become commonplace on the roads, manufacturers, engineers, and legal experts are figuring out just who, or what, should be held responsible for mistakes made on the road. Car manufacturers will accept responsibility for more crashes than they have in the past, though not all of them, but the biggest changes will likely occur on the legal side, as lawmakers will have to reinterpret liability laws.

“Driving is one of the most complex activities that we as humans undertake, and it’s dangerous as well,” says Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University. “We process information at high speeds, and we make real-time decisions. The number of scenarios is endless. It’s a very challenging problem [to engineer].”

From the earliest phases of their creation, autonomous vehicles are designed to be safe and cautious. Sensors, including cameras, radar, lasers, and ultrasound, are positioned on various parts of the car. The information they take in about the surrounding environment is fed into pre-programmed algorithms, and the car moves accordingly, depending on the conditions or even the nature of the obstacle. “Our car can distinguish between a human and a garbage can,” Rajkumar says. “It will drive around a garbage can. But a human would freak out if it did that, so instead the car will stop and wait for the human to cross the street.”
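To make that pipeline concrete, here is a minimal, hypothetical sketch of the classify-then-decide step Rajkumar describes. It is not Google’s actual code; the obstacle types, function names, and behaviors are invented for illustration.

```python
from dataclasses import dataclass
from enum import Enum

class ObstacleType(Enum):
    HUMAN = "human"
    GARBAGE_CAN = "garbage_can"
    UNKNOWN = "unknown"

@dataclass
class Obstacle:
    kind: ObstacleType
    distance_m: float  # distance from the vehicle, in meters

def plan_action(obstacle: Obstacle) -> str:
    """Pick a maneuver based on what the sensors classified."""
    if obstacle.kind is ObstacleType.HUMAN:
        return "stop_and_wait"      # don't startle pedestrians; let them cross
    if obstacle.kind is ObstacleType.GARBAGE_CAN:
        return "swerve_around"      # inanimate object: safe to drive around
    return "slow_and_reassess"      # unclassified obstacle: err on caution

# Example: the classifier reports a person 12 meters ahead.
print(plan_action(Obstacle(ObstacleType.HUMAN, 12.0)))  # -> stop_and_wait
```

Real systems, of course, involve far more rules and learned models interacting with one another, which is exactly why the scenario space Rajkumar mentions is so hard to cover.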

But things can still go wrong. And if an accident does happen, there is no standardized rule for who should accept responsibility. Who should pay up depends on various factors: whose fault the accident was, the state in which it took place, whether the autonomous car was purchased by a consumer or used only for testing, the manufacturer of the car, and so on. There could be one liable party or several.

In that regard, the liability system won’t have to change much to adapt to self-driving cars, says Bryant Walker Smith, an assistant professor at the University of South Carolina School of Law. Under the body of law governing liability, known as tort law, drivers can already seek compensation from a manufacturer for a faulty airbag or a defective tire. “That is still true with automated driving: was the vehicle designed reasonably, and did it result in reasonable behavior?” Smith says.

If the answer to either question is no, then consumers have a case against the manufacturer. A few autonomous vehicle manufacturers, such as Google and Volvo, have made it clear that they will accept responsibility for any accidents in which their cars are at fault. But that’s not as groundbreaking as it might seem; it’s really only a “recognition of reality,” as Smith says, since the manufacturers would be held liable in court anyway. The announcements mostly serve to set consumers’ minds at ease.

Even if automakers are willing to take on more responsibility, car owners will still need individual insurance. Some of the non-collision risks to personal vehicles, such as hail and graffiti, aren’t going away, as Smith points out. One insurer in the United Kingdom announced in June that drivers of autonomous vehicles who are drunk would not be covered, a departure from many current U.S. policies, which do cover drunk drivers.

Though some in the insurance industry think collision insurance may change, there hasn’t yet been any great push to change the existing insurance structure. Car manufacturers would be paying for a greater percentage of crashes, but there would be fewer of them — “a bigger slice of a smaller pie,” according to Smith. “Some things will change — the risk factors, who the driver is. But insurance, that will continue to exist.”

One thing that will likely change, though, is how legislators interpret existing laws to determine blame and culpability. Most transportation laws were written with the assumption that a licensed human driver is operating the vehicle, according to a spokesperson for the National Association of Insurance Commissioners (NAIC). Some states, such as Michigan, are considering bills that would require self-driving cars to pass a licensing test before they hit the road. Other issues will require new laws altogether, such as how to certify a vehicle’s safety before it starts driving and how to verify that it is being properly maintained.

As of now, much of the regulatory framework is piecemeal, varying by locality or simply nonexistent. Some legal experts and manufacturers have begun to suggest regulation on a federal level, Smith notes. But if history is any indication, many of these policies at all levels of government will shake out as autonomous vehicles become more common and legislators are presented with new scenarios.

As was seen with the mass-scale adoption of the automobile in the 1960s and more recently with widespread use of ride-sharing apps like Uber, this process won’t be seamless.

There will be situations that fall through the cracks, in which morals come to the forefront. For example, if a car had to hit either a pedestrian, endangering her life, or a tree, endangering the driver’s life, which should it hit? What if there were 10 pedestrians? What if there was a child in the car? Weighing these factors is nearly impossible for a driverless car, in part because society in general is far from a consensus.

Engineers like Rajkumar intend to anticipate as many of those situations as possible. And that might mean driving will look a lot different when it’s done autonomously. Cars will move more responsibly, which might mean they simply move more slowly, Smith suggests.

For now, self-driving cars have some clear limitations. They can’t drive as well in bad weather, like snow or fog, because precipitation clouds their sensors. Their GPS and mapping algorithms aren’t 100 percent accurate. And fast-changing road conditions can still trip them up, as was the case in February’s fender-bender. “The technology has a lot of potential, but we’re not there yet,” Rajkumar says. He predicts that it will get much more sophisticated within the next decade.

As it does, drivers and society in general will have to decide when the technology is ready to deploy. “One of the real fundamental questions is: How do we know when they’re ready? How safe is safe enough?” Smith says. Regulators will need to come up with metrics for safety. But when they do, drivers will have to decide just how much risk is worth taking when it comes to self-driving cars.

“How much risk can we take in accepting systems that are not perfect, maybe not great, to prevent the carnage we have on roads today?” Smith says.

This story originally appeared on Vocativ on Jun 28, 2016.
