Thoughts on California’s New Self-Driving Car Regulations

Jesse Krompier
Published in Self-Drive Central
Mar 16, 2017

This past Friday, the California DMV published its Proposed Driverless Testing and Deployment Regulations for fully autonomous vehicles, which would allow manufacturers to legally test self-driving cars on California roads without a human driver sitting behind the wheel.

The proposed regulations are an important step forward for manufacturers, who now have a more specific set of guidelines for testing and deploying autonomous vehicles. By clarifying the requirements for autonomous testing and deployment, the proposed regulations give manufacturers greater certainty as they develop self-driving technology.

Still, some important questions remain unanswered and should be addressed by policymakers at the April 25, 2017 hearing in Sacramento to finalize the new rules.

Autonomous Mode vs. Conventional Mode: A Liability Question

Previous drafts of the regulations defined “autonomous mode” as a type of “vehicle” that is “driven without active physical control by a natural person sitting in the vehicle’s driver’s seat.”

The new regulations define “autonomous mode” as,

the status of vehicle operation where technology that is a combination of hardware and software, both remote and on-board, performs the dynamic driving task, with or without a natural person actively monitoring the driving environment. § 227.02(a).

“Conventional mode” is defined as,

the vehicle is under the active physical control of a natural person sitting in the driver’s seat operating or driving the vehicle with the autonomous technology disengaged. § 227.02(d).

By defining “autonomous mode” as a status, the regulations address an important aspect of self-driving technology: autonomous vehicles could come packaged with two modes — “autonomous” or “conventional.” As such, a human driver could disengage autonomous mode and take control with a steering wheel and set of pedals.
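To make the distinction concrete, here is a minimal sketch (entirely my own illustration; nothing in the regulations prescribes an implementation, and every name below is hypothetical) of how a dual-mode vehicle might track which mode has control at any given moment. A timestamped record of mode transitions is the first thing a liability inquiry would reach for.

```python
from dataclasses import dataclass, field
from enum import Enum
import time


class Mode(Enum):
    AUTONOMOUS = "autonomous"      # § 227.02(a): the system performs the dynamic driving task
    CONVENTIONAL = "conventional"  # § 227.02(d): a natural person in the driver's seat has control


@dataclass
class ModeRecorder:
    """Logs every mode transition so that, after a collision, it is
    possible to say who (or what) was driving and when control changed."""
    mode: Mode = Mode.CONVENTIONAL
    events: list = field(default_factory=list)

    def engage_autonomy(self, reason: str) -> None:
        self._transition(Mode.AUTONOMOUS, reason)

    def disengage_autonomy(self, reason: str) -> None:
        """Called when the human takes over via the wheel or pedals."""
        self._transition(Mode.CONVENTIONAL, reason)

    def _transition(self, new_mode: Mode, reason: str) -> None:
        if new_mode is not self.mode:
            self.events.append((time.time(), self.mode.value, new_mode.value, reason))
            self.mode = new_mode


recorder = ModeRecorder()
recorder.engage_autonomy("driver pressed engage button")
recorder.disengage_autonomy("driver turned the steering wheel")
print(recorder.events)  # who had control, and when it changed
```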

A dual-mode autonomous vehicle is appealing to consumers who might want the freedom to drive at their discretion. It is also appealing to manufacturers who can sell cars capable of driving beyond specially designated driving zones with human assistance.

It also presents interesting questions of liability:

In the event of an accident in autonomous mode, could the human driver have taken control? If so, how much liability should we place on the driver?

A fully autonomous vehicle, on the other hand, may be more limited in its driving range and capabilities, at least in the early stages of self-driving technology. For example, if a tree branch falls on the road, will the fully autonomous vehicle know to cross the double yellow line to get around it, or will it sit in the lane and wait for someone to clear the obstruction? Will it practice defensive driving, e.g., swerving onto the shoulder when the car in the next lane changes lanes without looking?

Questions of liability, however, are “easier” for the fully autonomous vehicle in the sense that liability should be placed on the manufacturer when the human has virtually no control of any dynamic driving tasks. This is additional motivation for manufacturers to produce cars that come with a “conventional mode,” so that they can avoid this obvious presumption of liability when accidents occur.

Passengers Cannot Pay for a Test Ride

A “passenger” may “summon a vehicle or input a destination,” but does not monitor the vehicle’s dynamic driving tasks, like steering, acceleration, changing lanes, or parking. Notably, the definition of “passenger” also includes the following:

A member of the public may ride as a passenger in an autonomous test vehicle if there are no fees charged to the passenger or compensation received by the manufacturer. § 227.02 (j).

This appears to be a direct jab at Uber, which attempted a rather early release of its autonomous car service in San Francisco in December 2016. In addition to lacking the necessary permits, the Uber vehicles reportedly ran at least six red lights during the short-lived experiment. For now, it appears the DMV hopes to discourage autonomous ridesharing until the technology has been fully vetted.

“Uber testing self-driving car” (CC BY 2.0) by zombieite.

For those of you who know someone working to develop autonomous vehicles, here is your chance for a free ride. Manufacturers cannot charge you for riding as a passenger in their test vehicles.

But don’t get too excited. Manufacturers must comply with a series of stringent requirements to test their vehicles, which includes driving them within predefined zones and notifying local authorities of their whereabouts. In other words, free-ranging rideshare services are not yet available for the general population.

Mandatory Requirements to Test Driverless Vehicles: Who is “At Fault”?

Among the many conditions required to obtain a testing permit for driverless vehicles, manufacturers must do the following:

  • Hold a minimum $5 million insurance policy;
  • Coordinate testing locations and conditions with local authorities;
  • Instruct law enforcement on how to interact with the test vehicle and safely remove it from the roadway in case of emergency; and
  • Provide a two-way communication link between any passengers and a remote operator who must continuously monitor the car’s status.

Further, the manufacturer must agree to assume complete liability for any damage caused, to the extent that the autonomous vehicle is at fault in any collision.

This seems to leave a fair amount of wiggle room for the manufacturer. For example, an autonomous vehicle that brakes suddenly to avoid an obstruction could cause a rear-end collision, where a human driver might have braked more gradually and reduced the risk. In that case, is the driver following the autonomous vehicle completely at fault? Or should the manufacturer assume some responsibility for programming overly rigid braking rules?

What if a child jumps into the middle of the road, and the autonomous vehicle swerves and hits a dog in a crosswalk? Is the autonomous vehicle “at-fault” for the dog’s injuries? Should the manufacturer have written “safer” code to prevent this type of accident, and how can we possibly determine that?

There should be mechanisms in place to determine fault when accidents arise. But there’s a problem:

Determining fault may require access to an autonomous vehicle’s proprietary machine learning algorithms, which manufacturers have valid reasons to protect.

(Remember the Apple iPhone case?)

Lawmakers should consider when and how to mandate access to proprietary data and work with manufacturers to establish a clear set of rules. By doing so now, they will save a lot of time, energy, and money when thousands of autonomous vehicles rule the streets.
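One possible middle ground, sketched below purely as an illustration (every name and field here is hypothetical, and nothing like it appears in the proposed regulations), is a standardized, tamper-evident decision log: the vehicle records what it perceived and what it chose to do, which is what a fault investigation needs, without exposing the proprietary model that mapped one to the other.

```python
import hashlib
import json
import time


def record_decision(log: list, perception: dict, action: str, prev_hash: str) -> str:
    """Append one driving decision to a hash-chained log.

    Chaining each entry to the previous entry's hash makes after-the-fact
    edits detectable, so investigators can trust the record without ever
    seeing the manufacturer's model weights or source code.
    """
    entry = {
        "t": time.time(),
        "perception": perception,  # e.g. tracked objects, distances, lane state
        "action": action,          # e.g. "hard_brake", "swerve_left"
        "prev": prev_hash,
    }
    entry_hash = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append((entry, entry_hash))
    return entry_hash


log: list = []
h = record_decision(log, {"obstacle": "branch", "distance_m": 12.0}, "hard_brake", "genesis")
h = record_decision(log, {"obstacle": None}, "resume", h)
```

A log like this answers "what did the car see, and what did it do?" without answering "how does the algorithm work?", which is the trade secret manufacturers want protected.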

What About Cyber?

The regulations are missing at least one important element: testing for cyber-attacks. Cars with computers can be hacked, and it's not hard to do. In 2015, two security researchers remotely disabled a Jeep while it was cruising on a highway, and that was not even an autonomous vehicle.

Autonomous vehicles will be more reliant on computer systems than any cars before them, which makes them more vulnerable to cyber-attacks than conventional cars. A successful attack could also do far more damage: because autonomous vehicles are expected to communicate with each other using vehicle-to-vehicle (V2V) technology, an attack on one car could compromise an entire fleet.
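Spoofed messages are the obvious pressure point here, and message authentication is the standard defense. Real V2V security (IEEE 1609.2 for DSRC, for example) uses certificate-based digital signatures; the sketch below substitutes a shared-key HMAC purely to keep the idea small, a simplification a real fleet could not afford, since one leaked key would compromise every car.

```python
import hashlib
import hmac

# Illustrative only: a fleet-wide shared key is exactly what real V2V
# security avoids; production systems use per-vehicle certificates and
# digital signatures instead.
FLEET_KEY = b"hypothetical-shared-secret"


def sign_message(payload: bytes) -> bytes:
    return hmac.new(FLEET_KEY, payload, hashlib.sha256).digest()


def accept_message(payload: bytes, tag: bytes) -> bool:
    """Reject spoofed or tampered V2V messages before acting on them."""
    return hmac.compare_digest(tag, sign_message(payload))


msg = b'{"sender": "AV-042", "event": "hard_brake", "lane": 2}'
tag = sign_message(msg)
assert accept_message(msg, tag)                             # genuine message
assert not accept_message(b'{"event": "road_clear"}', tag)  # forged message
```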

Meanwhile, the proposed regulations for a Permit to Deploy Autonomous Vehicles on Public Streets require only that a manufacturer submit a “certification” that its autonomous vehicle meets industry best practices for cybersecurity:

A certification that the autonomous vehicles have self-diagnostic capabilities that meet current industry best practices for detecting and responding to cyber-attacks, unauthorized intrusions, and false or spurious messages or vehicle control commands. § 228.06 (a)(9).

Industry best practices for cyber-safety have been published by NHTSA (the National Highway Traffic Safety Administration) and the Auto-ISAC (Automotive Information Sharing and Analysis Center).
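The regulation also leaves open what detecting “false or spurious messages or vehicle control commands” would look like in practice. One common building block, sketched here with assumed limits (the real thresholds are vehicle-specific, and this is my illustration rather than anything the DMV or the industry guidance prescribes), is a plausibility filter that rejects control commands no sane controller would issue:

```python
# Assumed physical limits for illustration; real values are vehicle-specific.
MAX_STEER_RATE_DEG_S = 45.0  # max plausible steering change per second
MAX_ACCEL_MS2 = 4.0          # max plausible acceleration
MAX_DECEL_MS2 = 9.0          # max plausible braking deceleration


def is_plausible(cmd: dict, last_cmd: dict, dt: float) -> bool:
    """Self-diagnostic sanity check: flag control commands whose rates
    exceed what the vehicle can physically do, a telltale sign of an
    injected or corrupted message."""
    steer_rate = abs(cmd["steer_deg"] - last_cmd["steer_deg"]) / dt
    if steer_rate > MAX_STEER_RATE_DEG_S:
        return False
    return -MAX_DECEL_MS2 <= cmd["accel_ms2"] <= MAX_ACCEL_MS2


last = {"steer_deg": 0.0, "accel_ms2": 0.0}
spoofed = {"steer_deg": 90.0, "accel_ms2": 0.0}  # full-lock steer in 50 ms
assert not is_plausible(spoofed, last, dt=0.05)  # would be quarantined
```

A filter like this is cheap to build; the harder design question, which self-certification glosses over, is what the vehicle should do after a command is quarantined.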

Self-certification may not be enough, especially because manufacturers are racing to be the first to put a fully autonomous vehicle on the road. Lawmakers should work with manufacturers to devise a rigorous series of tests that must be satisfied before a Deployment Permit is issued. This may seem burdensome, but it is a necessary cost to ensure a safe and happy public introduction to this wonderful new technology.
