Are You Ready for Self-Driving Cars?
Waymo has already introduced its autonomous ride-sharing car, but motorists might still not be fully prepared to share the road with self-driving vehicles. Autonomous and semi-autonomous vehicles still have hurdles to overcome, including issues with how they perceive and respond to their environment. There is also confusion over precisely what constitutes an autonomous vehicle, and classification systems may be muddying the situation.
The issue may be that carmakers are happy to use language like “autonomous” to build interest in vehicles that in fact are not autonomous at all. It may seem like a small issue, but it could lead to drivers having too much trust in their “autonomous” vehicles, which could result in a tragedy.
The Complications of Self-Driving Cars
The challenge self-driving carmakers face isn’t just getting a vehicle from point A to point B. Getting the cars to propel themselves forward is just the first step: the question is how these vehicles perform once they’re on the road and have to deal with other vehicles, road conditions and a variety of obstacles. Self-driving cars won’t be on the road in isolation; they’ll share the space with other objects, some of which are unpredictable and some of which carmakers can’t anticipate.
Stopping for Seagulls
Take the case of the seagulls in Boston. Self-driving startup nuTonomy, a spinoff of the Massachusetts Institute of Technology, tested its cars on Boston’s streets and discovered that the vehicles come to a standstill, creating a potential hazard, whenever they encounter an object that doesn’t move. Boston’s seagulls stayed put on the road in front of nuTonomy’s cars because the electric vehicles are so quiet that the birds weren’t startled into flight. With seagulls unwilling to fly away and cars unable to move forward while an object blocked their path, a standoff ensued that could create a significant traffic and safety problem.
In this case, nuTonomy programmed the car to advance slowly to encourage the seagulls to move off the road. Without adequate testing, however, this issue might not have come to light early enough to prevent massive traffic headaches.
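nuTonomy hasn’t published how its fix works. As a rough illustration of the general idea, here is a toy sketch in Python of a speed rule that stops for obstacles but creeps forward when the blocker is a small, stationary animal. Every class, name and threshold here is hypothetical, invented for illustration; it is not nuTonomy’s actual software.

```python
# Toy sketch of a "creep forward" rule like the one nuTonomy reportedly
# added. All names and thresholds are hypothetical illustrations.

from dataclasses import dataclass
from typing import Optional

CRUISE_SPEED_MPS = 11.0  # roughly 25 mph city cruising speed (assumed)
CREEP_SPEED_MPS = 0.5    # crawl slowly enough to nudge birds off the road
STOP_DISTANCE_M = 3.0    # normal minimum standoff from an obstacle

@dataclass
class Obstacle:
    distance_m: float      # distance ahead of the car
    speed_mps: float       # obstacle's own speed (0 = stationary)
    is_small_animal: bool  # e.g., a bird, per the perception classifier

def target_speed(obstacle: Optional[Obstacle]) -> float:
    """Speed to command given the nearest in-lane obstacle, if any."""
    if obstacle is None or obstacle.distance_m > STOP_DISTANCE_M:
        return CRUISE_SPEED_MPS
    if obstacle.speed_mps == 0 and obstacle.is_small_animal:
        # Seagull case: instead of deadlocking behind a stationary bird,
        # creep forward so the bird is encouraged to move off the road.
        return CREEP_SPEED_MPS
    return 0.0  # default rule: stop for anything else inside the bubble
```

With a rule like this, `target_speed(Obstacle(2.0, 0.0, True))` returns the creep speed rather than zero, breaking the standoff instead of freezing traffic.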
Trouble Navigating Weather Conditions
It’s not just the birds causing challenges for self-driving carmakers. Weather conditions, and especially snow and ice, are proving difficult for autonomous cars to navigate. According to a study by the World Economic Forum, snow affects how a self-driving car sees and responds to the road. Sensors might not pick up essential lane markers in bad weather. Cars that are only tested in sunny climates might not be adequately programmed to deal with the poor weather in places that get heavy snowfall or freezing rain. Self-driving vehicles must be proven safe in the various climates that motorists across the United States face. Testing is now underway in cities like Boston — where the snow proved to be a challenge for cars — but thorough testing across various terrains and climates is still needed.
Difficulties with Common Driving Tasks
Reports are emerging that self-driving vehicles have difficulty with everyday driving tasks, including making unprotected left-hand turns (turns made without a dedicated green arrow), negotiating some obstacles, and responding appropriately to slowed traffic. These are issues that can be addressed as testing continues, but without proper testing, problems like these will only come to light when the consequences could be catastrophic.
Public Perception of Self-Driving Vehicles
Perhaps one of the biggest obstacles self-driving car companies face is how the public perceives the vehicles. Recent fatal accidents haven’t done anything to improve public perception, although how people feel about autonomous or semi-autonomous cars depends on their exposure to the automobiles and their age.
A study conducted by J.D. Power and Associates in cooperation with the National Association of Mutual Insurance Companies found 42 percent of people said they would never ride in a fully automated vehicle. The same study found that people under the age of 35 who live in urban areas with exposure to ride-sharing services were more likely to accept the new driving technology.
According to the World Economic Forum study, in a survey of Boston residents, those who were most likely to adopt autonomous vehicles were those in the South Boston/Seaport area. Researchers noted this was due to numerous factors, including that people in the district are known to embrace innovation, that there is a lack of access to the subway, and that autonomous vehicles were tested in that area, exposing residents to the vehicles.
The same study found that older people were less likely to adopt autonomous vehicles, with only 21 percent of people aged 66 and older saying they were willing to, compared with 38 percent of people between the ages of 26 and 45. Researchers theorized this was partly because older people tend to be less willing to try new technologies, and partly because baby boomers and the generation before them viewed obtaining a driver’s license as a form of freedom, whereas younger people have already adopted ride-sharing services and don’t attach the same importance to getting a license.
Is Language the Problem with Autonomous Vehicles?
Some critics point to the language used to describe self-driving and semi-self-driving vehicles as a major issue with public perception, though language might also create safety issues. The problem, critics say, is that language like “autonomous” and “semi-autonomous” is used to describe vehicles across a wide range of abilities, including those that don’t actually self-drive but do have driver-assist features.
Drivers, then, might think their “semi-autonomous” vehicles are capable of more than they are. Further complicating matters, “driver assist” describes a range of features with different levels of functionality depending on the car’s maker. What “driver assist” means to one manufacturer isn’t necessarily what it means to another.
The Society of Automotive Engineers (SAE) released a six-level classification for autonomous vehicles, which the National Highway Traffic Safety Administration (NHTSA) has adopted. Critics point to its ambiguous definitions as a significant issue.
Level 0: No Autonomy
Level 1: Driver Assistance
Level 2: Partial Automation
Level 3: Conditional Automation
Level 4: High Automation
Level 5: Full Automation
Reading the category names doesn’t offer much insight into their meaning: it isn’t clear what “driver assistance” means, nor what the difference is between “partial automation” and “conditional automation”. Most privately owned new cars fall into either Level 1 or Level 2, but there is considerable variability in how these vehicles operate and how much automation they actually have. Carmakers using names like “Autopilot” or “Drive Pilot” to describe the features doesn’t clear up the matter at all.
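For readers who want the distinctions spelled out, the sketch below summarizes the six SAE levels as a small Python structure. The one-line glosses paraphrase the commonly cited SAE definitions; they are informal summaries, not official SAE text.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE levels of driving automation (paraphrased, not official text)."""
    NO_AUTOMATION = 0           # human does all driving; warnings at most
    DRIVER_ASSISTANCE = 1       # system handles steering OR speed, not both
    PARTIAL_AUTOMATION = 2      # system handles steering AND speed, but the
                                # human must watch the road at all times
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; the
                                # human must take over when asked
    HIGH_AUTOMATION = 4         # no human needed, but only within a defined
                                # domain, such as a geofenced urban area
    FULL_AUTOMATION = 5         # system drives anywhere, in all conditions

def driver_must_monitor(level: SAELevel) -> bool:
    """At Levels 0 through 2, the human remains responsible for the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION
```

The practical boundary sits between Levels 2 and 3: every “driver assist” vehicle discussed below is a Level 2 system, so the driver must monitor the road at all times.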
Driver Assist Features
The variability of driver assist features in privately owned vehicles becomes clear thanks to the Insurance Institute for Highway Safety’s (IIHS) analysis of five vehicles with driver assistance features.
Included in the testing were:
2017 BMW 5-series (Driving Assistant Plus)
2017 Mercedes-Benz E-Class (Drive Pilot)
2018 Tesla Model 3 (Autopilot, 8.1)
2016 Tesla Model S (Autopilot, 7.1)
2018 Volvo S90 (Pilot Assist)
Vehicles included in IIHS testing sit at around a Level 2 on the autonomy scale, meaning they assist with steering, speed, and following distance, but require a driver at all times. Testers focused on the adaptive cruise control and active lane-keeping features. Even within those features, however, vehicle performance varies greatly.
Adaptive Cruise Control
IIHS engineers also tested the vehicles in four different test series examining how the automobiles negotiated stopped lead vehicles, lead vehicles leaving the lane, and acceleration and deceleration. They also used a stationary target to test how the vehicles performed with adaptive cruise control off and autobrake turned on.
In the test with adaptive cruise control off and autobrake on, the two Tesla vehicles (Autopilot) hit the stationary target. With adaptive cruise control engaged, four vehicles, including the Tesla, braked earlier and with less force than they had previously. The Volvo (Pilot Assist) braked hard, and only 1.1 seconds before impact. All vehicles performed well with a lead vehicle that slowed to a stop, then accelerated. These tests, however, were all track tests. On the road, the results were different.
An engineer driving the E-Class (Drive Pilot) had to hit the brakes to avoid a collision with a stopped pickup truck, even though the car had previously detected the truck. The car gave the engineer no warning to apply the brakes.
“At IIHS we are coached to intervene without warning, but other drivers might not be as vigilant,” said Jessica Jermakian, IIHS senior research engineer. “ACC [adaptive cruise control] systems require drivers to pay attention to what the vehicle is doing at all times and be ready to brake manually.”
The Tesla Model 3 (Autopilot) slowed unexpectedly 12 times, seven of which appeared to be for tree shadows, while others were for oncoming vehicles that were in another lane. Although none of these unexpected decelerations created hazardous situations, in heavy traffic with more forceful braking, they could put someone’s safety at risk.
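To put the Volvo’s 1.1-second figure in perspective, a back-of-the-envelope calculation helps. The closing speeds and the 0.8 g deceleration below are illustrative assumptions, not figures from the IIHS report; the point is that braking distance grows with the square of speed, while the distance covered in a fixed warning time grows only linearly, so a margin that exists at low speed vanishes at highway speed.

```python
# Back-of-the-envelope check on late braking. The speeds and the 0.8 g
# deceleration are illustrative assumptions, not IIHS figures.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_margin_m(speed_mph: float, warning_s: float = 1.1,
                      decel_g: float = 0.8) -> float:
    """Distance to spare if hard braking starts `warning_s` before impact."""
    v = speed_mph * 0.44704              # mph -> m/s
    available = v * warning_s            # distance left when braking begins
    needed = v ** 2 / (2 * decel_g * G)  # distance required to stop
    return available - needed

for mph in (20, 30, 45, 60):
    print(f"{mph} mph: margin = {stopping_margin_m(mph):+.1f} m")
# 20 mph: margin = +4.7 m    30 mph: margin = +3.3 m
# 45 mph: margin = -3.7 m    60 mph: margin = -16.3 m
```

Under these assumptions, braking 1.1 seconds before impact leaves a few meters to spare at city speeds and comes up well short at highway speeds, which is why systems that brake late, or not at all, for stationary objects worry researchers.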
Active Lane-Keeping
It was in active lane-keeping that researchers noted wide disparities between the vehicles. These disparities may be explained by how the various automakers view driver-assist vehicles.
To conduct their tests, IIHS engineers ran 18 trials in each of two situations: curves and hills. Testing was performed on open roads. All assisted driving systems work by centering the vehicle in lanes with clear markers, but they can all fall back on a lead vehicle as a guide when traveling at lower speeds or when the system’s view of the lane markers is obstructed.
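The “centering” IIHS describes is, at bottom, a feedback loop on the car’s lateral offset from the lane center. The sketch below shows a bare-bones proportional controller with the lead-vehicle fallback; production systems are far more sophisticated, and the gain value and function names here are invented for illustration.

```python
from typing import Optional

STEER_GAIN = 0.5  # proportional gain, an illustrative value

def steering_command(lane_offset_m: Optional[float],
                     lead_offset_m: Optional[float]) -> Optional[float]:
    """Steering correction from lateral offset (arbitrary units).

    lane_offset_m: the car's offset from lane center per the camera,
        or None when the lane markers are unreadable.
    lead_offset_m: lateral offset of a lead vehicle to follow, used as
        a fallback guide, or None when there is no lead vehicle.
    """
    if lane_offset_m is not None:
        return -STEER_GAIN * lane_offset_m  # steer back toward lane center
    if lead_offset_m is not None:
        return -STEER_GAIN * lead_offset_m  # fall back to the lead car
    return None  # no reference left: disengage and warn the driver
```

This framing also helps explain the hill results that follow: cresting a hill can hide the markers from the camera, and a system left with no reference must either hand control back promptly or risk wandering.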
Only the two Teslas performed reliably well in testing on curves, with the Model 3 (Autopilot) staying in its own lane on all 18 trials and the Model S (Autopilot) overcorrecting only once. All other systems required significant driver intervention to navigate curves. On hills, the Model S stayed in its lane on only five of 18 trials. In many cases, it veered into other lanes or onto the shoulder, and it did not warn the driver to take over as frequently as it should have.
The BMW (Driving Assistant Plus) drifted onto the line three times on curves and six times on hills, while the Volvo (Pilot Assist) crossed the line eight times on curves and twice on hills. The Volvo system also disengaged four times on hills.
The IIHS concluded that none of the vehicles was capable of driving safely on its own.
Cadillac Super Cruise Top-Rated in Consumer Reports Testing
In its own testing of driver assist features, Consumer Reports found Cadillac’s Super Cruise to be top-rated, balancing its high-tech abilities with ensuring the driver pays proper attention to the road. Tesla’s Autopilot system was second, Nissan/Infiniti’s ProPilot Assist was third, followed by Volvo’s Pilot Assist.
Consumer Reports tested the vehicles on their capabilities, ease of use and driver engagement warnings, among other factors. In addition to examining how the vehicles’ systems worked, however, Consumer Reports was also concerned about the language some of the carmakers used. Volvo, for example, listed the Pilot Assist system under its “Autonomous Driving” section even though driver assist systems aren’t meant to be autonomous — they require a driver to be engaged.
Volvo responded to Consumer Reports’ concerns by removing language on its site linking Pilot Assist with autonomous driving, but it’s situations like these that create confusion among drivers. Reading the owner’s manual might clear things up, but few car owners go to the trouble.
Echoing the Volvo’s performance in IIHS testing, both the Volvo and the Nissan left their lanes too frequently. Nissan said its vehicle was designed that way to prevent drivers from over-relying on the system, but there may still be considerable confusion about what the cars can do, according to a study by Euro NCAP (a European counterpart to the IIHS).
In a survey of more than 1,500 car owners in seven countries, Euro NCAP found that car owners tend to overestimate what semi-autonomous vehicles are capable of. Seventy percent of people surveyed believe it is currently possible to buy an autonomous car, while 11 percent said they would be tempted to take on other activities — such as reading a paper or having a nap — while using a driver assist feature.
That relates directly to concerns that phrasing like “semi-autonomous” misleads consumers into thinking cars are capable of more than they are. The confusion may be that driver assist features are meant to work with a driver who is always paying attention, but by linking them to autonomy, drivers think the cars are capable of doing much of the work themselves. It’s when drivers trust their vehicles too much to avoid collisions that tragedies can occur.
Sources:
https://www.theverge.com/2018/10/11/17960456/self-driving-cars-av-safety-common-language-rand
https://www.insurancejournal.com/news/national/2018/08/09/497549.htm
https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety#issue-road-self-driving