It’s Not All Smooth Travels for Self-Driving Cars

Wisner Baum
8 min read · Feb 8, 2019


A self-driving vehicle that follows all the rules of the road might sound ideal in theory to many drivers. After all, you’re not likely to get a speeding ticket if your car can’t speed. And you won’t get a ticket for blowing through a stop sign if the autonomous car stops for three seconds at every stop sign. For a patient passenger, an autonomous vehicle that follows all the rules is a blessing. For other drivers on the road, however, it can be frustrating.

As autonomous vehicle companies work out the kinks in their systems, new technologies and procedures are being developed to address some driving issues. Cars have to be adapted to deal with obstacles on the road, for example. And autonomous car companies are unlikely to develop cars that break the rules; a vehicle that failed to stop at a stop sign could expose its maker to lawsuits. The cars also need to follow the rules of the road because one of the technology’s main selling points is increased safety.

Meanwhile, companies testing autonomous vehicles have run into various problems, including snow, seagulls, and desert storms. These issues highlight the ways autonomous car technology can fail and why constant upgrades are so important.

Why Safety is Key for Autonomous Vehicles

One of the main benefits often cited for the switch to autonomous vehicles is their safety. Their use eliminates an important automotive hazard: driver error. According to reports, more than 90 percent of car accidents are blamed on human error. Furthermore, in 2016 the top three causes of car crashes were distracted driving, drunk driving, and speeding. All of these risks can be reduced with autonomous vehicles, manufacturers say.

While self-driving cars are looked at as a way to dramatically lessen traffic fatalities, autonomous vehicles aren’t without their issues.

Self-Driving Car is Honked at Twice in Four Minutes

In September 2018, David Booth took a ride in a Waymo Chrysler Pacifica. As he found out, Waymo’s self-driving cars are tightly bound to the rules of the road. They not only come to a full stop at a stop sign, they wait a full three seconds before pulling forward, an agonizing wait for people who are used to stopping, looking around for traffic, and moving again quickly.

Booth writes that his vehicle was honked at twice in a four-minute trip. He also notes issues with left turns (the Waymo slows partway through the turn) and with merging into heavy traffic.

The issue isn’t so much the rule following; after all, the safety rules are there for a reason. It’s that the Waymo cars don’t drive the way most humans would, which can cause issues on the road. Consider the driver who travels well under the speed limit. That person might not be breaking any laws, but they create a hazard in how other motorists react to them. Autonomous vehicles that follow the rules too closely could create their own hazards by encouraging, or even forcing, other drivers to do something dangerous in response.

It’s not a stretch to see such situations. Motorists not only monitor what they’re doing on the road, they watch ahead to anticipate how the vehicles in front of them will react. If they see a child about to run onto the street in front of the car ahead of them, they can hit the brake to avoid rear-ending that car. If an autonomous car stops for no apparent reason, the vehicles behind are less likely to anticipate a stop, creating the potential for a collision.

This sort of scenario has played out repeatedly. According to TheInformation.com, Waymo vehicles have great difficulty navigating the T-intersection near Waymo’s Phoenix headquarters. Those challenges have reportedly already almost caused accidents when the Waymo vans stopped suddenly while making right turns.

“I hate them,” said one driver of Waymo’s vans.

Routine decisions that humans make might be difficult for sensors and computers to anticipate. A human, for example, might see a smaller gap between cars as acceptable to make a left turn through, while an autonomous car might wait for a larger gap. That creates more congestion on the road, frustrating other motorists.

Humans and Computers Sharing the Roads

A significant issue is that autonomous vehicles share the road with human drivers. Each group uses its own logic to make decisions on the road, but that logic is sometimes at odds. As Booth notes, Waymo’s autonomous cars rely entirely on onboard sensors and logic, which means the vehicle has to wait to “see” what another vehicle on the road does rather than having knowledge of what will happen thanks to vehicle-to-vehicle communication.

Vehicle-to-vehicle communication is what goes on between platooning semi-trucks, a group of three or more semi-autonomous trucks driving closely together. Those trucks are all “connected,” so when the truck at the front of the platoon applies the brakes, the others brake immediately. There is virtually no reaction time, because the artificial intelligence responds to the brakes being applied, not to seeing and processing the truck ahead slowing down.
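The idea can be sketched in a few lines of Python. Everything here is hypothetical (the message format and class names are invented for illustration), but it captures the key point: followers react to the broadcast brake event itself, with no perception step in between.

```python
from dataclasses import dataclass

@dataclass
class Truck:
    """A hypothetical platoon member listening on a V2V link."""
    name: str
    braking: bool = False

    def on_v2v_message(self, message: dict) -> None:
        # React to the broadcast event itself, not to observed deceleration.
        if message.get("event") == "BRAKE":
            self.braking = True

def broadcast_brake(lead: Truck, followers: list) -> None:
    """Lead truck brakes and broadcasts the event to every follower at once."""
    lead.braking = True
    message = {"event": "BRAKE", "sender": lead.name}
    for truck in followers:
        truck.on_v2v_message(message)

platoon = [Truck("lead"), Truck("follower-1"), Truck("follower-2")]
broadcast_brake(platoon[0], platoon[1:])
print(all(truck.braking for truck in platoon))  # prints True
```

A camera-only follower, by contrast, would have to detect the lead truck slowing, classify it, and then decide to brake, adding a perception delay at every link in the chain.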

For now, however, at least for Waymo, all information processed by an autonomous vehicle’s computer comes from how that vehicle’s sensors see things happening, not from information sent by other vehicles.

Speaking with Booth, Amir Efrati, from TheInformation.com, compared Waymo autonomous vehicles to student drivers taking their first driving test. Not exactly a glowing review.

How the Technology Works (and How it Fails)

In order for fully or semi-autonomous vehicles to be safe and reliable, they must understand the area around them and the other motorists on the road. This allows them to stop at traffic lights and avoid running over cyclists. They do this by combining information from a variety of sources, including radar, LiDAR, and cameras. Computers in the car make sense of the information sent by the sensors and tell the car how to react.
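As a rough illustration of how combining sources helps, here is a hypothetical Python sketch. The sensor names, confidence values, and threshold are invented, and real fusion pipelines are far more sophisticated, but the principle that agreeing sensors reinforce each other is the same.

```python
def fuse_detections(detections: dict) -> float:
    """Combine per-sensor confidences that an obstacle is ahead.

    Uses the complement rule: the fused value is the probability that at
    least one sensor is right, treating the sensors as independent.
    """
    p_all_wrong = 1.0
    for confidence in detections.values():
        p_all_wrong *= (1.0 - confidence)
    return 1.0 - p_all_wrong

def decide(detections: dict, threshold: float = 0.9) -> str:
    """Act only when the fused confidence crosses the threshold."""
    return "BRAKE" if fuse_detections(detections) >= threshold else "CONTINUE"

# The camera alone is unsure (think sun glare), but radar and LiDAR
# agreement pushes the fused confidence over the threshold.
readings = {"camera": 0.3, "radar": 0.8, "lidar": 0.7}
print(decide(readings))  # prints BRAKE
```

Remove the radar reading and the fused confidence drops below the threshold, which is one way a single blinded sensor can turn into a missed obstacle.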

Sometimes, unfortunately, the technology fails. In one notable accident, the glare of sunlight prevented an autonomous car from seeing a white truck pulling out from a side road onto the highway. The vehicle crashed into the side of the truck, killing the car’s occupant; because the semi-truck lacked side guards, the car passed underneath it and continued driving until it left the road, traveling through some trees near a pond.

In other cases, people have figured out how to confuse autonomous cars. The vehicles can be made to ignore a stop sign, for example, if someone sticks black and white labels on the signs.

It’s these unusual events that autonomous car companies have to plan for if they want fully autonomous vehicles to become widely used. The Insurance Institute for Highway Safety released a report warning that in some cars, assistive technologies can fail with potentially fatal consequences. Those failures included hitting stationary objects the vehicle should have seen, crossing lanes improperly, and slowing down unexpectedly.

Self-Driving Technology Companies Promise Things Will Get Better

Waymo has said its vehicles learn and adapt to situations and that as feedback comes in about its rider program, the technology is being improved. Currently, the vehicles have a remote safety operator and there have been situations in which the operator stepped in to take over driving duties.

In a statement to Mashable, a Waymo spokesperson wrote, “Waymo was founded on a mission to make our roads safer, and that’s why we built a cautious and defensive driver. The way to responsibly deploy our fully driverless technology is to robustly test and validate in a geo-fenced territory that grows over time.”

For its part, Tesla has worked at giving the artificial intelligence that runs autonomous vehicles even more information to enable it to get better at making decisions. This means adding many sensors to vehicles and collecting all the useful data so the artificial intelligence can be better trained. As the artificial intelligence becomes more sophisticated, the thinking goes, the vehicles will require fewer and fewer sensors because the cars can be upgraded with software updates rather than new hardware.

Problems not Limited to Waymo

Waymo might be among the more high-profile companies to push autonomous vehicles, but it’s certainly not the only one experiencing technological difficulties. Bloomberg reports a company called NuTonomy has also experienced problems with its vehicles, especially related to how they handle snow and birds on the road.

Snow apparently changes how the cameras on NuTonomy’s vehicles perceive the street. The vehicles also had difficulty with seagulls that stand in the street and do not respond to the autonomous cars because the cars are so quiet. The cars, however, would stop because they saw the birds on the road. With stopped cars and birds refusing to move, traffic came to a halt.

NuTonomy managed to find a workaround for the birds (the cars now creep forward slowly to prompt the birds into flying away), but there has been no such fix for the snow.

Weather might be the biggest hurdle for autonomous vehicles. How snow and ice affect cameras and sensors is still being determined. What is known is that cameras don’t work in foggy weather, and raindrops and snowflakes interfere with LiDAR lasers. GPS and radar have their own issues, not necessarily related to weather: GPS suffers from poor connections, and radar has difficulty identifying objects.

Autonomous vehicles make decisions based on the best information from various systems and learn to recognize information that should be ignored. A company called WaveSense says its ground-penetrating radar can help keep cars on the road even in poor weather, but its products haven’t been widely adopted. The company has, however, received research and development funding from the U.S. military.
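The idea of ignoring degraded information can be sketched as follows. The weather categories and sensor effects are simplifications drawn from the paragraph above (fog blinds cameras, rain and snow interfere with LiDAR), and nothing here reflects any real vendor’s system.

```python
# Hypothetical map of which sensors degrade in which weather.
DEGRADED = {
    "fog": {"camera"},
    "rain": {"lidar"},
    "snow": {"camera", "lidar"},
}

def usable_readings(readings: dict, weather: str) -> dict:
    """Drop readings from sensors known to degrade in the current weather,
    so downstream decisions lean on the sensors that still work."""
    blocked = DEGRADED.get(weather, set())
    return {sensor: value for sensor, value in readings.items()
            if sensor not in blocked}

readings = {"camera": 0.9, "lidar": 0.8, "radar": 0.6}
print(usable_readings(readings, "snow"))  # prints {'radar': 0.6}
```

In clear weather all three readings survive; in snow only the radar does, which is exactly the situation where a complementary sensor like ground-penetrating radar would earn its keep.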

Until self-driving cars can handle snow and cold weather, they might be mainly used in cities that are mostly sunny and dry, places where they are the most reliable. Even sunny climates, however, have their challenges. Nuro’s testing on self-driving delivery vehicles in Phoenix was stopped by a desert dust storm.

Should People Adapt to Autonomous Cars or Should Technology Mirror People?

Andrew Ng, a thought leader on artificial intelligence, believes pedestrians should be made more accountable for their actions. In his thinking, if a person jaywalks on a street and an autonomous car doesn’t see that person in time, it’s the person’s fault, not the car’s and not the technology’s. Not all experts agree with Ng, however. After all, autonomous vehicles are supposed to make driving safer on their own, not require humans to vastly change their behavior to help the technology make driving safer.

Changing Infrastructure for Autonomous Vehicles

Another possible solution is to change the infrastructure around autonomous vehicles so the cars can better respond to road conditions. Autonomous vehicles can take in information from sensors, LiDAR, and other technologies, but their environment doesn’t send information back to them. Road sensors, for example, could report on road conditions, allowing the autonomous vehicle to determine whether more cautious driving is necessary.

Vehicle-to-vehicle communication would allow self-driving cars to more quickly determine what other vehicles are doing, giving them more time to react and potentially avoid a collision.

As some self-driving vehicle issues are solved, new ones arise. It could be a long time, if ever, before the technology can anticipate and deal with every obstacle. There are many companies, however, willing to work on the technology to make that happen.

Sources:

https://driving.ca/auto-news/news/motor-mouth-even-little-old-ladies-will-find-waymo-self-driving-cars-frustrating

https://www.theinformation.com/articles/waymos-big-ambitions-slowed-by-tech-trouble?shared=4596b7125469ea51

https://www.cnbc.com/2018/08/28/locals-reportedly-frustrated-with-alphabets-waymo-self-driving-cars.html

https://mashable.com/2018/08/28/waymo-self-driving-taxi-problems/#k8iy9SLLisqw

https://www.bloomberg.com/news/articles/2018-09-17/self-driving-cars-still-can-t-handle-bad-weather

http://www3.weforum.org/docs/WEF_Reshaping_Urban_Mobility_with_Autonomous_Vehicles_2018.pdf

https://bdtechtalks.com/2018/09/17/self-driving-cars-ai-computer-vision/

https://www.theregister.co.uk/2017/06/20/tesla_death_crash_accident_report_ntsb/


Wisner Baum

Appreciative of new technology advancements but keeping a vigilant eye on corporate shortcuts that put profits over consumer safety.