Driverless cars…the story continues.

A question of ethics

Last November, I wrote a blog post about autonomous (driverless) Uber vehicles. I painted a world of cheap transport, low fatalities and reduced emissions.

The post was widely read, suggesting a growing interest in this issue. So I was keen to read an article in this morning’s (7 January 2017) Age newspaper dealing with the ethics of autonomous vehicles.

The article by Marcus Strom, entitled “The pedestrian or the passenger”, begins with a “rail switch” hypothetical. An autonomous vehicle containing mum, dad and their daughter has lost its brakes and is ploughing towards three elderly pedestrians crossing the road at a controlled crossing. What should the vehicle (or, more precisely, the computer guiding it) do? Continue on and probably kill the pedestrians? Or steer into a barrier and probably kill the passengers?

While the technology underpinning autonomous vehicles is moving ahead at a great pace, the ethics of such vehicles are being treated as an irrelevant sideshow. How would a programmer configure the vehicle to respond in a situation like the one above? What programming best reflects human values?

The Mercedes-Benz F015 (concept) luxury autonomous vehicle is designed to avoid collisions, but not if doing so would injure its passengers. In the scenario above, it would continue on and probably kill the three pedestrians.
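To make the choice concrete, here is a minimal, purely hypothetical sketch in Python. Nothing in it reflects Mercedes-Benz’s or anyone else’s actual software; the harm estimates and the protect_passengers_first switch are invented for illustration. The point is simply that, when every option causes harm, the vehicle’s behaviour comes down to a comparison rule that a human has to write in advance.

# Hypothetical sketch only: not the logic of any real autonomous vehicle.
# It shows that an unavoidable-collision policy reduces to an explicit,
# human-written comparison between bad outcomes.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_passenger_harm: float   # rough 0-1 estimate of harm to occupants
    expected_pedestrian_harm: float  # rough 0-1 estimate of harm to others

def choose_action(outcomes, protect_passengers_first=True):
    """Pick an action when every available option causes some harm."""
    if protect_passengers_first:
        # Occupants-first policy, loosely in the spirit of the approach described above.
        def key(o):
            return (o.expected_passenger_harm, o.expected_pedestrian_harm)
    else:
        # Utilitarian alternative: minimise total expected harm, whoever bears it.
        def key(o):
            return o.expected_passenger_harm + o.expected_pedestrian_harm
    return min(outcomes, key=key)

# The scenario from the article: the brakes have failed at a pedestrian crossing.
options = [
    Outcome("continue straight", expected_passenger_harm=0.1, expected_pedestrian_harm=0.9),
    Outcome("swerve into barrier", expected_passenger_harm=0.9, expected_pedestrian_harm=0.0),
]

print(choose_action(options, protect_passengers_first=True).description)   # continue straight
print(choose_action(options, protect_passengers_first=False).description)  # swerve into barrier

Flip that one switch and the same car makes the opposite choice. Whichever rule ships in the software is a value judgement made long before any crash, which is exactly the question the article raises.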

The article also mentions autonomous vehicle projects involving Israel: BMW and Intel plan to have monitored driverless vehicles on the streets of Munich and Jerusalem by 2021.

In Perth, Western Australia, a driverless bus is being trialled. In Melbourne, Victoria, autonomous vehicles will be trialled on the major freeways by the end of this year.

Another ethical issue: will we tolerate any road deaths caused by computer malfunction? The first recorded death involving a car driving itself occurred in May 2016 in Florida, US, when a Tesla operating on Autopilot struck a tractor-trailer it did not “see”: neither the driver nor the Autopilot system recognised the white side of the trailer against the brightly lit sky.

In the US (2015), there were 35,000 road fatalities. If autonomous vehicles reduced this number to 10,000, would that be acceptable? Would 5,000 be acceptable? 200?

A proportional reduction (to the equivalent of 200 US deaths) would mean roughly 10 road deaths a year for the whole of Australia. It is unlikely that autonomous vehicles would bring fatalities to zero, so what number, if any, is acceptable?
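As a rough check (and assuming Australia’s annual road toll is around 1,200, roughly where it sat in 2015): 200 ÷ 35,000 × 1,200 ≈ 7, so “about 10” is the right order of magnitude.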

Plenty in this article.

Food for thought…

Photo courtesy of the Sydney Morning Herald

markjattard.com