The moral dilemma of driverless cars: save 10 pedestrians or save yourself?
Paul Dughi

A lot of people have focused here on the study’s dynamics and whether framing the question this way is fair. I don’t disagree with their concerns, but the IDEA was to force a conversation and understand the difficulty in programming autonomous vehicles.

I’m fairly certain 10 pedestrians won’t “suddenly” block your path, with concrete barriers on both sides of the street, and put the autonomous car (or you as the driver of your own car) into this situation. If it ever happens, it’s really, really rare. But it does point to the complexity of building the OS for these vehicles. This particular scenario may be unlikely, but there are millions of situations that are likely, and all of them have to be accounted for. With that much code, is it unthinkable that there might be bugs to work out, or programming errors?

For gosh sakes, my Smart TV keeps crashing and rebooting, and it’s from one of the top names in the business. My iPhone occasionally reboots, and every computer ever invented has issues at times. Is it crazy to think the same might happen here?

I’m not against the idea of driverless cars. In fact, done right, they might make everything safer because you’re taking human error out of the equation. But it has to be done correctly. I think that’s worthy of discussion and awareness.

From what I’ve observed, the solution so far seems to be low speeds. Over time, perhaps they will be able to increase those speeds. I doubt anyone would be excited right now about an autonomous car that goes 25 mph on a highway to stay safe when we’re used to driving 70.

Like anything else, version 2.0 will be better… version 3.0 even better than that.
