Baby You Can (self) Drive My Car

Tim Chang
4 min read · Apr 10, 2017
  • What is the motivation for developing and building self-driving cars? What are the arguments for and against self-driving cars? Would they make our roads safer?

The main arguments I’ve heard and seen for developing self-driving cars are that they will improve safety and convenience for consumers and businesses alike. An autonomous vehicle can make snap decisions much faster than a human, it is better at staying focused on the road, and it doesn’t fall asleep. The human driver no longer needs to devote much cognition to operating the car, and the car won’t make the mistakes many human drivers do. Also, for businesses, autonomous vehicles don’t need to stop and sleep for the night.

I’ve only really heard one argument against autonomous vehicles; the rest are concerns about their development. The one argument is that automation will displace many people from their jobs. It’s true: self-driving trucks will completely displace truck drivers and ride-share drivers. I’m definitely biased in saying this because I’m not going into the trucking business, but I don’t think this is a bad thing. It will hurt in the short run, but the market will adapt, and people will find new jobs and new avenues for innovation and work. Humans have been displaced by machines before, and it has only moved us forward. The biggest concerns I’ve heard about AVs are whether they actually make us safer, and what they should do during crashes.

  • How should programmers address the “social dilemma of autonomous vehicles”? How should an artificial intelligence approach life-and-death situations? Who is liable when an accident happens?

I agree with the point that the Bloomberg and NYT articles bring up: if AVs work as well as intended, then situations involving an unavoidable crash should be very rare. Of course, programmers still need to define behavior for when a crash does become unavoidable. My own thoughts on this have shifted over time. I used to think that AVs should be utilitarian and minimize harm, and that in cases where there is no clear utilitarian answer, the driver should be the one to pay. Then somebody asked me if I would buy that kind of car, and I realized I wouldn’t.

Since then, I’ve been following more news about the safety of self-driving cars, and the article from Ars about self-driving car accidents only affirms my view. My point is that accidents resulting from an AV intentionally breaking the law should be virtually impossible given our technology. Thus the only times I can see a fatal accident happening with a self-driving car in self-driving mode would be a) something unexpected happens that the tech cannot account for (someone swerves into your lane, something steps in front of the car, etc.), or b) the car malfunctions. In case a), I think the car should do its best to preserve the driver’s life. Unexpected occurrences such as someone stepping or swerving into your lane are not your fault, and could be the result of someone else breaking the law; in those cases, the driver’s life should be prioritized. In case b), the decision-making code wouldn’t have taken effect anyway because of the malfunction.

When accidents occur, I honestly don’t know who to assign blame to. If an accident occurs because of situation a), then depending on the circumstances we could more easily place blame on those who broke the law and put themselves in danger. This assumes self-driving technology is as safe as we think it is. In situation b), it’s even harder: who do you blame for a malfunction? The manufacturer, for faulty construction? The consumer, for not maintaining the car? I honestly don’t have an answer for this.

  • What do you believe will be the social, economic, and political impact of self-driving cars? What role should the government play in regulating self-driving cars?

AVs are the future. Socially, they will provide an amazing boost of convenience for consumers and businesses alike. Economically, they will displace a whole industry of people in the short term, but in the long term they will let us move on to bigger, more abstract things. Obviously I’m biased because I’m not in the trucking business, but think about the tractor, the cotton mill, the assembly line. Same deal.

I think the government is doing a great job so far with its 15-point regulatory guidance for AVs. The government’s goal, I think, is to protect its people, a sentiment that Obama voiced too. But it should be careful not to stifle innovation. So the long-term play here is to start out with broader regulation like the 15 points and become more specific as the technology develops and the space begins to clear.

  • Would you want a self-driving car? Explain why or why not.

Absolutely. It feels great to drive on the open road and finagle your way through the highways, but I fall asleep easily on road trips, and sometimes I just don’t want to deal with traffic on a commute. I’ll be a mid-to-late adopter, though, because I want the safety of the tech to be proven and many of the moral dilemmas to be worked out in practice first.
