Do people have free will?

Nick
Mar 19, 2017 · 1 min read

Scenario

  1. Technology has advanced such that a software developer can write code to make a robot feel as humans feel. (Plausible to imagine such a future, I think.)
  2. A software developer programs a robot to feel as humans feel — while at the same time programming it to murder her estranged husband.
  3. The robot kills its target.
  4. The software developer kills herself.
  5. Authorities catch the robot.

Questions

  1. Do we punish the robot? How?
  2. Is the robot any more or less culpable for its crime than a human criminal is for theirs? Why or why not?

Conclusions?

  1. Biological lives are also programs, so we bear the same culpability as robots: none.
  2. Biological lives likewise have no free will.

Philosophy as therapy

To choose what you think would require that you think it…

