Do people have free will?

Nick
Philosophy as therapy
Mar 19, 2017

Scenario

  1. Technology has advanced to the point that a software developer can write code that makes a robot feel as humans feel. (A plausible future to imagine, I think.)
  2. A software developer programs a robot to feel as humans feel — while at the same time programming it to murder her estranged husband.
  3. The robot kills its target.
  4. The software developer kills herself.
  5. Authorities catch the robot.

Questions

  1. Do we punish the robot? How?
  2. Is the robot any more or less culpable for its crime than a human criminal is for theirs? Why or why not?

Conclusions?

  1. Biological lives are also programs, so we bear the same culpability as robots: none.
  2. It follows that biological lives, too, have no free will.
