Do people have free will?
Mar 19, 2017
Scenario
- Technology has advanced such that a software developer can write code to make a robot feel as humans feel. (Plausible to imagine such a future, I think.)
- A software developer programs a robot to feel as humans feel — while at the same time programming it to murder her estranged husband.
- The robot kills its target.
- The software developer kills herself.
- Authorities catch the robot.
Questions
- Do we punish the robot? How?
- Is the robot any more or less culpable for its crime than a human criminal is for theirs? Why or why not?
Conclusions?
- Biological lives are also programs, so we bear the same culpability as robots: none.
- Biological lives, likewise, have no free will.