What if we, as humans, followed Isaac Asimov’s Robot Laws?
Isaac Asimov’s “Three Laws of Robotics”
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
I have an idea.
What if all humans were compelled to follow these laws as if they were hard-coded into our very being? How would the world change?
I propose the “Three Laws of Humanity”:
- A human must not injure another human being or, through inaction, allow another human being to come to harm.
- A human must always follow their conscience, as long as it does not conflict with the First Law.
- A human must protect their own existence as long as such protection does not conflict with the First or Second Law.