The Three Laws of Robotics are three rules that the science-fiction writer Isaac Asimov imagined would be programmed into robots, and which he used throughout his robot stories.

1) A robot may not injure a human being or, by failing to act, allow a human being to come to harm.
2) A robot must obey orders given to it by human beings, except where carrying out those orders would break the First Law.
3) A robot must protect its own existence, as long as the things it does to protect itself do not break the First or Second Law.

Later, Asimov added the Zeroth Law: “A robot may not harm humanity, or, by inaction, allow humanity to come to harm.” The other three laws were then amended so that each one also yields to the law, or laws, that come before it.
