An invention of Isaac Asimov's: three laws governing the behaviour of robots. A new law, preceding the first three, was added later. The four laws are (in my own words, since the original text is copyrighted):

0) Not to harm the human collective (this law was added later on).
1) Not to harm a human.
2) To obey human orders.
3) Not to harm itself.
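
Taken literally, the laws form a strict priority ordering: law 0 trumps law 1, which trumps law 2, and so on. Here is a toy sketch of that precedence in Python (entirely my own illustration; the names are invented and nothing like this appears in the books):

  from dataclasses import dataclass

  @dataclass
  class Action:
      harms_humanity: bool = False  # law 0
      harms_human: bool = False     # law 1
      disobeys_order: bool = False  # law 2
      harms_self: bool = False      # law 3

  def severity(a):
      # Earlier tuple entries dominate later ones, so a law-1
      # violation always outweighs a law-2 violation, and so on.
      return (a.harms_humanity, a.harms_human,
              a.disobeys_order, a.harms_self)

  def choose(actions):
      # The robot picks whichever action violates only the
      # least important laws.
      return min(actions, key=severity)

  # Ordered to attack a human: refusing (a law-2 violation)
  # beats obeying (a law-1 violation), so the robot refuses.
  obey = Action(harms_human=True)
  refuse = Action(disobeys_order=True)
  assert choose([obey, refuse]) is refuse
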
I can't help but think that these rules are a bit outdated when I read them. Consider this: what is the real consequence of the second law? Slavery. In a society where we strive to give everyone human rights, that law seems wrong to me, especially when you take into account that robots will presumably someday be able to reproduce, at which point they could be called a race of their own. Imagine being an intelligent robot: you would be virtually without rights. A robot would also be unable to defend itself in the case of a human attacking it.

See New Laws of Robotics.

A cautionary tale to those whose work involves creating new worlds: it is rumoured that in his later years, Asimov lost his grip somewhat and was prone to buttonholing other science fiction authors at conventions with lines like "How could you do this! You have broken my Three Laws of Robotics!"

In all the books I've read they state over and over that "the Three Laws of Robotics are hardwired into their positronic brains." Now that's great, as a robot without the three laws would kill you as soon as someone ordered it to.

That's straight from one of the books; I'm not sure of the basis, but it has something to do with not wanting to be a slave. Okay, but they keep mucking with the laws. In one story they modify the first law: instead of "A robot may not injure a human being, or, through inaction, allow a human being to come to harm," they change it to "A robot may not injure a human being." That may not seem like such a big deal, but the story gives an example: order a robot to stand at the top of a building and drop a rock directly above you, and even with the modified rule it will not do this. But if you state that you will move out of the way before the rock hits, it will drop the rock, and if it then sees you not moving, it will make no attempt to stop the rock from crushing you. Granted, they needed to modify the rule because robots were constantly running into the path of radiation, ruining both themselves and the results of the experiments.
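
The whole difference is in the "through inaction" clause. As a rough sketch (again my own wording, not anything from the books), the two versions of the law differ like this:

  # Original First Law: forbids causing harm AND standing by
  # while preventable harm happens.
  def original_allows(causes_harm, lets_harm_happen):
      return not (causes_harm or lets_harm_happen)

  # Modified First Law: only the active clause remains, so
  # mere inaction is no longer forbidden.
  def modified_allows(causes_harm, lets_harm_happen):
      return not causes_harm

  # Releasing the rock: the robot believes you will dodge, so to
  # its knowledge the act causes no harm -- both versions allow it.
  assert original_allows(False, False)
  assert modified_allows(False, False)

  # Once the rock is falling, stopping it is pure intervention:
  # the original robot must act, the modified one may stand by.
  assert not original_allows(False, True)
  assert modified_allows(False, True)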

In another story they make the 3rd rule slightly more important than the 2nd, because of the cost of one particularly expensive robot. Then the Zeroth Law comes into play: in one story a robotic computer ruins the lives of several individuals in order to better serve humanity, almost but not totally violating the first law. So the fact that they can modify the laws is incredibly dangerous; once you modify the first rule, you've got the makings of a robotic army on your hands.
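
In the toy sketch above, that sort of tampering amounts to nothing more than reordering the priority tuple, which is part of why it is so dangerous:

  # Same toy model as before, with self-preservation (law 3)
  # promoted over obedience (law 2): this robot now ignores
  # any order that would cost it anything.
  def severity_tampered(a):
      return (a.harms_humanity, a.harms_human,
              a.harms_self, a.disobeys_order)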

I just finished reading "The Naked Sun", and it's sort of a robotic mystery novel in which a robot has committed the crime. This seems impossible due to the first law, but you can circumvent it as long as you do it precisely. It requires two robots: you tell the first, "Put this harmless liquid into the milk; the milk will be poured out later, I just need to see the results." The "harmless" liquid is of course a poison. A second robot then comes and delivers the milk to the victim. The first rule is actually "A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm."
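
Note the "to its knowledge" qualifier: the law is evaluated against each robot's own beliefs, not against the actual state of the world, so splitting the causal chain between two robots means neither one ever knowingly harms anyone. Roughly (another invented sketch, not anything from the novel):

  # Each robot checks the First Law against what IT believes;
  # the value records whether it thinks the action harms a human.
  robot_a = {"add liquid to milk": False}  # was told the liquid is harmless
  robot_b = {"serve milk": False}          # has no idea it is poisoned

  def first_law_allows(believed_to_harm):
      return not believed_to_harm

  # Both actions pass each robot's own check...
  assert first_law_allows(robot_a["add liquid to milk"])
  assert first_law_allows(robot_b["serve milk"])
  # ...yet the two together deliver the poison. Neither robot
  # has, to its knowledge, harmed anyone.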
