In all the books I've read, they state over and over that "The Three Laws of Robotics are hardwired into their positronic brains." That's great, because a robot without the Three Laws would kill you as soon as you ordered it to do something.

That's straight from one of the books; I'm not sure of the basis, but it has something to do with not wanting to be a slave. Okay, but they keep mucking with the laws. In one story they modify the First Law: instead of "A robot may not injure a human being, or, through inaction, allow a human being to come to harm," they shorten it to "A robot may not injure a human being." That may not seem like such a big deal, but the story gives an example: order a robot to stand at the top of a building and drop a rock directly above you, and even with the modified law it will not do it. But if you state that you will move out of the way before the rock hits, it will drop the rock, and if it then sees you not moving, it will make no attempt to stop the rock from crushing you. Granted, they needed to modify the law because robots were constantly running into the path of radiation, ruining both themselves and the results of the experiments.

In another story they make the Third Law slightly more important than the Second, because that particular robot was so expensive. Then the Zeroth Law comes into play: in one story, a robotic computer ruins the lives of several individuals in order to better serve humanity as a whole, almost, but not quite, violating the First Law. So the fact that the laws can be modified is incredibly dangerous; once you modify the First Law, you have the makings of a robotic army on your hands.

I just finished reading "The Naked Sun," which is something of a robotic mystery novel in which a robot has committed the crime. That seems impossible because of the First Law, but you can circumvent it as long as you do it precisely. It takes two robots: you tell the first, "Put this harmless liquid into the milk; the milk will be poured out later, I just need to see the results." The "harmless" liquid is, of course, actually a poison. Then a second robot comes along and delivers the milk to the victim. The First Law as actually stated is "A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm."