Tuesday, February 24, 2009

The three rules of robotics apply to Eve

I saw Wall-E recently with my boys, and loved the movie. Compared to the fast-food-like frenzy of most modern children's cartoons, Wall-E seems like a slow-food gourmet feast.

I was particularly impressed with how well the movie carried itself despite the near-total lack of dialog in its early stretch. Harpo Marx would have been proud.

Reflecting on it now, I am trying to determine whether the hypothetical designer of all those robots kept Asimov's rules of robotics in mind when they were made (a toy sketch of the hierarchy follows the list):

1. do not harm humans through action or inaction
2. obey orders except where conflicting with 1
3. protect your own existence except where conflicting with 1 or 2
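
To make the hierarchy concrete, here is a toy sketch of my own (nothing from the film, and only loosely after Asimov's wording): each candidate action is checked against the laws in priority order, and when every option breaks some law, the robot prefers the one whose worst violation sits lowest in the hierarchy. All the names and flags below are invented for the illustration.

    # A toy priority check, invented for this post: the laws are vetoes tried
    # in order, and an earlier law always outranks a later one.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Action:
        description: str
        harms_human: bool = False     # injures a human, by action or inaction
        disobeys_order: bool = False
        harms_self: bool = False

    def law_violated(action: Action) -> Optional[int]:
        """Number of the highest-priority law the action breaks, or None."""
        if action.harms_human:
            return 1
        if action.disobeys_order:
            return 2
        if action.harms_self:
            return 3
        return None

    def choose(candidates: List[Action]) -> Action:
        # Prefer an action that breaks no law; otherwise prefer the one whose
        # worst violation sits lowest in the hierarchy (breaking 3 beats 2).
        def rank(a: Action) -> int:
            v = law_violated(a)
            return 4 if v is None else v
        return max(candidates, key=rank)

    print(choose([
        Action("keep working and get dented", harms_self=True),
        Action("abandon the directive to stay safe", disobeys_order=True),
    ]).description)

Run as written it picks "keep working and get dented", which is the Wall-E trade-off: accept damage (rule 3) rather than disobey (rule 2).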

I think that must have been part of the vision for the movie: each robot adheres to the rules of robotics, but their differing interpretations bring them into conflict. It was clear that Auto thought it was serving rule 1, and Wall-E was willing to take on injury to himself in order to follow rule 2, demonstrating the solidity of the hierarchy in his mind. But what about Eve?

Eve was clearly a rule follower, willing to accept personal injury in order to meet her objectives, so the hierarchy is programmed into her functioning. Yet she discards her objective when, I surmise, she interprets her actions as having endangered Wall-E and as continuing to endanger him. I can't reconcile that with the rules of robotics; had Wall-E not persuaded her otherwise, would she have given up on her mission for his sake?

Research a little further and an answer emerges: the Zeroth law. The Zeroth law, numbered so that it precedes the others, is in summary:

0. a robot should not harm humanity through action or inaction.

Even in the Asimov books it was not often invoked, and perhaps it was only the faintest of subprocesses in a purpose-built positronic brain such as Eve's. One could speculate that it was Wall-E's act of retrieving the plant for Eve after she'd discarded it that awakened this subprocess. I like that; the story has a neat arc if Eve's disregard of her objective was precisely what triggered her zealous pursuit of it once that objective became tied to a higher purpose.
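
To see how that would slot into the same toy (again, purely my own illustration), make the priority ordering an explicit list and put the Zeroth law at the front, so that serving humanity outranks everything else, which is how I'd account for Eve's later zeal:

    # The same toy, rebuilt standalone with the Zeroth law slotted in ahead
    # of the others; all flags and names are invented for illustration.

    from typing import Dict, Optional

    LAWS = ["harms_humanity", "harms_human", "disobeys_order", "harms_self"]

    def law_violated(flags: Dict[str, bool]) -> Optional[int]:
        # Laws are checked in priority order; index 0 is the Zeroth law.
        for number, flag in enumerate(LAWS):
            if flags.get(flag):
                return number
        return None

    def choose(options: Dict[str, Dict[str, bool]]) -> str:
        # Pick the option whose worst violation sits lowest in the hierarchy.
        def rank(name: str) -> int:
            v = law_violated(options[name])
            return len(LAWS) if v is None else v
        return max(options, key=rank)

    print(choose({
        "fight to deliver the plant": {"harms_self": True},
        "let the plant be lost": {"harms_humanity": True},
    }))

Run as written it picks "fight to deliver the plant"; keeping the ordering in a plain list also means that adding a new highest law is a one-line change.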

But this does highlight a missing fourth rule of robotics: do not harm other robots except where that conflicts with the other laws. Then again, if that law were in place, perhaps all the characters would have remained on the Axiom and never headed to Earth, since doing so would have endangered the robots.
