I, Robot (2004)
The Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Warning: this review may contain spoilers.
The movie I, Robot is "inspired by" the book of the same name—a collection of short stories—by Isaac Asimov. I'm currently reading The Complete Robot, which is, I believe, an extended version of I, Robot with a couple of extra stories. When I first saw this movie, I was familiar with the Three Laws, but not the content of the book. I remember, at the time, there seemed to be a group of Asimov fans expressing their disappointment at the path the movie took—but I thought it was actually quite a logical outgrowth of the Three Laws…
The movie opens with an apparent suicide at the U.S. Robotics building; a suicide which the robophobic detective investigating the case believes may be something impossible: the murder of a human being by a robot. The tale leads us down that path to a full-blown robot uprising, led by an AI that believes (no doubt quite rightly) that the biggest threat to human beings is other human beings, and that the only way to protect us is to enslave us.
Now that I have watched the movie again, I am well aware of all the ways in which it differs from the world Asimov created. Robots on Earth, for instance, were outlawed in the short stories (up until the very end, anyway). And Dr Susan Calvin is very different from the image of her I received from the book. But those are changes that, frankly, I understand to be important for wider acceptance of the movie. Most importantly, though: how does the movie's application of the Three Laws hold up?
I still think the AI's logic is correct. Asimov himself speculated on such conflicts at a smaller scale, and stated (I'm paraphrasing) that a robot would absolutely kill one person to save ten. For the robotic, positronic brain, it all comes down to the numbers. Therefore, would a robot come up with the plot of imprisoning humanity to protect us from ourselves? Would it kill those few humans who got in its way to protect that plot? Would it harm a few to save many, even if its idea of our salvation differs from ours?
The logic is undeniable…
As a Will Smith action vehicle, I, Robot works quite well. As an Asimov Robot story? Perhaps it goes further than Asimov would have liked—it is very much a "robot as menace" story, and he seemed to avoid those; indeed, the First Law is intended to ensure that such things are impossible—but it really is nothing more than an extension of situations Asimov himself postulated. I think it works there too…
And, hey, I enjoyed it!
IMDb: I, Robot