I, Robot Morality and Ethics Quotes

How we cite our quotes: (Story title.paragraph)

Quote #7

"Positronic brains were constructed that contained the positive aspect only of the Law, which in them reads: 'No robot may harm a human being.' That is all. They have no compulsion to prevent one coming to harm through an extraneous agency such as gamma rays." (Little Lost Robot.50)

The Nestors used at Hyper Base have a slightly altered sense of morality: they can't hurt people, but they can let people get hurt. The government thought it needed these robots for its research, but Calvin knows that pulling one element out of the Three Laws makes the whole system unstable, like a game of Jenga. So the Three Laws aren't just individual moral ideas; they're a system that only works as a whole.

Quote #8

"Now that they've managed to foul theirs up, we have a clear field. That's the nub, the... uh... motivation. It will take them six years at least to build another and they're sunk, unless they can break ours, too, with the same problem." (Escape.16)

Consolidated broke their own super-computer, and now they want to break US Robots' super-computer, too. Why? Simply because they're competitors. Perhaps this reminds us that humans aren't as moral as robots, especially when business is involved.

Quote #9

"Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world's ethical systems. … To put it simply—if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man." (Evidence.138)

This is one of Calvin's main ideas in the book: robots are more moral than humans because they have to follow the Three Laws. But robots aren't moral simply because they follow rules; they're moral because the rules they follow are ethical guiding principles, the same ones people should follow.