“Then the matter is easy to follow,” responded the psychologist, dryly. “If Mr. Byerley breaks any of those three rules, he is not a robot. Unfortunately, this procedure works in only one direction. If he lives up to the rules, it proves nothing one way or the other.”

Quinn raised polite eyebrows, “Why not, doctor?”

“Because, if you stop to think of it, the three Rules of Robotics are the essential guiding principles of a good many of the world’s ethical systems. Of course, every human being is supposed to have the instinct of self-preservation. That’s Rule Three to a robot. Also every ‘good’ human being, with a social conscience and a sense of responsibility, is supposed to defer to proper authority; to listen to his doctor, his boss, his government, his psychiatrist, his fellow man; to obey laws, to follow rules, to conform to custom – even when they interfere with his comfort or his safety. That’s Rule Two to a robot. Also, every ‘good’ human being is supposed to love others as himself, protect his fellow man, risk his life to save another. That’s Rule One to a robot. To put it simply – if Byerley follows all the Rules of Robotics, he may be a robot, and may simply be a very good man.”

“But,” said Quinn, “you’re telling me that you can never prove him a robot.”

“I may be able to prove him not a robot.”

“That’s not the proof I want.”

“You’ll have such proof as exists. You are the only one responsible for your own wants.”

Here Lanning’s mind leaped suddenly to the sting of an idea, “Has it occurred to anyone,” he ground out, “that district attorney is a rather strange occupation for a robot? The prosecution of human beings – sentencing them to death – bringing about their infinite harm-”

Quinn grew suddenly keen, “No, you can’t get out of it that way. Being district attorney doesn’t make him human. Don’t you know his record? Don’t you know that he boasts that he has never prosecuted an innocent man; that there are scores of people left untried because the evidence against them didn’t satisfy him, even though he could probably have argued a jury into atomizing them? That happens to be so.”

Lanning’s thin cheeks quivered, “No, Quinn, no. There is nothing in the Rules of Robotics that makes any allowance for human guilt. A robot may not judge whether a human being deserves death. It is not for him to decide. He may not harm a human – variety skunk, or variety angel.”

Susan Calvin sounded tired. “Alfred,” she said, “don’t talk foolishly. What if a robot came upon a madman about to set fire to a house with people in it? He would stop the madman, wouldn’t he?”

“Of course.”

“And if the only way he could stop him was to kill him-”

There was a faint sound in Lanning’s throat. Nothing more.

“The answer to that, Alfred, is that he would do his best not to kill him. If the madman died, the robot would require psychotherapy because he might easily go mad at the conflict presented him – of having broken Rule One to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him.”

“Well, is Byerley mad?” demanded Lanning, with all the sarcasm he could muster.

“No, but he has killed no man himself. He has exposed facts which might represent a particular human being to be dangerous to the large mass of other human beings we call society. He protects the greater number and thus adheres to Rule One at maximum potential. That is as far as he goes. It is the judge who then condemns the criminal to death or imprisonment, after the jury decides on his guilt or innocence. It is the jailer who imprisons him, the executioner who kills him. And Mr. Byerley has done nothing but determine truth and aid society.

“As a matter of fact, Mr. Quinn, I have looked into Mr. Byerley’s career since you first brought this matter to our attention. I find that he has never demanded the death sentence in his closing speeches to the jury. I also find that he has spoken on behalf of the abolition of capital punishment and contributed generously to research institutions engaged in criminal neurophysiology. He apparently believes in the cure, rather than the punishment of crime. I find that significant.”

“You do?” Quinn smiled. “Significant of a certain odor of roboticity, perhaps?”

“Perhaps. Why deny it? Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”

Quinn sat back in his chair. His voice quivered with impatience. “Dr. Lanning, it’s perfectly possible to create a humanoid robot that would perfectly duplicate a human in appearance, isn’t it?”

Lanning harrumphed and considered, “It’s been done experimentally by U. S. Robots,” he said reluctantly, “without the addition of a positronic brain, of course. By using human ova and hormone control, one can grow human flesh and skin over a skeleton of porous silicone plastics that would defy external examination. The eyes, the hair, the skin would be really human, not humanoid. And if you put a positronic brain, and such other gadgets as you might desire inside, you have a humanoid robot.”

Quinn said shortly, “How long would it take to make one?”

Lanning considered, “If you had all your equipment – the brain, the skeleton, the ovum, the proper hormones and radiations – say, two months.”

The politician straightened out of his chair. “Then we shall see what the insides of Mr. Byerley look like. It will mean publicity for U. S. Robots – but I gave you your chance.”

Lanning turned impatiently to Susan Calvin when they were alone. “Why do you insist-?”

And with real feeling, she responded sharply and instantly, “Which do you want – the truth or my resignation? I won’t lie for you. U. S. Robots can take care of itself. Don’t turn coward.”

“What,” said Lanning, “if he opens up Byerley, and wheels and gears fall out – what then?”

“He won’t open Byerley,” said Calvin, disdainfully. “Byerley is as clever as Quinn, at the very least.”

The news broke upon the city a week before Byerley was to have been nominated. But “broke” is the wrong word. It staggered upon the city, shambled, crawled. Laughter began, and wit was free. And as the far off hand of Quinn tightened its pressure in easy stages, the laughter grew forced, an element of hollow uncertainty entered, and people broke off to wonder.

The convention itself had the air of a restive stallion. There had been no contest planned. Only Byerley could possibly have been nominated a week earlier. There was no substitute even now. They had to nominate him, but there was complete confusion about it.

It would not have been so bad if the average individual were not torn between the enormity of the charge, if true, and its sensational folly, if false.

The day after Byerley was nominated – perfunctorily, hollowly – a newspaper finally published the gist of a long interview with Dr. Susan Calvin, “world famous expert on robopsychology and positronics.”

What broke loose is popularly and succinctly described as hell.

It was what the Fundamentalists were waiting for. They were not a political party; they made pretense to no formal religion. Essentially they were those who had not adapted themselves to what had once been called the Atomic Age, in the days when atoms were a novelty. Actually, they were the Simple-Lifers, hungering after a life, which to those who lived it had probably appeared not so Simple, and who had been, therefore, Simple-Lifers themselves.

The Fundamentalists required no new reason to detest robots and robot manufacturers; but a new reason such as the Quinn accusation and the Calvin analysis was sufficient to make such detestation audible.

The huge plants of the U. S. Robot & Mechanical Men Corporation were a hive that spawned armed guards. It prepared for war.

