In answer to Derec’s question, Adam said, “The violet potential schematic corresponds to the Laws of Humanics. The blue one is the Zeroth Law of Robotics.”

“Beg your pardon?” Janet asked. “Laws of Humanics? Zeroth Law? What are you talking about?”

Her learning machine looked over at her and said, “We have attempted to develop a set of laws governing human behavior, laws similar to the ones that govern our own. They are, of course, purely descriptive rather than compulsory, but we felt that understanding them might give us an understanding of human behavior which we otherwise lacked. As for the Zeroth Law, we felt that the Three Laws were insufficient in defining our obligations toward the human race in general, so we attempted to define that obligation ourselves.”

Janet was careful not to express the joy she felt, for fear of influencing the robot somehow, but inside she was ecstatic. This was perfect! Her experiment was working out after all. Her learning machines had begun to generalize from their experiences. “And what did you come up with?” she asked.

“Bear in mind that these laws describe potential conditions within a positronic brain, so words are inadequate to describe them perfectly; however, they can be expressed approximately as follows. The First Law of Humanics: All beings will do that which pleases them most. The Second Law of Humanics: A sentient being may not harm a friend, or through inaction allow a friend to come to harm. The Third Law of Humanics: A sentient being will do what a friend asks, but a friend may not ask unreasonable things.” He paused, perhaps giving Janet time to assimilate the new laws’ meanings.

Not bad. Not bad at all. Like he’d said, they certainly weren’t compulsory as far as most humans went, but Janet doubted she could have done any better. “And what is your Zeroth Law?” she asked.

“That is much more difficult to state in words, but a close approximation would be that any action should serve the greatest number of humans possible.” Adam nodded toward Lucius. “Lucius has taken the Law a step farther than Eve or I, and we believe it was that step which led him to do what he did to Dr. Avery. He believes that the value of the humans in question should also be considered.”

Eve. She’d guessed right. “And you don’t?”

Adam raised his arms with the palms of his hands up. It took Janet a moment to recognize it as a shrug, since she’d never seen a robot use the gesture before. Adam said, “I am…uncomfortable with the subjectivity of the process. I had hoped to find a more definite operating principle.”

“But Lucius is satisfied with it.”

“That seems to be the case.”

“Why do you suppose he is and you aren’t?”

“Because,” Adam said, again hesitating. “Because he believes himself to be human.”

If the robot were hoping to shock her with that revelation, he was going to be disappointed. Janet had expected something like this would happen from the start; indeed, in a way it was the whole point of the experiment. She waited patiently for the question she knew was coming.

Adam didn’t disappoint her. He looked straight into her eyes with his own metallic ones and said, “Each of us has struggled with this question since we awakened, but none of us have been able to answer it to our mutual satisfaction. You created us, though. Please tell us: are we human?”

Janet used the same palms-up gesture Adam had used. “I don’t know. You tell me.”

Adam knew the sudden surge of conflicting potentials for what it was: frustration. He had experienced enough of it in his short life to recognize it when it happened. This time the frustration came from believing his search for truth was over and suddenly finding that it wasn’t.

He felt a brief Second Law urge to answer her question with a simple declarative statement, but he shunted that aside easily. She obviously wanted more than that, and so did he. She wanted to see the reasoning behind his position; he wanted to see if that reasoning would withstand her scrutiny.

He opened a comlink channel to Eve and explained the situation to her. Together they tried to establish a link with Lucius, but evidently the five volts Derec was supplying him hadn’t been enough to wake him. They would have to do without his input. Adam wasn’t all that disappointed; Lucius’s reasoning had led him to violate the First Law.

Janet was waiting for Adam’s response. Carefully, consulting with Eve at every turn, he began to outline the logic that had led them to their conclusion that any intelligent organic being had to be considered human. He began with his own awakening on Tau Puppis IV and proceeded through the incident with the Ceremyons, through Lucius’s experiments in creating human beings in Robot City, through the robots’ return to Tau Puppis and their dealings with the Kin, to their final encounter with Aranimas. He explained how each encounter with an alien being reinforced the robots’ belief that body shape made no difference in the essential humanity of the mind inside it, and how those same contacts had even made differences in intelligence and technological advancement seem of questionable importance.

Throughout his presentation, Adam tried to judge Janet’s reaction to it by her facial expression, but she was giving nothing away. She merely nodded on occasion and said, “I’m with you so far.”

At last he reached the concept of Vitalism, the belief that organic beings were somehow inherently superior to electromechanical ones, and how the robots could find no proof of its validity. He ended with, “That lack of proof led Lucius to conclude that Vitalism is false, and that robots could therefore be considered human. Neither Eve nor I, nor Mandelbrot for that matter, was able to convince ourselves of this, and now that Lucius’s belief has led him into injuring a human, we feel even less comfortable with it. We don’t know what to believe.”

Adam waited for her response. Surely she would answer him now, after he had laid out the logic for her so meticulously.

His frustration level rose to a new height, however, when she merely smiled an enigmatic smile and said, “I’m sure you’ll figure it out.”

Derec felt just as frustrated as Adam. He had hoped that finding his mother would knock loose some memories from his amnesic brain, but so far nothing had come of the encounter except a vague sense of familiarity that could be easily attributed to her similarity to Avery.

She seemed just like him in many ways. He was a competent roboticist, and so was she. Avery never divulged information to anyone if he could help it, and evidently neither did she. Avery was always testing someone, and here she stood, leading poor Adam on when it was obvious she didn’t know the answer to his question either.

He glanced up at the monitor, checking to see if the signal was any clearer. While Janet and Adam had been talking, he had been trying to trace another unfamiliar potential pattern in Lucius’s brain, this one an indistinct yellow glow surrounding an entire level of activity, but the monitor’s trace circuitry couldn’t isolate the thought it represented. Whatever it was, it fit none of the standard robotic thought patterns.

He heard Janet say, “I’m sure you’ll figure it out,” and took that as his cue. “Adam, maybe you can help me figure this out. What’s that pattern represent?”

Adam looked up to the monitor. “I do not recognize it,” he said.

“Can you copy it and tell me what it does?”

“I do not wish to contaminate my mind with Lucius’s thought patterns.”

“Put it in temporary storage, then.”

Adam looked as if he would protest further, but either the Second Law of Robotics or his belief that Derec would follow the Third Law of Humanics made him obey instead. He fixed his gaze on the monitor for a moment, then looked away, toward the wall.
