Chapter 5. Human Nature
Wolruf woke to bright sunlight striking her full in the face. She raised her head, sniffing the air, but it was the same dead, boring, metallic-smelling air she’d come to associate with the city. She squinted into the sunlight and saw that it came from a viewscreen. She growled a curse. She’d been dreaming of home again, a home full of others of her own kind; a busy, happy place full of the noise and smells and sights of people doing things. To wake up here in this silent metal cell was an insult to the senses.
She stretched her arms and yawned, still tired. Despite the dreams of home, she had slept poorly, as she had for... how long? Months? She hadn’t been counting. Still, she didn’t think she’d ever been so restless in her life. She knew what was causing it: too much time away from her own kind, and her recent experiences with a species that was close to her both physically and socially. But knowing the cause didn’t make it go away. And hearing Derec talk about his mother didn’t help, either. His open enthusiasm at the prospect of regaining a bit of his past had only reminded Wolruf of what she still missed.
But she didn’t need to stay away any longer. Now that Aranimas was out of the picture, and with him her obligation to work off the family debt in his service, she could go back any time she wanted. Her family would welcome her openly, especially so if she brought with her this robot technology of Avery’s.
That was the problem, the one factor in the equation that refused to come clear for her. Should she take robots home with her and start an economic and social upheaval that would surely disrupt the normal pace of life there, or should she keep them secret, forget about her time among robots, and just go back to the home she remembered so fondly? And what would happen if she did that? Was Ariel right? Would her home become a backward place, an enclave of curiously anachronistic behavior, while the rest of the galaxy developed in ways her people would eventually be unable even to comprehend?
Wolruf didn’t know what to believe, nor why the choice had to be hers. She had never asked for that kind of power over her own people.
With a sigh, she got up, showered, and stood under the blow drier until she could feel its heat against her skin. She laughed at her image in the mirror: she looked twice her usual size and puffy as a summer cloud, but a quick brushing restored her coat to its usual smoothness.
All her thoughts of home made her consider another piece of the puzzle as well, and she turned to the intercom panel beside her bed and said, “Central, what ’as ’appened to my ship, the Xerborodezees? ’Ave you kept it for me?”
“It has been stored, but can be ready for use with a day’s notice. Do you wish us to prepare it for you?”
“Not yet. Maybe soon, though. Thanks.”
“You are welcome, Mistress Wolruf.”
Wolruf felt a bit of her tension ease. If she decided not to take any of the new technology home with her, she would need the Xerbo, for as far as she knew, it was the only noncellular ship on the planet. She considered going to check on it herself, wherever it might be stored, but decided not to. There was no reason to doubt Central’s word about it.
She opened the door and padded out into the kitchen to get breakfast. The apartment was silent; Derec and Ariel were still asleep, and the robots were being quiet wherever they were. As Wolruf stood before the automat, trying to decide between her four favorite breakfasts, she realized how much she had grown used to the human way of doing things. She hadn’t even considered cooking her own meal. She had fallen completely out of the habit. Nor had she shopped for food, or anything else for that matter, since she had come into Derec and Ariel’s company.
Was that necessarily bad? Wolruf’s kind had been hunting and farming their food for millennia, and probably shopping for nearly as long; maybe it was time to move on to other things.
Maybe. But how could she know for sure?
From his place in the living room, seated on one of the couches, Lucius was aware of Wolruf entering the dining room with her breakfast. He sensed the others’ awareness as well; their comlink network paused momentarily while each of them gauged the relative degree of threat she presented to them. It was an inconvenience, this constant state of alert; it slowed their rate of exchange, but they were taking no more chances with a complete fugue state.
Wolruf presented no immediate threat. The silent network continued where it had left off, with Adam speaking.
Consider the distinction between ‘sufficient’ and ‘necessary’ conditions, he said. We have already concluded that if a being is both intelligent and organic, then it is functionally human, but those are merely sufficient conditions. They are not necessary conditions. They contain an inherent prejudice, the assumption that an organic nature can somehow affect the quality of the intelligence it houses. I call that concept ‘Vitalism,’ from the ancient Terran belief that humans differed from animals through some ‘vital’ spark of intelligence. You should note that while the concept has historically been considered suspect, it has neither been proven nor disproven. Lucius has pointed out that if Vitalism is false, then the only necessary condition for humanity is intelligence. Discussion?
Eve said, Derec has already hinted that this may be so. On the planet we call Ceremya, he indicated that Lucius could consider himself human if he wished.
Mandelbrot had been included in their discussion this time. He said, I believe he was being sarcastic. He often is. But even if he meant what he said, you also remember the outcome of that redefinition. If Lucius considers himself human, then he must still follow the orders of other humans. Functionally, he only increases his burden to include other robots as potential masters.
That is true; however, I have discovered another consequence, said Lucius. If I consider myself human, then the Third Law becomes equal to the First. I can no more allow harm to myself than to any other intelligent being. I consider that an improvement over the interpretation of the laws wherein a human could order me to dismantle myself, and I would have to obey.
I don’t believe you would obey such an order anyway, said Mandelbrot.
I would attempt to avoid it by denying the humanity of the being in question, Lucius admitted. With Avery or Wolruf I would probably succeed, but as things stand, if Derec or Ariel were to order it, the compulsion might force me to obey.
Perhaps the Zeroth Law would provide an alternative, Mandelbrot said.
Immediately, both Adam and Eve said, No. Eve continued, saying, Let’s leave the Zeroth Law out of it for now.
You can’t make it go away by ignoring it, Lucius said. The Zeroth Law applies here. If we consider our duty to humanity in general, then we can easily conclude that dismantling ourselves would be of little use in the long term. However, possible long-term advantage does not outweigh a definite Second Law obligation to obey. Depending upon the value of the human giving the order, we might still be forced to follow it. But if we consider ourselves human, and thus part of humanity, then disobeying an order to self-destruct saves one human life immediately and also allows us to serve humanity in the future. The Second Law obligation to obey is then safely circumvented.
Safely for whom? Adam asked. What if your destruction would save the human giving the order? Suppose, for instance, the bomb that Avery used to destroy Aranimas’s ship had to be detonated by hand instead of by a timed fuse. We have already agreed that destroying the ship was acceptable under the Zeroth Law, but what if we factor in the humanity of the fuse?