The problem was one of ambiguity. A good operating principle needed to be clear and concise if it was to be of any value in a crisis, yet every time they attempted to distill a simple statement of truth out of the jumble of data, they found themselves faced with logical loopholes allowing, sometimes even demanding, unacceptable behavior.
The best definition they had come up with yet, based upon Dr. Avery’s recent destruction of the ship belonging to the pirate Aranimas, stated simply that the number of people served by an action determined the relative propriety of that action. On first consideration it seemed to hold up in Avery’s case; if he hadn’t stopped Aranimas, then Aranimas would have killed not only Avery, Derec, Ariel, and Wolruf, but an entire city full of the alien Kin as well. But when one added into the equation the other crew members on board Aranimas’s ship who had also died in the explosion, the balance logically tipped the other way. The ship had been enormous, much larger than the city. It almost certainly had a population commensurate with its size. And if that was the case, then more lives would have been saved if they had not resisted.
Granted, those lives were not human lives, not by the strictest definition of the term, but the robots had long since decided that a narrow definition was functionally useless. Any intelligent organic being had to be considered human if one was to avoid genocidal consequences arising from a “true” human’s casual order.
The robots might have argued that no one had expected to destroy the pirate ship with a single bomb, but the humans in the city, Wolruf included, seemed to feel even after the fact that disabling Aranimas and killing all his crew was preferable to sacrificing themselves. They were so certain of it that the robots could only accept their certainty as right, meaning generally accepted human behavior, and try to factor it somehow into the Zeroth Law.
They communicated via comlink, information flowing at thousands of times the rate possible using normal speech, but so far that speed had not helped them solve the dilemma.
I believe we need to consider the value of the individual humans in question, Lucius sent. When we factor in value, the equation balances.
But how can we assign a human a value? Adam asked. All are considered equal, in their own law as well as our programming.
Not so, Lucius replied. Not all human law holds them equal. Furthermore, we are allowed to exercise judgment in our response to orders, so that we need not follow those of the insane or the homicidal. That suggests the possibility that humans can be assigned a relative worth based upon the quality of their orders to robots. Since their orders reflect their intentions, we can assume that those intentions could be used to determine their relative value in lieu of direct orders.
Without agreeing or disagreeing, Eve sent, I point out that humans change over time. Take Dr. Avery for example. When we first encountered him, he was openly murderous, but he has gradually grown less so until just recently he risked his own life to save those of his shipmates. How can we assign a value to a changing quantity?
After a few nanoseconds’ hesitation, Lucius replied, Everything changes, even inanimate objects. A quantity of sand may later become a window, yet we do not worry about protecting sand, nor the window after it has broken. Only its current value is important.
What about old people? Adam sent. Are old people inherently less valuable than young, then?
Women and children traditionally get the first seats in a lifeboat, Lucius pointed out.
True. Still, I am uncomfortable with the concept of value judgment. I don’t believe it’s a robot’s place to decide.
But if we are to follow a Zeroth Law, we have no choice. We must
THIRD LAW OVERRIDE. The warning swept into their collective consciousness like a tidal wave, obliterating their conversation. THIRD LAW OVERRIDE. One of them was being damaged.
It took only an instant to separate out the source of the signal: it was coming from Lucius. Just as quickly, Lucius abandoned the comlink and accessed his somatic senses again. The data line leading to and from his right leg was awash in conflicting signals. He powered up his eyes, swiveled them downward, and saw Dr. Avery holding his severed leg in one hand and a cutting laser in the other, a malevolent grin spread across his face.
Lucius’s reaction was immediate: he kicked off with his good leg and pushed with his arms to put some distance between himself and Avery, at least until he could figure out what was happening. The moment he began to move, however, an intense magnetic field shoved him back into place. It didn’t stop there, but squeezed him tighter and tighter, deforming his arms, his one remaining leg, even his eyes, until he was once again an undifferentiated ball, as he had been when he first achieved awareness. The magnetic field was too strong to fight, and growing stronger yet. Now it was even interfering with his thought processes. Lucius felt a brief moment of rising panic, and then he felt nothing at all.
Still in her ship, Janet frowned at the viewscreen. The winking marker on the deep radar image had just stopped winking.
“Basalom, get that back on the screen,” she ordered. They had stayed in orbit long enough to run a quick scan for her learning machines, and they had scored a hit almost immediately.
“We have lost the signal, Mistress,” the robot replied.
“Lost the signal? How could we lose the signal? All three of them were coming in loud and clear just a second ago.”
“I don’t know, Mistress, but we are no longer receiving the learning machines’ power signatures.” Basalom worked at the controls for a moment, watching a panel-mounted monitor beside them. Presently he said, “Diagnostics indicate that the problem is not in our receiving equipment.”
“It has to be. They couldn’t just stop. Those are their power packs we’re tracking.”
“Perhaps they’ve shielded them somehow,” Basalom suggested.
“From neutrino emission? Not likely.”
“That is the only explanation. Unless, of course…”
“Unless what?” Janet demanded. She knew why Basalom had paused; he always had trouble delivering news he thought might disturb her. It was a consequence of his ultrastrong First Law compulsion to keep her from harm, and Janet continually wondered whether she had made a mistake in enhancing it quite so much. “Out with it,” she ordered.
“Unless they ceased functioning,” Basalom finally managed.
“Impossible. All three, all at once?” Janet shook her head, gray-blond hair momentarily obscuring her eyes until she shoved it aside. “The odds against that are astronomical.”
“Nonetheless,” Basalom persisted, now that he had been ordered to do so, “only shielding or cessation of function could explain their disappearance from the tracking monitor.”
Janet’s only answer was to scowl at the screen. She ran her hands through her hair again, then asked, “Did you get an exact fix on their location before we lost contact?”
“I did, Mistress.”
“Good. Take us down somewhere close. I want to go have a look.”
“That would be unwise,” Basalom protested. “If they did cease functioning, it might have been the result of a hostile act. It would be foolish to go into the same area yourself.”
Janet hated being coddled by her own creations, but she hadn’t lived to have gray hair by taking stupid risks, either, and Basalom was right. Going into an area where something might have destroyed three robots was a stupid risk.
“Okay,” she said. “Take us down a little farther away, then. And once we’re down, you can go have a look.”