Donald hesitated a full half second before responding. Kresh thought he could almost see the First and Second Law potentials battling it out with each other. “Yes, sir,” he said at last, and handed over the phone.
It was a sign of just how rattled Donald was that he would kick up such a fuss over such a minor point. The sight of the Governor’s corpse had upset man and robot alike. Both of them knew that this was not merely a dead man; it was, in all probability, a whole planet suddenly thrown into peril.
With a beep and a click-tone the phone line connected. “Um-um-hello?”
Kresh recognized Fredda’s voice, sleepy and a bit muddled. “Dr. Leving, this is Sheriff Kresh. I’m afraid I must ask you to return to the Residence immediately, and to bring whatever technical equipment you have with you. I need you to examine some, ah-damaged robots.” It was a clumsy way to put it, but Kresh couldn’t think of anything else he could say on an unsecured line.
“What?” Fredda asked. “I’m sorry, what did you say?”
“Damaged robots,” Kresh repeated. “I need you to perform a fast, discreet examination. It is a matter of some urgency.”
“Well, all right, I suppose, if you say it’s urgent. It will take me a while to get to the Limbo Depot robotics lab and collect some examination gear. I didn’t bring anything with me. I’ll get there as fast as I can.”
“Thank you, Doctor.” Kresh handed the handset back to Donald. “Well?” he asked.
“Sir, I withdraw my objections. You were indeed correct. My voice-stress monitoring indicated no undue reaction to a call from you from the Residence at this hour. Either she has no idea whatsoever of what has happened, or she is a superb actress, a talent I have never before observed in Dr. Leving.”
“Once in a while, Donald,” Kresh said, “you might try taking my word on questions of human behavior.”
“Sir, with all due respect, I have found no topic of importance wherein questions so utterly outnumber the answers.” Kresh gave the robot a good hard look. Had Donald just made a joke?
Prospero, Fredda told herself as she hurried to get ready. It had to be something to do with Prospero. Why else would Kresh be there at this hour, and calling her in? Something must have gone wrong with Prospero. Fredda Leving had hand-built the New Law robot, and programmed his gravitonic brain herself. She remembered how much of a pleasure it had been to work on the empty canvas of a gravitonic unit, with the chance to make bold strokes, work out whole new solutions, rather than being strait-jacketed by the limitations and conventions and excessive safety features of the positronic brain.
Ever since the long-forgotten day when true robots had first been invented, every robot ever built had been given a positronic brain. All the endless millions and billions of robots made in all those thousands of years had relied upon the same basic technology. Nothing else would ever do. The positronic brain quite literally defined the robot. No one would consider any mechanical being to be a robot unless it had a positronic brain-and, contrariwise, anything that contained a positronic brain was considered to be a robot. The two were seen as inseparable. Robots were trusted because they had positronic brains, and positronic brains were trusted because they went into robots. Trust in robots, and trust in positronic brains, were articles of faith.
The Three Laws were at the base of that faith. Positronic brains-and thus robots built with such brains-had the Three Laws built into them. More than built in: They had the Laws woven into them. Microcopies of the Laws were everywhere inside a positronic brain, strewn across every pathway, so that every action, every thought, every external event or internal calculation moved down pathways shaped and built by the Laws.
Every design formula for the positronic brain, every testing system, every manufacturing process, was built with the Three Laws in mind. In short, the positronic brain was inseparable from the Three Laws-and therein lay the problem.
Fredda Leving had once calculated that thirty percent of the volume of the average positronic brain was given over to pathing linked to the Three Laws, with roughly a hundred million microcopies of the Laws embedded in the structure of the average positronic brain, before any programming at all was done. As roughly thirty percent of positronic programming was also given over to the Three Laws, the case could be made that every one of those hundred million microcopies was completely superfluous. Fredda’s rough estimate was that fifty percent of the average robot’s nonconscious and preconscious autonomous processing dealt with the Laws and their application.
The needless, excessive, and redundant Three-Law processing resulted in a positronic brain hopelessly cluttered with nonproductive processing, and a marked reduction of capacity. It was, as Fredda liked to put it, like a woman forced to interrupt her thoughts on the matter at hand a thousand times a second in order to see if the room were on fire. The excessive caution did not enhance safety, but it did produce drastically reduced efficiency.
But everything in the positronic brain was tied to the Three Laws. Remove or disable even one of those hundred million microcopies, and the brain would react. Disable more than a handful, and the brain would fail altogether. Try to produce positronic programming that did not include endless redundant checks for First, Second, and Third Law adherence, and the hardwired, built-in copies of the Three Laws would cause the positronic brain to refuse the programming and shut down.
Unless you threw out millennia of development work and started from scratch with a lump of sponge palladium and a hand calculator, there was no way to step clear of the ancient technology and produce a more efficient robot brain.
Then Gubber Anshaw invented the gravitonic brain. It was light-years ahead of the positronic in processing speed and capacity. Better still, it did not have the Three Laws burned into its every molecule, cluttering things up. The Three Laws could be programmed into the gravitonic brain, as deeply as you liked, but with only a few hundred copies placed in the key processing nodes. In theory, a few hundred copies were more liable to failure than the millions in a standard positronic brain. In practice, the difference between ten billion to one and ten trillion to one was meaningless. Gravitonic Three-Law brains were, for all purposes, as safe as positronic ones.
But, because the Three Laws were not implicit in every aspect of the gravitonic brain’s design and construction, the other robotics laboratories had refused to deal with Gubber Anshaw or his work. Building a robot that did not have a positronic brain was about as socially acceptable as cannibalism, and no appeal to logic or common sense could make the slightest difference.
Fredda Leving, however, had been more than eager to experiment with the gravitonic brain-but not because she had any interest in improved efficiency. Long before Gubber Anshaw had come to her, she had been brooding over much deeper issues regarding the Three Laws, and the effects they had on human-robot relations-and therefore on humans themselves.
Fredda had concluded, among other things, that the Three Laws stole all human initiative and served to discourage risk to an unhealthy degree by treating the least of risks of minor injury exactly the same as an immediate danger to life and limb. Humans learned to fear all danger, and eschew all activity that had the slightest spice of hazard about it.
Fredda had, therefore, formulated the four New Laws of Robotics, as a matter of mere theory, little realizing that Gubber Anshaw would come along and give her a chance to put it all into practice. Fredda had built the first New Law robots. Tonya Welton had gotten wind of the New Law project, and insisted that New Law robots be used on Purgatory. Welton had liked the idea of robots that were neither slaves nor in control over their masters’ lives. And, perhaps, the fact that she was sleeping with Gubber Anshaw had something to do with it.