Kresh made a thoughtful little noise in his throat. “That’s what it comes down to, isn’t it?” He considered for a moment, and then went on. “Of course, the traditional Spacer response would be to do nothing at all. Let it alone, let it pass. If there is no way to know whether it would be better to act, why then, far better to leave the thing alone. If you do nothing, then there is nothing you can be blamed for if things go wrong.”
“Another proud legacy of the Three Laws,” Fredda said. “Be safe, do nothing, take no chances.”
“If the Three Laws teach humans, now and again, to avoid taking needless risks, I for one see that as a very strong argument in their favor,” said Donald, speaking for the first time. “But the First Law itself contains an injunction against inaction. A robot cannot stand idly by. It must act to prevent harm to humans.”
Kresh looked toward Donald with a smile. “Are you saying that a robot faced with this decision would choose to bring down the comet? Is that what you would do?”
Donald held up his hands, palms out, and shook his head vigorously. “By no means, Governor. I am quite literally incapable of making this decision. It would be a physical impossibility for me to do it, and more than likely suicidal to attempt it. So it would be for any properly constructed Three-Law robot.”
“How so?”
“The First Law enjoins us against doing harm to humans, and against inaction at such times when robotic action would prevent harm to humans.” Donald’s speech became labored as he went on. It was plain that even discussing the issues in a hypothetical context was difficult for him. “In this case, either action or inaction might or might not cause or prevent harm to humans. Attempting to deal with such a difficult problem, with the lives of so many present and potential humans in the bal-balance, would cause…would cause irreparable damage to any pospospositronic brain, as the question produced cascading First-Law/First-Law conflictzzz.” Donald’s eyes grew a bit dimmer, and his movements seemed oddly sluggish as he put his arms back down at his sides.
“All right, Donald,” said Kresh, in his firmest and most reassuring voice. He stepped over to the robot and put his hand on Donald’s sloping shoulder. “It’s all right. You are not the one who will have to make that decision. I order you to stop considering it at this time.” There were times when only the words of a robot’s direct master were enough to snap it out of such a state.
Donald’s eyes faded out all but completely for a moment, and then came back to their normal brightness. He seemed to be looking at nothing at all for a few seconds, but then his eyes began to track again, and he looked straight at Kresh. “Thank-thank you, sir. It was most unwise of me to consider the question so closely, even when called upon to do so.”
Kresh nodded absently, knowing that he had brought it on himself. He had asked Donald why a robot could not make such a decision, and a question was, in essence, an order. It required constant caution, endless care, to deal with the delicacy of a Three-Law robot’s sensibilities and sensitivities. Sometimes Kresh was deeply tired of it all. There were even times when he was ready to concede that the Settlers might have a point. Maybe some parts of life would be easier without robots.
Not as if they had such an option at the moment. But if robots could not be trusted to face such a situation… Kresh turned toward Donald again. “Donald, I hereby order you to turn around and face the wall, and to shut off all your audio inputs until you see my wife or me waving at you. Do you understand?”
“Yes, sir. Of course.” Donald turned his back on Kresh and Fredda. “I have now shut down my audio receptors.”
“Very good,” said Kresh. More damn fool precautions, but that couldn’t be helped. At least now Donald would be unable to hear them. Now they would be able to talk without fear of saying the wrong thing in front of the robot and accidentally setting up a damn fool First Law crisis. Kresh turned toward Fredda. “What about the Robotic Planetary Control Center?” he asked. “I wanted to consult with it, and with the Computational Planetary Control Center, before I reached a decision.”
“Well, what about them?” Fredda asked.
The two control centers were the heart of the reterraforming effort, performing all the calculations and analyses of each new project before it was launched. The original intent had been to build a single control center. There were two basic designs to choose between. One was a Settler-style computational unit, basically a massively complex and powerful, but quite nonsentient, computer. The other was a Spacer-style robotic unit that would be based on a hugely powerful positronic brain, fully imbued with the Three Laws. It would, in effect, be a robot mind without a robot body.
There had been a tremendous controversy over whether to trust the fate of the planet to a mindless machine, or to a robotic brain that would refuse to take necessary risks. It was easy to imagine a robotic control unit choosing to avoid harm to a single human rather than permit a project vital to the future of the planet to go forward. The robotics experts all promised that it didn’t work that way, but experts had been wrong before. Governor Grieg had died before he could reveal his choice between the two systems. In one of the first acts of his administration, Kresh had decided to build both, and to interconnect them so that the two systems worked in consensus with each other. In theory, if the two systems could not reach agreement on what to do, or not to do, they were to call in human referees to decide the issue. In practice, the two systems had agreed with each other far more often than anyone could have hoped. Thus far, there had been only a half dozen or so very minor issues that had required human decisions.
A vast planetary network of sensors and probes, orbiting satellites, mobile units, and on-site investigators, both robotic and human, fed a constant stream of information to both units, and both units fed back a constant stream of instructions and commands to the humans and robots and automatic machines in the field.
The two interconnected control centers were the only devices on the planet capable of handling the constant stream of incoming data and outgoing instructions. It was plain that the two of them would have to be consulted regarding the plan to drop a comet on the planet, but Kresh did not wish to risk the sanity of the robotic unit. “You saw what just happened to Donald,” he said. “Will I burn the Robotic Center out if I ask it what I should do?”
Fredda smiled reassuringly. “There wouldn’t be much point in having a Robotic Control Center that couldn’t consider risks to the planet without damaging itself,” she said. “It took some doing, but we installed some special…safeguards, shall we say, that should keep it from experiencing any serious First Law conflict.”
“Good, good,” said Kresh, a bit absently. “At least that’s one less thing to worry about. At least we know that part is all right.”
“Do we?” Fredda asked. “I wonder. When Lentrall asked me about Donald’s name, and how it was not from Shakespeare, that made me wonder.”
“Wonder what?”
“I was absolutely certain it was from Shakespeare. No doubt at all in my mind. I never bothered to double-check, any more than I would have bothered to double-check the spelling of my own name. I thought I knew it, and I was dead wrong.”
“We all make mistakes,” Kresh said.
“Yes, yes, of course,” Fredda said impatiently. “But that’s not the point. In a sense, it’s a trivial mistake. But it came out of a trusted database. Who knows how long ago the data got scrambled, or what else in it is wrong? And if that database can be wrong, lots of other things can be as well. What else is there that we think we know? What other hard fact that we think we have absolutely right will turn out to be absolutely dead wrong? What else do we think we know?”