“Not enough to hurt you,” his mother said patiently. “Not when he’s switched into standby mode like this. Would you like me to do it?”
“No, I’ll get it.” Derec reached inside again, but stopped when he heard Wolruf’s laugh. He looked up and saw her in the doorway.
“‘Ello.”
“Hi.” Grinning, Derec withdrew his hand from the robot and used it to gesture. “Mom, this is my friend Wolruf. Wolruf, this is my mother, Janet Anastasi.”
“Pleased to meet you,” Wolruf said, stepping forward and holding out a hand.
Janet looked anything but pleased to be so suddenly confronted with an alien, but she swallowed gamely and took the proffered appendage. “Likewise,” she said.
Wolruf gave her hand a squeeze and let go. Looking over Janet’s shoulder, she noticed a huddle of four robots in the far corner of the lab: three learning machines and Mandelbrot. They looked to be in communications fugue. Nodding toward them, she said, “I ‘eard Lucius ‘urt Avery some’ow.”
“That’s right,” Ariel said. “He was trying to protect Basalom, here. We’ve got him in psychotherapy, if you can call four robots in an argument psychotherapy. They’re trying to convince him it’s all right.”
“It is?” Wolruf asked.
“Well, not the actual act,” Derec said, “but the logic he used wasn’t at fault. He just made a mistake, that’s all. He thought he was protecting a human.” Derec outlined the logic Lucius had used, including the First and Zeroth Law considerations that had finally made him do what he’d done.
Wolruf listened with growing concern. The Zeroth Law was just the thing she’d hoped for to reassure her that taking robots home with her wouldn’t destroy her homeworld’s society, but if that same law let a robot injure its master, then she didn’t see how it could be a good thing.
“I don’t know,” she said. “Sounds like a bad tradeoff to me.”
“How so?” Janet asked.
“I’m wondering ‘ow useful all this is going to be. Right now I’m not sure about regular robots, much less ones who think they’re ‘uman.”
“What aren’t you sure about?”
Was Derec’s mother just being polite, or did she really want to know? Wolruf wondered if this was the time to be getting into all this, to bring up the subject of her going home and to get into all her reasons for hesitating, but she supposed there really wasn’t going to be a much better time. She knew what Derec and Ariel thought about the subject; maybe this Janet would have something new to say. “I’m not sure about taking any of these robots ‘ome with me,”
Wolruf said. “I’m not sure about w’at they might decide to do on their own, and I’m not sure about w’at might ‘appen to us even if they just follow orders.”
“I don’t understand.”
“She’s talking about protecting people from themselves,” Ariel said.
“Am I?”
“Sure you are. I’ve been thinking about it, too. The problem with robot cities is that they’re too responsive. Anything you want them to do, they’ll do it, so long as it doesn’t hurt anybody. The trouble is, they don’t reject stupid ideas, and they don’t think ahead.”
“That’s the people’s job,” Janet said.
“Just w’at one of the robots in the forest told me,” Wolruf said. “Trouble is, people won’t always do it. Or w’en they realize they made a mistake, it’ll be too late.”
Janet looked to Derec. “Pessimistic lot you run around with.”
“They come by it honestly,” he said, grinning. “We’ve been burned more than once by these cities. Just about every time, it’s been something like what they’re talking about. Taking things too literally, or not thinking them through.”
“Isn’t Central supposed to be doing that?”
“Central is really just there to coordinate things,” Derec said. “It’s just a big computer, not very adaptable.” He looked down at Basalom again, nodded to Ariel to have her shine the light inside again as well, and peered inside the robot’s shoulder. After a moment he found what he was looking for, reached gingerly inside, and grunted with the strain of pushing something stubborn aside. The something gave with a sudden click and the stump of the robot’s arm popped off, trailing wires.
“There’s also a committee of supervisory robots,” Ariel said, “but they don’t really do any long-range planning either. And they’re all subject to the Three Laws, so anybody who wants to could order them to change something, and unless it clearly hurt someone else, they’d have to do it.”
“No matter how stupid it was,” Janet said.
“Right.” Derec unplugged the wires between Basalom’s upper arm and the rest of his body.
Janet looked thoughtful. “Hmmm,” she said. “Sounds like what these cities all need is a mayor.”
“Mayor?” Wolruf asked.
“Old human custom,” Janet replied. “A mayor is a person in charge of a city. He or she is supposed to make decisions that affect the whole city and everyone in it. They’re supposed to have the good of the people at heart, so ideally they make the best decisions they can for the largest number of people for the longest period of time.”
“Ideally,” Wolruf growled. “We know ‘ow closely people follow ideals.”
“People, sure.” Janet waved a hand toward the four robots in the corner. “But how about dedicated idealists?”
Ariel was so startled she dropped the light. It clattered to the floor and went out, but by the time she bent down to retrieve it, it was glowing again, repaired.
“Something wrong, dear?” Janet asked her.
“You’d let one of them be in charge of a city?”
“Yes, I would.”
“And you’d live there?”
“Sure. They’re not dangerous.”
“Not dangerous! Look at what-”
“Lucius made the right decision, as far as I’m concerned.”
“Maybe,” Ariel said. “What worries me is the thought process he went through to make it.” She clicked off the light; Derec wasn’t working on Basalom anymore anyway. He was staring at Ariel and Janet as if he’d never heard two people argue before. Ariel ignored his astonished look and said, “The greatest good for the greatest number of people. That could easily translate to ‘the end justifies the means.’ Are you seriously suggesting that’s a viable operating principle?”
“We’re not talking an Inquisition here,” Janet said.
“But what if we were? What if the greatest good meant killing forty-nine percent of the population? What if it meant killing just one? Are you going to stand there and tell me it’s all right to kill even one innocent person in order to make life easier for the rest?”
“Don’t be ridiculous. That’s not what we’re talking about at all.”
It took conscious effort for Ariel to lower her voice. “It sure is. Eventually that sort of situation is going to come up, and it scares the hell out of me to think what one of those robots would decide to do about it.”
Janet pursed her lips. “Well,” she said, “why don’t we ask them, then?”
Lucius looked for the magnetic containment vessel he was sure must be waiting for him somewhere. Not finding one, he looked for telltale signs of a laser cannon hidden behind one of the walls. He didn’t find that, either, but he knew there had to be something he couldn’t see, some way of instantly immobilizing him if he answered wrong. The situation was obviously a test, and the price of failure was no doubt his life.
He’d been roused out of comlink fugue and immediately barraged with questions, the latest of which was the oddest one he’d ever been asked to consider, even by his siblings.
“Let me make sure I understand you,” he said. “The person in question is not a criminal? He has done no wrong? Yet his death would benefit the entire population of the city?”
“That’s right.”
Ariel’s stress indicators were unusually high, but Lucius risked his next question anyway. “How could that be?”
“That’s not important. The important thing is the philosophical question behind it. Would you kill that person in order to make life better for everyone else?”