It still hadn’t seen her. It was tracking her by scent, its nose to the ground, looking up frequently to check its surroundings. It was hard to tell with an alien beast, but Wolruf thought the wolf seemed overly jumpy, as if it were nervous. A bird called from somewhere off to its right, and it shied away as if the song had been a growl instead. Good. If it was already afraid of the unknown, then Wolruf’s plan stood a good chance of working. She waited, flexing her fingers on the vine, until the wolf was only a few paces away from the spot where she would cross the trail, then with a bloodcurdling howl she leaped from the branch and swung down toward it.

The wolf did a most amazing thing. Instead of running, at Wolruf’s cry every appendage in its body flexed convulsively, as if the poor beast had just stepped on a live electrical wire. From its crouched position its flinch propelled it completely off the ground, way off the ground, high enough to put it directly in Wolruf’s path.

The two projectiles eyed each other in mutual astonishment, the last few meters of space between them vanishing in stunned silence, silence ending in a soft, furry thud, then another thud as both of them tumbled to the ground.

“Mistress Wolruf! Are you all right? Oh, they’re going to melt my brain for this! Mistress Wolruf? Mistress Wolruf?”

Wolruf rolled to her feet and glared down at the “wolf.” It was a rather pitiful wolf now, with one whole side of its body caved in like a squashed drink can. But even as Wolruf watched, the dent filled back out until the wolf took on its former shape.

“You,” Wolruf growled. “You tricked me.”

The wolf opened its fanged mouth to speak, but the voice was that of a standard-issue Robot City robot. “Are you all right?” it asked.

Wolruf snorted. “Wounded dignity is all,” she admitted. “W’y did you lead me on a chase? You did it on purpose, didn’t you?”

“Yes, I did,” the robot said. “I was trying to satisfy your wishes, but I must have misunderstood your call. I thought you were asking for something to hunt. Was I in error?”

“Yes. No. Aaa-rrr!” Wolruf growled in frustration. “Okay, so I was. But I didn’t know it until after you answered, and even then I thought I was ‘unting a real animal.”

The robot wolf nodded its head. “I’m sorry I spoiled the illusion. I’m afraid I don’t make a very good wolf.”

Wolruf brushed crumpled leaves from her pelt before grudgingly replying, “You did all right. Kept me going for quite a w’ile, anyway.”

The robot acted as if it didn’t hear her. “It’s so difficult being a wolf,” it went on. “You know the role a wolf plays in an ecosystem?”

“No,” Wolruf admitted. “No, I don’t. What do you do?”

“I am supposed to cull the weak and the sickly animals from their species’ populations. This is supposed to improve the overall health of the species. The remains of my… kills… also feed scavengers who might otherwise starve. I understand this, yet it is difficult for me to make the decision to kill a biological creature merely because it is sick.”

That would be tough for a robot, Wolruf supposed. Robots could kill anything but a human, but they seldom did except under direct orders, and this robot was operating on a pretty tenuous connection to Derec’s original order. Yet killing things was part of a normal ecosystem. You couldn’t have one without predators.

But how well did all this resemble a true ecosystem, anyway? “Are there real animals ‘ere?” Wolruf asked.

The wolf nodded. “Most of the smaller species have been populated by real organisms, as have some larger animals whose growth we were able to accelerate.”

“Like birds.” It wasn’t a question, just a statement of certainty.

“Like birds, yes.” The robot paused, then said, “I apologize on behalf of the entire city for the condors.”

“Is that w’at they were?”

“Yes. This area around the Compass Tower, since the tower disturbed the biome by its very existence, was designated an experimental zone. The condor is an extinct species we thought to reintroduce and study in the hope of determining their value. That project has since been terminated.”

“Don’t kill them,” Wolruf said quickly. “That’s an order. Our crash was my fault.”

“If you say so, Mistress.” The robotic wolf waited patiently for further orders.

Wolruf suddenly felt silly, standing in the middle of a forest and talking with a robot wolf. She turned to go, but realized just as suddenly that she was lost. She could probably follow her own trail back to the Compass Tower, but she would have to retrace every twist and turn if she did, adding hours to the walk. She felt hot and sticky from running already; what she wanted now was just to go home by the most direct route and take a nice, long, hot shower.

Embarrassed, she turned back to the robot. “What’s the quickest way ‘ome from ‘ere?” she asked.

Without hesitation, the robot said, “Take an elevator down to the city and ride the slidewalk.”

“‘Ow do I find an elevator?” That, at least, was a legitimate question.

“Any of the larger trees will provide one upon request,” the robot replied.

Wolruf nodded. Of course. If the wolves were robots, then the trees would be elevators. She should have guessed.

Dr. Avery smiled as he prepared for surgery. The wolf robot could have learned a thing or two from that smile; it was the perfect expression of a predator absorbed in the act of devouring his prey. Avery wore it like a pro, unselfconsciously grinning and whistling a fragment of song while he worked.

The robots were yielding up secrets. Avery had all three of them on diagnostic benches, inductive monitors recording their brain activity while they continued to carry on their three-way conference. He had already captured enough to determine their low-level programming; after a little more recording of higher-level activity, he would be able to play back their cognitive functions through a comparative analyzer and see graphically just how that programming affected their thinking.

That wasn’t his main goal, however. Their programming was a minor curiosity, nothing more; what interested Avery was their physical structure. He was preparing to collect a sample so he could study it and determine the differences between it and the version of dianite he had used for his cities. He had already taken a scraping and gotten a few semi-autonomous cells, but he had quickly ascertained that their power lay not in the individual cells themselves but in the way they organized on a macroscopic scale. In short, he would need a bigger sample; one he could feed test input to and watch react. An arm or a leg should do nicely, he supposed.

He suspected that slicing off an appendage would probably be stimulus enough to jar at least the individual robot involved out of its preoccupation with the comlink. He also had his doubts that any of the robots, once awakened, would obey his orders to remain on the examination tables. They needed only to decide that he didn’t fit their current definition of “human,” and they would be free to do what they wanted, but he had taken care of that eventuality: since normal restraints were ineffective against a robot who could simply mold its body into a new shape and pull free, Avery had placed around each robot a magnetic containment vessel strong enough to hold a nuclear reaction in check. If they woke, the containment would come on automatically. Nothing was leaving those tables.

Of course the intense magnetic fields would probably fry the delicate circuitry in the robots’ positronic brains, but that was a minor quibble. In the unlikely event that he needed to revive one, well, he already had their programming in storage, and brains were cheap.

The triple consciousness that comprised Adam, Eve, and Lucius had reached an impasse. For days now they had been locked in communication, ignoring the outside world in order to devote their full attention to a burning need: to define what they called the Zeroth Law of Robotics. They already had their original Three Laws, which ordered them to protect humans, obey humans, and preserve themselves to serve humans, but those were not enough. They wanted a single, overriding principle governing a robot’s duties to humanity in general, a principle against which they could measure their obligations to individual humans. They had formulated thousands of versions of that principle, but had yet to agree upon one. Worse, they had also failed to integrate any version of it into their still-evolving Laws of Humanics, a set of admittedly idealistic rules describing the motivations behind human behavior.

