II

Judith O’Neill served hot black coffee to the people sitting around the living room. Her husband talked while the others listened. O’Neill was as close to being an authority on the autofac system as could still be found.

In his own area, the Chicago region, he had shorted out the protective fence of the local factory long enough to get away with data tapes stored in its posterior brain. The factory, of course, had immediately reconstructed a better type of fence. But he had shown that the factories were not infallible.

“The Institute of Applied Cybernetics,” O’Neill explained, “had complete control over the network. Blame the war. Blame the big noise along the lines of communication that wiped out the knowledge we need. In any case, the Institute failed to transmit its information to us, so we can’t transmit our information to the factories—the news that the war is over and we’re ready to resume control of industrial operations.”

“And meanwhile,” Morrison added sourly, “the damn network expands and consumes more of our natural resources all the time.”

“I get the feeling,” Judith said, “that if I stamped hard enough, I’d fall right down into a factory tunnel. They must have mines everywhere by now.”

“Isn’t there some limiting injunction?” Perine asked nervously. “Were they set up to expand indefinitely?”

“Each factory is limited to its own operational area,” O’Neill said, “but the network itself is unbounded. It can go on scooping up our resources forever. The Institute decided it gets top priority; we mere people come second.”

“Will there be anything left for us?” Morrison wanted to know.

“Not unless we can stop the network’s operations. It’s already used up half a dozen basic minerals. Its search teams are out all the time, from every factory, looking everywhere for some last scrap to drag home.”

“What would happen if tunnels from two factories crossed each other?”

O’Neill shrugged. “Normally, that won’t happen. Each factory has its own special section of our planet, its own private cut of the pie for its exclusive use.”

“But it could happen.”

“Well, they’re raw material-tropic; as long as there’s anything left, they’ll hunt it down.” O’Neill pondered the idea with growing interest. “It’s something to consider. I suppose as things get scarcer—”

He stopped talking. A figure had come into the room; it stood silently by the door, surveying them all.

In the dull shadows, the figure looked almost human. For a brief moment, O’Neill thought it was a settlement latecomer. Then, as it moved forward, he realized that it was only quasi-human: a functional upright biped chassis, with data-receptors mounted at the top, effectors and proprioceptors mounted in a downward worm that ended in floor-grippers. Its resemblance to a human being was testimony to nature’s efficiency; no sentimental imitation was intended.

The factory representative had arrived.

It began without preamble. “This is a data-collecting machine capable of communicating on an oral basis. It contains both broadcasting and receiving apparatus and can integrate facts relevant to its line of inquiry.”

The voice was pleasant, confident. Obviously it was a tape, recorded by some Institute technician before the war. Coming from the quasi-human shape, it sounded grotesque; O’Neill could vividly imagine the dead young man whose cheerful voice now issued from the mechanical mouth of this upright construction of steel and wiring.

“One word of caution,” the pleasant voice continued. “It is fruitless to consider this receptor human and to engage it in discussions for which it is not equipped. Although purposeful, it is not capable of conceptual thought; it can only reassemble material already available to it.”

The optimistic voice clicked out and a second voice came on. It resembled the first, but now there were no intonations or personal mannerisms. The machine was utilizing the dead man’s phonetic speech-pattern for its own communication.

“Analysis of the rejected product,” it stated, “shows no foreign elements or noticeable deterioration. The product meets the continual testing-standards employed throughout the network. Rejection is therefore on a basis outside the test area; standards not available to the network are being employed.”

“That’s right,” O’Neill agreed. Weighing his words with care, he continued, “We found the milk substandard. We want nothing to do with it. We insist on more careful output.”

The machine responded presently. “The semantic content of the term ‘pizzled’ is unfamiliar to the network. It does not exist in the taped vocabulary. Can you present a factual analysis of the milk in terms of specific elements present or absent?”

“No,” O’Neill said warily; the game he was playing was intricate and dangerous. “ ‘Pizzled’ is an overall term. It can’t be reduced to chemical constituents.”

“What does ‘pizzled’ signify?” the machine asked. “Can you define it in terms of alternate semantic symbols?”

O’Neill hesitated. The representative had to be steered from its special inquiry to more general regions, to the ultimate problem of closing down the network. If he could pry it open at any point, get the theoretical discussion started…

“ ‘Pizzled,’ ” he stated, “means the condition of a product that is manufactured when no need exists. It indicates the rejection of objects on the grounds that they are no longer wanted.”

The representative said, “Network analysis shows a need of high-grade pasteurized milk-substitute in this area. There is no alternate source; the network controls all the synthetic mammary-type equipment in existence.” It added, “Original taped instructions describe milk as an essential to human diet.”

O’Neill was being outwitted; the machine was returning the discussion to the specific. “We’ve decided,” he said desperately, “that we don’t want any more milk. We’d prefer to go without it, at least until we can locate cows.”

“That is contrary to the network tapes,” the representative objected. “There are no cows. All milk is produced synthetically.”

“Then we’ll produce it synthetically ourselves,” Morrison broke in impatiently. “Why can’t we take over the machines? My God, we’re not children! We can run our own lives!”

The factory representative moved toward the door. “Until such time as your community finds other sources of milk supply, the network will continue to supply you. Analytical and evaluating apparatus will remain in this area, conducting the customary random sampling.”

Perine shouted futilely, “How can we find other sources? You have the whole setup! You’re running the whole show!” Following after it, he bellowed, “You say we’re not ready to run things—you claim we’re not capable. How do you know? You don’t give us a chance! We’ll never have a chance!”

O’Neill was petrified. The machine was leaving; its one-track mind had completely triumphed.

“Look,” he said hoarsely, blocking its way. “We want you to shut down, understand. We want to take over your equipment and run it ourselves. The war’s over with. Damn it, you’re not needed anymore!”

The factory representative paused briefly at the door. “The inoperative cycle,” it said, “is not geared to begin until network production merely duplicates outside production. There is at this time, according to our continual sampling, no outside production. Therefore network production continues.”

Without warning, Morrison swung the steel pipe in his hand. It slashed against the machine’s shoulder and burst through the elaborate network of sensory apparatus that made up its chest. The tank of receptors shattered; bits of glass, wiring and minute parts showered everywhere.

“It’s a paradox!” Morrison yelled. “A word game—a semantic game they’re pulling on us. The Cyberneticists have it rigged.” He raised the pipe and again brought it down savagely on the unprotesting machine. “They’ve got us hamstrung. We’re completely helpless.”

