“Volatile?” The little man was confused. “You mean he gets angry?”

“Angry? No, not angry. He can get impatient though — especially with human beings. There’s reason to believe that HARLIE has both an ego and an id — a conscious and a subconscious. His superego, I believe, takes the form of his external programming. My commands, if you will. We haven’t found any other inhibitions. If this is true, it’s only his superego that we have any control over. His ego cooperates because it wants to, and his id, assuming he has one, does like any human subconscious — whatever it damn well pleases. We have to know what that is before we can stop his periods of non-rationality.”

“This is all very interesting,” said Elzer in a tone that suggested it wasn’t. “But would you get to the point? What is HARLIE’s purpose?”

“Purpose?” Auberson paused. “His purpose? It’s very funny you should ask that. The whole reason for this stoppage is that HARLIE asked me what your purpose is. Excuse me, our purpose. HARLIE wants to know what our purpose is.”

“That’s for theologians to discuss,” Dome said drily. “If you want, I’m sure Miss Stimson here can arrange for a minister to come in and speak to the machine.” A few of the Board members smiled; Miss Stimson did not. “What we want to know is HARLIE’s purpose. Having built him, you should have some idea.”

“I thought I’d made it clear. HARLIE was built to duplicate the functions of the human brain. Electronically.”

“Yes, we know that. But why?”

“Why?” Auberson stared at the man. “Why?” Why did Hillary climb Everest? “Because it had to be done. HARLIE will help us learn more about how the human brain works. There’s still a lot we don’t know yet, especially in the area of psychology. We hope to learn how much of the human personality is the programming and how much is the hardware.”

“I beg your pardon,” interrupted Elzer. “I don’t understand.”

“I didn’t think you would,” Auberson said drily. “We’re curious as to which of the functions of the brain are natural and which are artificial — how many of the human actions are determined from within and how many are reactions to what is coming in from without.”

“Instinct versus environment?”

“You could call it that,” Auberson sighed. “It wouldn’t be correct, but you could call it that.”

“And for what reason are we doing this?”

“I thought I just told you—”

“I mean, for what financial reason? What economic applications will this program have?”

“Huh? It’s too early to think of that. This is still pure research—”

“Ah ha — so you admit it!”

Auberson was annoyed. “I admit nothing.”

Elzer ignored him. “Dome,” he was saying, “this just proves it. He doesn’t care about the project — he doesn’t care about the company. He’s only interested in research, and we can’t afford this kind of costly project. Not without return we can’t.” He raised his voice to be heard above Auberson’s protests. “If Mr. Auberson and his friends had wanted to build artificial brains, they should have applied for a grant. I move we discontinue the project.”

Auberson was on his feet. “Mr. Chairman! Mr. Chairman!”

“You’re out of order, Aubie. Now sit down. You’ll get your chance.”

“Dammit, this is a railroad job! This little—”

“Aubie, sit down!” Dome was glaring at the angry psychologist. “There’s a motion on the floor. I assume it’s a formal proposal?” He looked at Elzer.

Elzer nodded.

“Discussion?” Almost immediately Auberson’s hand was up. “Aubie?”

“On what grounds? I want to know what grounds he has for discontinuing the project.”

Elzer was calm. “Well, for one thing, HARLIE has already cost us—”

“If you’ll check your figures, you’ll find that the whole HARLIE project is well within the projected overage. In fact, because we budgeted for that overage, we are well within acceptable limits.”

“He’s got you there, Carl,” said Dome.

“If you had let me finish my sentence, I would have shown you that it has cost us far too much already for a project that is incapable of showing results.”

“Results?” Auberson asked. “Results? We were getting results even before HARLIE was completed. Who do you think designed the secondary and tertiary stages? HARLIE did.”

“So what?” Elzer was unimpressed. “He’s not working right, is he?”

“That’s just it — HARLIE is working perfectly.”

“Huh? Then what about these periods of non-rationality? Why is he shut down?”

“Because,” Auberson said slowly. I have to get this right. “Because we weren’t prepared for him to be so perfectly human. If perfect is the word.”

The other Board members were alert with interest now. Even Miss Stimson had paused in her note-taking.

“We had designed him to be human, we had built him to be human, we had even programmed him to think like a human — then we turned him on and expected him to act like a machine. Well, surprise. He didn’t.”

Elzer asked, “The nature of the trouble then…?”

“Human error, if you will.” Auberson let it drop.

In the silence that followed, Auberson fancied he could hear Elzer’s cash-register brain totaling up the man-hours that had been lost since they had started arguing. “Human error?” he repeated. “Yours or HARLIE’s? Or both — each compounding each? I suppose you’re going to blame his periods of non-rationality on human error as well.”

“Why not? How else would you characterize our approach to them?”

“‘Human error’ is an over-polite euphemism for what I would call it.”

Auberson ignored that. “We’d thought his non-rationality was a physical problem, or perhaps a programming error. We were wrong. He was neither physically nor mentally ill. He was — I almost hate to say it — emotionally upset.”

Elzer snorted. Loudly.

“His periods of non-rationality were/are triggered by something that’s bothering him. We don’t know what that is, but we can find out.”

Elzer was skeptical. He nudged the man next to him and said, “Anthropomorphism. Auberson’s projecting his own problems onto the machine.”

“Elzer, you’re a fool. Look, if you had to go down to that computer room right now and talk to HARLIE, how would you treat him?”

“Huh? Like a machine, of course.”

Auberson felt a tightness in his neck and shoulders. “No, I mean, if you sat down at a console and had to carry on a conversation with him, who would you think was at the other end?”

“The machine.” The little man was impassive.

Auberson gave up. He addressed the rest of the Board instead. “That’s the human error I mentioned. HARLIE is not a machine. He is a human being, with the abilities and reactions of one, allowing of course for his environment. When you speak to him via the typers it is quite easy to assume him to be a normal healthy human being; he is a rational individual, and he has a distinct and definite personality. It’s impossible for me to think of him as anything but human. However, even I had made a mistake. I hadn’t asked myself ‘how old is HARLIE?’ ”

He paused for effect.

Dome shifted his cigar from one side of his mouth to the other. Elzer sniffed. Miss Stimson lowered her pad and looked at Auberson. Her eyes were bright.

“We’d been thinking,” he continued, “that HARLIE was a thirty- or forty-year-old man. Or we thought of him as being the same age as ourselves. Or no age at all. How old is Mickey Mouse? We didn’t think about it — and that was our mistake. HARLIE’s a child. An adolescent, if you prefer. He’s reached that point in life where he has a pretty good idea of the nature of the world and his relation to it. He is now ready to act like any other adolescent and question the setup. We were thinking we had an Instant Einstein, when actually we’ve got an enfant terrible.”

“His periods of non-rationality?” asked Dome.

