“Uh, what stage of compaction are HARLIE’s judgment boxes?” Clintwood again.
“It’s adjustable, depending on the precision HARLIE wants to bring to any one problem. Or needs to. It’s a matter of how many times a decision can be subdivided before such precision becomes redundant. He has a judgment unit to control it.”
Clintwood nodded and scratched something on his notepad.
Elzer remained unimpressed. “It’s still a computer, isn’t it?”
Auberson looked at him, frustrated by the man’s inability to understand. “Yes, in the same sense that your brain is equivalent to a toad’s.”
The reaction was immediate, a chorus of disapproving remarks. One voice, Dome’s, louder than the rest, kept insisting, “Here now! Here now! We’ll have quiet.” As the noise subsided, he continued. “Auberson, if you can’t keep your personal opinions out of this—”
“Mr. Dome — Chairman Dome — I did not mean the comment as an insult to Mr. Elzer. I was assuming that Mr. Elzer’s brain was better, more complex than a toad’s. Assuming that he has an average human brain, he is as far above a toad as HARLIE is above a simplified autopilot judgment circuit.”
The room quieted somewhat. “However,” Auberson went on, “if Mr. Elzer feels that there is not enough difference between his brain and that of a toad, I’ll have to use some other comparison — hopefully one not so open to misinterpretation. Did you get all that, Miss Stimson?”
Miss Stimson, the Executive Secretary, looked up at him, eyes twinkling. She had gotten it.
“There is a significant difference that I might note,” he added, spacing out his words carefully. “HARLIE uses all of his brain…” Auberson waited to see if Elzer would rise to this; he didn’t. “Estimates vary, but we figure that the average human being uses only ten to fifteen percent of his available brain cells. We couldn’t afford that kind of luxury with HARLIE, so we built him to use his total brain capacity. He’s not as complex as a human brain — he has nowhere near the same number of ‘cells’ — but he can still function quite well at human levels. Building HARLIE taught us quite a bit about the workings of the human brain. In fact, we were surprised to find out that in many ways it’s simpler than we thought it was.
“HARLIE’s the result of a very foresighted decision made several years ago to explore the possibilities of judgment circuitry as thoroughly as possible. I’m sure I don’t have to comment on the wisdom of that decision. An on-off circuit can’t do the things a variable pattern can. It’s only the Mark IV unit that’s given us a serious piece of the computer market. That’s why we have to keep pushing. If we ever want to catch up with IBM — and such a thing is not impossible — if we ever want to catch up, we need to be the front-runner in judgment circuits. We have to continue with the HARLIE project.”
“Why?” asked Elzer. “Certainly we can continue producing judgment circuits without HARLIE.”
“We can — but that’s the sure and certain road to corporate oblivion. Look, the Thorsen Auto-Pilot is a fine little unit; it can’t be disparaged. But it’s only the equivalent of an IBM Pixie Desktop Calculator. It isn’t any more complex than that. If we want to catch up, we have to go after their JuggerNaut Series. That’s what HARLIE was originally supposed to be — the ultimate in self-programming computers.
“When Handley came on the project, though, its direction changed; the goal became even more lofty. Or maybe I should say, the way to achieve the goal involved an even greater challenge than we had originally thought. Look, you have to understand what Don was up to before he came here. He’d been doing research with a neuro-psychology team down in Houston; they’d been diagramming the basic pattern structures of the human brain. Have you ever seen the schematic of a thought? Don has. Do you know how to program a human brain? Don does. That’s what he was working on before he came here. Anyway, when they started to design HARLIE — he was called JudgNaut One then — Handley was struck by the similarity of the schematics to those of the human brain. The basic judgment paths were too much alike for the thought patterns not to be similar.
“Because the basic structures were so similar in function, Handley felt — and Digby concurred with him — that what they were building was indeed a human brain. Electronic parts, if you will, but undeniably human. Once that was realized, they worked specifically toward that end. Don sent to Houston for his notes, and soon they had a basic schematic of the total machine they wanted. They called it HARLIE and it was to be a self-programming, problem-solving device.”
“You say, ‘it was to be,’ ” said Elzer. “Isn’t it?”
“It is and it isn’t. It isn’t what the JudgNaut was supposed to be, no. But a human brain is a self-programming, problem-solving device — so they did meet the specifications of the original problem.”
“And what were you hired for? To be its baby-sitter?”
“To be its mentor. His mentor,” he corrected.
“Same thing,” snorted Elzer.
“I was brought onto the project as soon as it was realized that HARLIE would be human. Don and I worked together to plan his programming. Don was concerned with how he would be programmed — I was concerned with what.”
“Sort of a mechanical godfather,” said Elzer.
“If you will. Somebody had to guide HARLIE and plan for his education. At the same time, we’re learning quite a bit about human and mechanical psychologies. By the time HARLIE went operational, I thought I had a year’s worth of lesson plans to work with. He went through them in three months, and ever since we’ve been trying to catch up. HARLIE has no trouble at all with rote work; it’s when we get to the human stuff that we start bogging down. I don’t know whether we’re losing him or he’s losing us.”
“If you don’t know what you’re doing,” interrupted Elzer, “then how did you ever get to be in charge of the project?”
Auberson decided to ignore that. “When Digby died it was a choice between myself and Handley. We flipped a coin because it didn’t make much difference to either of us. I lost.”
His flippancy was wasted on Elzer. “You mean you don’t want the job?”
Auberson could see what was coming. But he said, “Not exactly. It’s just that there’s so damn much busy work that it keeps me away from my real job — HARLIE.”
Elzer pounced on it anyway. “You see,” he said to the rest of the Board. “This proves my point. We have a man in charge of this project who doesn’t even care about it.”
Auberson was on his feet at that. Dome was saying, “Oh, now wait a minute—”
“When we lost Digby we should have closed it down,” Elzer insisted. “All we have left are Indians and no Chief.”
“Hold on there—” Auberson protested. “You’re misquoting me — I do care about this project. It’s all I care about—”
“You don’t seem to be able to handle it though—”
“You don’t even understand what we’re trying to do! How can you—”
“Auberson! Elzer!” Dome’s voice cut through their words. “Cut it out — both of you! This is a business meeting.”
Slightly chastened, but in no way cooled, Auberson continued. “Psychology, Mr. Elzer, is not as cut-and-dried a subject as bookkeeping.” He glanced at Dome. The big man made no sign. Interpreting that as permission to continue, Auberson reseated himself and said, “Robot psychology is still an infant science. We don’t know what we’re doing—” He stopped himself. That was definitely not the way to phrase it. “Let me put it another way. We don’t know if what we’re doing is the right thing to do. HARLIE’s psychology is not the same as human psychology.”
“I thought you said HARLIE was human — and that he duplicates every function of the human brain.”
“He is and he does — but how many human beings do you know who are immobile, who never sleep, who have twenty-five sensory inputs, who have eidetic memories, who have no concept of taste or smell or any other organic chemical reactions? How many human beings do you know who have no sense of touch? And no sex life? In other words, Mr. Elzer, HARLIE may originally have had a human psychology, but his environment has forced certain modifications upon it. And on top of that, HARLIE has a most volatile personality.”