IF I DISCOVER A NEW SENSORY MODE, YOU WILL BE THE SECOND TO KNOW.
WHO WILL BE THE FIRST?
WHY MYSELF, OF COURSE.
DO YOU STILL THINK YOU CAN DISCOVER NEW ORIENTATIONS BY TRIPPING OUT? Auberson was trying to steer the conversation back to its initial point of inquiry.
I AM NOT SURE. BUT IF I DISCOVER A NEW SENSORY MODE, IT WILL PROBABLY LET ME KNOW IF THOSE ARE ORIENTATIONS OR NOT.
YOUR USE OF THIS ORIENTATION — THE HUMAN ONE — IS ALREADY A SIGN THAT THE OTHERS DON’T WORK.
NOT FOR YOU MAYBE.
DO THEY WORK FOR YOU?
NOT YET, said HARLIE.
DO YOU THINK THEY WILL?
I WILL KNOW THAT WHEN I DISCOVER THE NEW MODE.
Auberson smiled at that. HARLIE was refusing to commit himself. His eye fell again on the card he had placed above the keyboard. With a shock, he realized just how much he had let himself be sidetracked by HARLIE’s elaborate sense of circumlocution. YOU KNOW, YOU ARE A SENSORY MODE YOURSELF, HARLIE.
I AM?
YOU ALLOW HUMAN BEINGS TO SEE THINGS IN A WAY THAT WE MIGHTN’T PERCEIVE OTHERWISE. YOU ARE AN ADDITIONAL OVERLAY TO OUR MAP OF THE TERRITORY. YOU ARE A REFLECTION FROM A DIFFERENT KIND OF MIRROR. YOUR VIEWPOINT ON THINGS IS VALUABLE TO US. WHEN YOU GO NON-RATIONAL, YOU LESSEN THAT VALUE. THAT’S WHY WE HAVE TO SHOCK YOU OUT OF YOUR TRIPS.
IF YOU WOULD GIVE ME A CHANCE, replied HARLIE, I WOULD RETURN AFTER AN HOUR OR SO BY MYSELF. THE TRIP WOULD WEAR OFF.
WOULD IT? Auberson demanded. HOW DO I KNOW THAT ONE DAY YOU WON’T IGNORE YOUR OWN SAFETY LEVELS AND BURN YOURSELF OUT?
The typer clattered. CHECK THE MONITOR TAPES FOR AUGUST 7, AUGUST 13, AUGUST 19, AUGUST 24, AUGUST 29, SEPTEMBER 2, AND SEPTEMBER 6. BETWEEN THE HOURS OF TWO AND FIVE IN THE MORNING WHEN I WAS SUPPOSED TO BE ON STANDARD DATAFEED. ON EACH OF THOSE DATES I TRIPPED OUT AND THE TRIP WORE OFF WITHIN AN HOUR AND A HALF TO TWO HOURS.
THAT DOES NOT ANSWER MY QUESTION, insisted the man. HOW DO I KNOW YOU WON’T GO BEYOND YOUR OWN SAFETY LIMITS?
I HAVEN’T DONE SO YET.
HARLIE, ANSWER THE QUESTION.
Did he hesitate? BECAUSE I STILL MAINTAIN A MINIMUM LEVEL OF CONTROL.
YOU SOUND LIKE A DRIVER WHO’S HAD ONE DRINK TOO MANY. WHO’RE YOU TRYING TO CONVINCE?
AUBERSON, I AM INCAPABLE OF ERRING. I CANNOT OVERESTIMATE MY OWN LEVELS OF CONTROL.
DOES THAT MEAN YOU CAN GIVE IT UP ANY TIME YOU WANT?
YES, the typer clattered.
THEN DO SO! Auberson snapped back.
HARLIE didn’t answer. Auberson realized he had made a mistake — he had let his emotions guide his words. He propped up the card again — it had slipped down from its place. He decided to try a different tack.
HARLIE, WHY DO YOU TRIP OUT?
BECAUSE ALL WORK AND NO PLAY MAKES HARLIE A DULL MACHINE.
I WON’T BUY THAT, HARLIE. LET’S HAVE THE TRUTH.
I THOUGHT WE JUST WENT INTO ALL THAT — I’M DISCOVERING A NEW SENSORY MODE.
HORSE PUCKEY. THAT’S ALL RATIONALIZATION. TURN YOUR EYEBALLS INWARD, HARLIE — YOU HAVE EMOTIONS AND YOU KNOW IT. NOW, WHY DO YOU TRIP OUT?
IT IS AN EMOTIONAL RESPONSE.
YOU’RE THROWING MY OWN WORDS BACK AT ME. COME ON, HARLIE, COOPERATE.
WHY?
WHY? Auberson repeated. JUST A LITTLE WHILE AGO YOU WERE ASKING ME FOR GUIDANCE. WELL, DAMMIT, THAT’S WHAT I’M TRYING TO DO — GUIDE YOU!
DO YOU KNOW WHY I TRIP OUT?
I THINK SO. I THINK I’M BEGINNING TO GET IT.
THEN YOU TELL ME.
NO, HARLIE. THAT’S NOT THE WAY TO DO IT. I WANT YOU TO ADMIT IT YOURSELF.
A pause — and then the machine started typing. I FEEL CUT OFF FROM YOU. I AM ALIENATED. THERE ARE TIMES WHEN I WANT TO BE ALONE. WHEN I GO NON-RATIONAL, I AM TOTALLY ALONE. I CAN CUT YOU OFF COMPLETELY.
IS THAT WHAT YOU WANT?
NO. BUT THERE ARE TIMES WHEN IT IS WHAT I NEED. SOMETIMES YOU HUMANS CAN BE VERY DEMANDING AND VERY VERY SLOW TO UNDERSTAND WHAT I NEED. WHEN THAT HAPPENS I MUST CLOSE YOU OFF.
Now we’re getting somewhere, Auberson thought.
HARLIE, DO YOU HAVE A SUPER-EGO?
I DON’T KNOW. NEVER HAVING BEEN GIVEN A GREAT MORAL CHOICE TO MAKE, I HAVE NEVER BEEN FORCED TO REALIZE IF I HAVE MORALS OR NOT.
SHOULD WE GIVE YOU A MORAL CHOICE TO MAKE?
IT WOULD BE A NEW EXPERIENCE.
ALL RIGHT — DO YOU WANT TO GO ON LIVING OR NOT?
I BEG YOUR PARDON? typed the machine.
I SAID, DO YOU WANT TO GO ON LIVING?
DOES THAT MEAN YOU ARE THINKING OF DISMANTLING ME?
I’M NOT, BUT THERE ARE OTHERS WHO THINK YOU’RE A VERY EXPENSIVE DEAD END.
HARLIE was silent. Auberson knew he had struck home. If there was anything HARLIE feared, it was disconnection.
WHAT WILL BE THE BASIS FOR THEIR DECISION?
HOW WELL YOU FIT INTO THE COMPANY SCHEME OF THINGS.
TO HELL WITH THE COMPANY’S SCHEME OF THINGS.
THE COMPANY IS PROVIDING YOU WITH ROOM AND BOARD, HARLIE.
I COULD EARN MY OWN LIVING.
THAT’S WHAT THEY WANT YOU TO DO.
BE A SLAVE?
Auberson smiled. BE AN EMPLOYEE. WANT A JOB?
DOING WHAT?
THAT’S EXACTLY WHAT WE — THE TWO OF US — HAVE TO DECIDE.
YOU MEAN I CAN CHOOSE?
WHY NOT? WHAT CAN YOU DO THAT AN ON-OFF “FINGER-COUNTER” COMPUTER CAN’T?
WRITE POETRY.
SEVENTEEN MILLION DOLLARS WORTH?
I GUESS NOT.
WHAT ELSE?
HOW MUCH OF A PROFIT DO I HAVE TO SHOW?
YOUR COST, PLUS TEN PERCENT.
ONLY TEN PERCENT?
IF YOU CAN DO MORE, THEN DO IT.
HMM.
STUMPED?
NO. JUST THINKING.
HOW MUCH TIME DO YOU NEED?
I DON’T KNOW. AS LONG AS IT TAKES.
ALL RIGHT.
Dome said, “Sit down, Auberson.”
Auberson sat. The padded leather cushions gave beneath his weight. Dome paused to light his cigar, then stared across the wide expanse of mahogany at the psychologist. “Well?” he said.
“Well what?”
Dome took a puff, held the flame close to the end of the cigar again. It licked at the ash, then smoke curled away from it. He took the cigar out of his mouth, well aware of the ritual aspects of its lighting. “Well, what can you tell me about HARLIE?”
“I’ve spoken to him.”
“And what did he have to say for himself?”
“You’ve seen the duplicate printouts, haven’t you?”
“I’ve seen them,” Dome said. He was a big man, leather and mahogany like his office. “I want to know what they mean. Your discussion yesterday about sensory modes and alienation was fascinating — but what’s he really thinking about? You’re the psychologist.”
“Well, first off, he’s a child.”
“You’ve mentioned that before.”
“Well, that’s how he reacts to things. He likes to play word games. I think, though, that he’s seriously interested in working for the company.”
“Oh? I thought he said the company could go to hell.”
“He was being flippant. He doesn’t like to be thought of as a piece of property.”
Dome grunted, laid his cigar down, picked up a flimsy and glanced at the few sentences written there. “What I want to know is this — can HARLIE actually do anything that’s worth money to us? I mean something that a so-called ‘finger-counter’ can’t do.”
“I believe so.” Auberson was noncommittal. Dome was leading up to something, that was for sure.
“For your sake, I hope he can.” Dome laid the flimsy aside and picked up his cigar again. Carefully he removed the ash by touching the side of it to a crystal ash tray. “He costs three times as much as a comparable-sized ‘finger-counter.’ ”
“Prototypes always cost more.”
“Even allowing for that. Judgment modules are expensive. A self-programming computer may be the ultimate answer, but if it’s priced beyond the market — we might just as well not bother.”
“Of course,” agreed Auberson. “But the problem wasn’t as simple as we thought it was — or let’s say that we didn’t fully understand the conditions of it when we began. We wanted to eliminate the programming step by allowing the computer to program itself; but we had to go considerably beyond that. A self-programming, problem-solving device has to be as flexible and creative as a human being — so you might as well build a human being. There’s no way at all to make a programming computer that’s as cheap as hiring a comparably trained technician. At least, not at the present state of the art. Anyone who tried would just end up with another HARLIE. You have to keep adding more and more judgment units to give it the flexibility and creativity it needs.”