
  "Any smells? Strange sensations?" As he asked, McPherson looked at the EEG scanner above the bed. It was still reading normal alpha patterns, without any suggestion of seizure activity.

  "No. Nothing like that. "

  "But you feel as if you might explode?" He thought: Ross should really be asking these questions.

  "Sort of," Benson said. "In the coming war, we may all explode. "

  "How do you mean?"

  "You look annoyed," Benson said.

  "Im not, just puzzled. How do you mean, in the coming war?"

  "In the coming war between men and machines. The human brain is obsolete, you see. "

  That was a new thought. McPherson hadn't heard it from Benson before. He stared at him, lying in the bed, his head and shoulders heavily bandaged. It made the upper part of his body and his head appear thick, gross, oversized.

  "Yes," Benson said. "The human brain has gone as far as it is going to go. Its exhausted, so it spawned the next generation of intelligent forms. They will- Why am I so tired?" He closed his eyes again.

  "Youre exhausted from the operation. "

  "A minor procedure," he said, and smiled with his eyes closed. A moment later he was snoring.

  McPherson remained by the bed for a moment, then turned to the window and watched the sun set over the Pacific. Benson had a nice room; you could see a bit of the ocean between the high-rise apartments in Santa Monica. He stood there for several minutes. Benson did not wake. Finally, McPherson went out to the nurses' station to write his note in the chart.

  "Patient alert, responsive, oriented times three. " He paused after writing that. He didnt really know if Benson was oriented to person, place, and time; he hadnt checked specifically. But he was clear and responsive, and McPherson let it go. "Flow of ideas orderly and clear, but patient retains machine imagery of pre-operative state. It is too early to be certain, but it appears that early predictions were correct that the operation would not alter his mentation between seizures. "

  Signed, "Roger A. McPherson, M. D. "

  He stared at it for a moment, then closed the chart and replaced it on the shelf. It was a good note, cool, direct, holding out no false anticipations. The chart was a legal document, after all, and it could be called into court. McPherson didn't expect to see Benson's chart in court, but you couldn't be too careful. He believed very strongly in appearances - and he felt it was his job to do so.

  The head of any large scientific laboratory performed a political function. You might deny it; you might dislike it. But it was nonetheless true, a necessary part of the job.

  You had to keep all the people in the lab happy as they worked together. The more prima donnas you had, the tougher the job was, as pure politics.

  You had to get your lab funded from outside sources, and that was also pure politics. Particularly if you were working in a delicate area, as the NPS was. McPherson had long since evolved the horseradish-peroxidase principle of grant applications. It was simple enough: when you applied for money, you announced that the money would be spent to find the enzyme horseradish peroxidase, which could lead to a cure for cancer. You would easily get sixty thousand dollars for that project - although you couldn't get sixty cents for mind control.

  He looked at the row of charts on the shelf, a row of unfamiliar names, into which BENSON, H. F. 710 merged indistinguishably. In one sense, he thought, Benson was correct - he was a walking time bomb. A man treated with mind-control technology was subject to all sorts of irrational public prejudice. "Heart control" in the form of cardiac pacemakers was considered a wonderful invention; "kidney control" through drugs was a blessing. But "mind control" was evil, a disaster - even though the NPS control work was directly analogous to control work with other organs. Even the technology was similar: the atomic pacemaker they were using had been developed first for heart work.

  But the prejudice remained. And Benson thought of himself as a ticking time bomb. McPherson sighed, took out the chart again, and flipped to the section containing doctors' orders. Both Ellis and Morris had written post-op care orders. McPherson added: "After interfacing tomorrow a.m., begin Thorazine."

  He looked at the note, then decided the nurses wouldn't understand "interfacing." He scratched it out and wrote: "After noon tomorrow, begin Thorazine."

  As he left the floor, he thought that he would rest more easily once Benson was on Thorazine. Perhaps they couldn't defuse the time bomb - but they could certainly drop it into a bucket of cold water.

  7

  Late at night, in Telecomp, Gerhard stared anxiously at the computer console. He typed in more instructions, then walked to a print-out typewriter and began reviewing the long sheaf of green-striped sheets. He scanned them quickly, looking for the error he knew was there in the programmed instructions.

  The computer itself never made a mistake. Gerhard had used computers for nearly ten years - different computers, different places - and he had never seen one make a mistake. Of course, mistakes occurred all the time, but they were always in the program, not in the machine. Sometimes that infallibility was hard to accept. For one thing, it didn't fit with one's view of the rest of the world, where machines were always making mistakes - fuses blowing, stereos breaking down, ovens overheating, cars refusing to start. Modern man expected machines to make their fair share of errors.

  But computers were different, and working with them could be a humiliating experience. They were never wrong. It was as simple as that. Even when it took weeks to find the source of some problem, even when the program was checked a dozen times by as many different people, even when the whole staff was slowly coming to the conclusion that for once, the computer circuitry had fouled up - it always turned out, in the end, to be a human error of some kind. Always.

  Richards came in, shrugging off a sport coat, and poured himself a cup of coffee. "How's it going?"

  Gerhard shook his head. "I'm having trouble with George."

  "Again? Shit. " Richards looked at the console. "Hows

  Martha?"

  "Marthas fine, I think. Its just George. "

  "Which George is it?"

  "Saint George," Gerhard said. "Really a bitch. "

  Richards sipped his coffee and sat down at the console. "Mind if I try it?"

  "Sure," Gerhard said.

  Richards began flicking buttons. He called up the program for Saint George. Then he called up the program for Martha. Then he pushed the interaction button.

  Richards and Gerhard hadn't devised these programs; they were modified from several existing computer programs developed at other universities. But the basic idea was the same - to create a computer program that would make the computer act emotionally, like people. It was logical to designate the programs with names like George and Martha. There was a precedent for that: Eliza in Boston, and Aldous in England.

  George and Martha were essentially the same program with slight differences. The original George was programmed to be neutral in his response to stimuli. Then Martha was created. Martha was a little bitchy; Martha disliked most things. Finally, another George was formulated, a very loving George, who was referred to as Saint George.

  Each program could respond with three emotional states - love, fear, and anger. Each could produce three actions - approach, withdrawal, and attack. All this was, of course, highly abstract. It was carried out in terms of numbers. For example, the original George was neutral to most numbers, but he disliked the number 751. He was programmed to dislike it. And by extension he disliked similar numbers - 743, 772, and so on. He much preferred numbers such as 404, 133, and 918. If you punched in one of these numbers, George responded with numbers signifying love and approach. If you punched in 707, George withdrew. If you punched in 750, George angrily attacked - according to the numbers he printed out.

  The NPS staff had played with the programs in this way for a long time. Then they had incorporated program modifications to allow for "talk" from the computer. The numbers were translated into sentences. This was amusing, and revealing. The actual interaction was referred to as "the Christmas game" because most of it was conducted as giving and receiving gifts - objects that had an assigned or learned emotional value, just as the numbers did.

  Normal George interacting with Martha would eventually win her over, and her bitchiness would recede into the background.

  But Saint George had a much worse effect on her. His loving acceptance drove her up the wall. That is, if things were working normally.

  Richards watched as the print-out flashed across the screen.

  HELLO I'M SAINT GEORGE

  HELLO

  WHAT IS YOUR NAME?

  MARTHA.

  YOU DON'T SAY MUCH, MARTHA.

  NO

  BUT I LIKE YOU.

  WHY I DON'T LIKE YOU.

  I UNDERSTAND THAT. HOW CAN I HELP?

  "It isnt printing periods consistently," Richards said.

  "When did that start?"

  "Thats nothing," Gerhard said. "Just keep watching. "

  HELP?

  YES, I WOULD LIKE TO HELP YOU.

  HAVE A CAT.

  Richards winced. Cats were programmed to be something that both George and Martha disliked. Giving George a cat was a very hostile move.

  THANK YOU

  HAVE ANOTHER CAT.

  THANK YOU YOU ARE VERY KIND.

  HAVE ANOTHER CAT.

  NO THANK YOU.

  DON'T YOU LIKE CATS?

  YES BUT I HAVE ENOUGH CATS.

  HAVE A GORILLA.

  "Really nasty," Richards said. "What are the other interactions like?"

  "Pretty much the same," Gerhard said.

  "As bad as this?"

  "Pretty much. "

  They were accustomed now to the idea that when the computers interacted, the results were unpredictable. The general form of interaction could be guessed, but the specific results were uncertain. It was like a computer playing checkers - it was probably going to win, but it would win a different way each time, depending on what the opponent did.

  I DON'T WANT A GORILLA.

  DON'T YOU LIKE GORILLAS?

  NO THANK YOU.

  YOU ARE TOO POLITE.

  I AM SORRY IF I OFFEND YOU IN ANY WAY.

  YOU TALK TOO MUCH.

  I AM SORRY.

  HAVE AN EGGPLANT.

  Richards watched with interest. Saint George had no background for eggplants. But he disliked cucumbers and bananas.

  NO THANK YOU.

  DON'T YOU LIKE EGGPLANT?

  NOT VERY MUCH.

  HERE HAVE ANOTHER ONE.

  NO THANK YOU.

  I WANT YOU TO HAVE IT.

  NO THANK YOU.

  GO ON AND TAKE IT.

  NO THANK YOU.

  I INSIST.

  NO THANK YOU.

  "Whats happened to Saint George?" Richards asked. "His responses are too much the same. "

  "Thats whats bothering me. "

  "Whats he cycling through on the program?"

  "I was looking for it when you came in. "

  I INSIST THAT YOU HAVE A CUCUMBER.

  I REFUSE.

  "George!" Richards said, almost without thinking.

  THEN HAVE A BANANA.

  NO.

  "George is breaking down," Richards said. "Hes not a saint any more. "

  THEN HAVE BOTH A BANANA AND A CUCUMBER.

  NO THANK YOU.

  I INSIST.

  GO TO HELL I WILL KILL YOU:::::::::

  :::::::::::::::::::::::

  :::::::::::::::::::::::

  The screen was filled with white dots. "What does that mean - unprintable response?" Richards said.

  "I dont know. Ive never seen it before tonight. "

  "How many times has this program been run?" Richards asked.

  "One hundred and ten, against Martha. "

  "Any learning erasures?"

  "No. "

  "Ill be goddamned," Richards said. "Hes getting to be a short-tempered saint. " He grinned. "We can write this one up. "

  Gerhard nodded and went back to the print-out. In theory, what was happening was not puzzling. Both George and Martha were programmed to learn from experience. Like the checkers-playing programs - where the machine got better each time it played a game - this program was established so that the machine would "learn" new responses to things. After one hundred and ten sets of experience, Saint George had abruptly stopped being a saint. He was learning not to be a saint around Martha - even though he had been programmed for saintliness.

  "I know just how he feels," Richards said, and switched the machine off. Then he joined Gerhard, looking for the programming error that had made it all possible.

  Thursday, March 11, 1971: Interfacing

  1

  Janet Ross sat in the empty room and glanced at the wall clock. It was 9 a.m. She looked down at the desk in front of her, which was bare except for a vase of flowers and a notepad. She looked at the chair opposite her. Then, aloud, she said, "How're we doing?"

  There was a mechanical click and Gerhard's voice came through the speaker mounted in the ceiling. "We need a few minutes for the sound levels. The light is okay. You want to talk a minute?"

  She nodded, and glanced over her shoulder at the one-way mirror behind her. She saw only her reflection, but she knew Gerhard, with his equipment, was behind it, watching her. "You sound tired," she said.

  "Trouble with Saint George last night," Gerhard said.

  "Im tired, too," she said. "I was having trouble with somebody who isnt a saint. " She laughed. She was just talking so they could get a sound level for the room; she hadnt really paid attention to what she was saying. But it was true: Arthur was no saint. He was also no great discovery, though shed thought he might be a few weeks ago when she first met him. She had been, in fact, a little infatuated with him. ("Infatuated? Hmm? Is that what youd call it?" She could hear Dr. Ramos now. ) Arthur had been born handsome and wealthy. He had a yellow Ferrari, a lot of dash, and a lot of charm. She was able to feel feminine and frivolous around him. He did madcap, dashing things like flying her to Mexico City for dinner because he knew a little restaurant where they made the best tacos in the world. She knew it was all silly, but she enjoyed it. And in a way she was relieved - she never had to talk about medicine, or the hospital, or psychiatry. Arthur wasnt interested in any of those things; he was interested in her as a woman. ("Not as a sex object?" Damn Dr. Ramos. )