Unwise Child by Randall Garrett
Mike grinned to himself without letting it show on his face. The skipper was letting the boot ensign redeem himself after the faux pas he'd made.
Vaneski started to stand up, but Quill made a slight motion with his hand and the boy relaxed.
"It's only a guess, sir," he said, "but I think it's because the robot knows too much."
Quill and the others looked blank, but Mike narrowed his eyes imperceptibly. Vaneski was practically echoing Mike's own deductions.
"I mean—well, look, sir," Vaneski went on, a little flustered, "they started to build that thing ten years ago. Eight years ago they started teaching it. Evidently they didn't see any reason for building it off Earth then. What I mean is, something must've happened since then to make them decide to take it off Earth. If they've spent all this much money to get it away, that must mean that it's dangerous somehow."
"If that's the case," said Captain Quill, "why don't they just shut the thing off?"
"Well—" Vaneski spread his hands. "I think it's for the same reason. It knows too much, and they don't want to destroy that knowledge."
"Do you have any idea what that knowledge might be?" Mike the Angel asked.
"No, sir, I don't. But whatever it is, it's dangerous as hell."
The briefing for the officers and men of the William Branchell—the Brainchild—was held in a lecture room at the laboratories of the Computer Corporation of Earth's big Antarctic base.
Captain Quill spoke first, warning everyone that the project was secret and asking them to pay the strictest attention to what Dr. Morris Fitzhugh had to say.
Then Fitzhugh got up, his face ridged with nervousness. He assumed the air of a university professor, launching himself into his speech as though he were anxious to get through it in a given time without finishing too early.
"I'm sure you're all familiar with the situation," he said, as though apologizing to everyone for telling them something they already knew—the apology of the learned man who doesn't want anyone to think he's being overly proud of his learning.
"I think, however, we can all get a better picture if we begin at the beginning and work our way up to the present time.
"The original problem was to build a computer that could learn by itself. An ordinary computer can be forcibly taught—that is, a technician can make changes in the circuits which will make the robot do something differently from the way it was done before, or even make it do something new.
"But what we wanted was a computer that could learn by itself, a computer that could make the appropriate changes in its own circuits without outside physical manipulation.
"It's really not as difficult as it sounds. You've all seen autoscribers, which can translate spoken words into printed symbols. An autoscriber is simply a machine which does what you tell it to—literally. Now, suppose a second computer is connected intimately with the first in such a manner that the second can, on order, change the circuits of the first. Then, all that is needed is...."
Mike looked around him while the roboticist went on. The men were looking pretty bored. They'd come to get a briefing on the reason for the trip, and all they were getting was a lecture on robotics.
Mike himself wasn't so much interested in the whys and wherefores of the trip; he was wondering why it was necessary to tell anyone—even the crew. Why not just pack Snookums up, take him to wherever he was going, and say nothing about it?
Why explain it to the crew?
"Thus," continued Fitzhugh, "it became necessary to incorporate into the brain a physical analogue of Lagerglocke's Principle: 'Learning is a result of an inelastic collision.'
"I won't give it to you symbolically, but the idea is simply that an organism learns only if it does not completely recover from the effects of an outside force imposed upon it. If it recovers completely, it's just as it was before. Consequently, it hasn't learned anything. The organism must change."
He rubbed the bridge of his nose and looked out over the faces of the men before him. A faint smile came over his wrinkled features.
"Some of you, I know, are wondering why I am boring you with this long recital. Believe me, it's necessary. I want all of you to understand that the machine you will have to take care of is not just an ordinary computer. Every man here has had experience with machinery, from the very simplest to the relatively complex. You know that you have to be careful of the kind of information—the kind of external force—you give a machine.
"If you aim a spaceship at Mars, for instance, and tell it to go through the planet, it might try to obey, but you'd lose the machine in the process."
A ripple of laughter went through the men. They were a little more relaxed now, and Fitzhugh had regained their attention.
"And you must admit," Fitzhugh added, "a spaceship which was given that sort of information might be dangerous."
This time the laughter was even louder.
"Well, then," the roboticist continued, "if a mechanism is capable of learning, how do you keep it from becoming dangerous or destroying itself?
"That was the problem that faced us when we built Snookums.
"So we decided to apply the famous Three Laws of Robotics propounded over a century ago by a brilliant American biochemist and philosopher.
"Here they are:
"'One: A robot may not injure a human being, nor, through inaction, allow a human being to come to harm.'
"'Two: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.'
"'Three: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.'"
Fitzhugh paused to let his words sink in, then: "Those are the ideal laws, of course. Even their propounder pointed out that they would be extremely difficult to put into practice. A robot is a logical machine, but it becomes somewhat of a problem even to define a human being. Is a five-year-old competent to give orders to a robot?
"If you define him as a human being, then he can give orders that might wreck an expensive machine. On the other hand, if you don't define the five-year-old as human, then the robot is under no compulsion to refrain from harming the child."
He began delving into his pockets for smoking materials as he went on.
"We took the easy way out. We solved that problem by keeping Snookums isolated. He has never met any animal except adult human beings. It would take an awful lot of explaining to make him understand the difference between, say, a chimpanzee and a man. Why should a hairy pelt and a relatively low intelligence make a chimp non-human? After all, some men are pretty hairy, and some are moronic.
"Present company excepted."
More laughter. Mike's opinion of Fitzhugh was beginning to go up. The man knew when to break pedantry with humor.
"Finally," Fitzhugh said, when the laughter had subsided, "we must ask what is meant by 'protecting his own existence.' Frankly, we've been driven frantic by that one. The little humanoid, caterpillar-track mechanism that we all tend to think of as Snookums isn't really Snookums, any more than a human being is a hand or an eye. Snookums wouldn't actually be threatening his own existence unless his brain—now in the hold of the William Branchell—were destroyed."
As Dr. Fitzhugh continued, Mike the Angel listened with about half an ear. His attention—and the attention of every man in the place—had been distracted by the entrance of Leda Crannon. She stepped in through a side door, walked over to Dr. Fitzhugh, and whispered something in his ear. He nodded, and she left again.
Fitzhugh, when he resumed his speech, was rather more hurried in his delivery.
"The whole thing can be summed up rather quickly.
"Point One: Snookums' brain contains the information that eight years of hard work have laboriously put into it. That information is more valuable than the whole cost of the William Branchell; it's worth billions. So the robot can't be disassembled, or the information would be lost.
"Point Two: Snookums' mind is a strictly logical one, but it is operating in a more than logical universe. Consequently, it is unstable.
"Point Three: Snookums was built to conduct his own experiments. To forbid him to do that would be similar to beating a child for acting like a child; it would do serious harm to the mind. In Snookums' case, the randomity of the brain would exceed optimum, and the robot would become insane.
"Point Four: Emotion is not logical. Snookums can't handle it, except in a very limited way."
Fitzhugh had been making his points by tapping them off on his fingers with the stem of his unlighted pipe. Now he shoved the pipe back in his pocket and clasped his hands behind his back.
"It all adds up to this: Snookums must be allowed the freedom of the ship. At the same time, every one of us must be careful not to ... to push the wrong buttons, as it were.
"So here are a few don'ts. Don't get angry with Snookums. That would be as silly as getting sore at a phonograph because it was playing music you didn't happen to like.
"Don't lie to Snookums. If your lies don't fit in with what he knows to be true—and they won't, believe me—he will reject the data. But it would confuse him, because he knows that humans don't lie.
"If Snookums asks you for data, qualify it—even if you know it to be true. Say: 'There may be an error in my knowledge of this data, but to the best of my knowledge....'
"Then go ahead and tell him.
"But if you absolutely don't know the answer, tell him so. Say: 'I don't have that data, Snookums.'
"Don't, unless you are...."
He went on, but it was obvious that the officers and crew of the William Branchell weren't paying the attention they should. Every one of them was thinking dark gray thoughts. It was bad enough that they had to take out a ship like the Brainchild, untested and jerry-built as she was. Was it necessary to have an eight-hundred-pound, moron-genius child-machine running loose, too?
Evidently, it was.
"To wind it up," Fitzhugh said, "I imagine you are wondering why it's necessary to take Snookums off Earth. I can only tell you this: Snookums knows too much about nuclear energy."
Mike the Angel smiled grimly to himself. Ensign Vaneski had been right; Snookums was dangerous—not only to individuals, but to the whole planet.
Snookums, too, was a juvenile delinquent.
10

The Brainchild lifted from Antarctica at exactly 2100 hours, Greenwich time. For three days the officers and men of the ship had worked as though they were the robots instead of their passenger—or cargo, depending on your point of view.
Supplies were loaded, and the great engine-generators checked and rechecked. The ship was ready to go less than two hours before take-off time.
The last passenger aboard was Snookums, although, in a more proper sense, he had always been aboard. The little robot rolled up to the elevator on his treads and was lifted into the body of the ship. Miss Crannon was waiting for him at the air lock, and Mike the Angel was standing by. Not that he had any particular interest in watching Snookums come aboard, but he did have a definite interest in Leda Crannon.
"Hello, honey," said Miss Crannon as Snookums rolled into the air lock. "Ready for your ride?"
"Yes, Leda," said Snookums in his contralto voice. He rolled up to her and took her hand. "Where is my room?"
"Come along; I'll show you in a minute. Do you remember Commander Gabriel?"
Snookums swiveled his head and regarded Mike.
"Oh yes. He tried to help me."
"Did you need help?" Mike growled in spite of himself.
"Yes. For my experiment. And you offered help. That was very nice. Leda says it is nice to help people."
Mike the Angel carefully refrained from asking Snookums if he thought he was people. For all Mike knew, he did.
Mike followed Snookums and Leda Crannon down the companionway.
"What did you do today, honey?" asked Leda.
"Mostly I answered questions for Dr. Fitzhugh," said Snookums. "He asked me thirty-eight questions. He said I was a great help. I'm nice, too."
"Sure you are, darling," said Miss Crannon.
"Ye gods," muttered Mike the Angel.
"What's the trouble, Commander?" the girl asked, widening her blue eyes.
"Nothing," said Mike the Angel, looking at her innocently with eyes that were equally blue. "Not a single solitary thing. Snookums is a sweet little tyke, isn't he?"
Leda Crannon gave him a glorious smile. "I think so. And a lot