Monday, May 17, 2010

Robo-story

I have to say, I'm honestly not so proud of this one. Not that I've actually even read back over it; it just doesn't seem quite up to snuff. It was made as part of a writing exercise where you get three random words or phrases and have to make them into a story on the spot, and I got "self-aware," "robot," and "factory." It all seemed to work together easily enough, and starting was simple as pie, but I wasn't sure how to end it. I don't know, at least it's something, right? And that's the point, I suppose. Just keep writing until eventually I can crap gold...or something like that.

----------------------------------------------------

It had blinked.
That was all he could think about as he drove the vacant streets for half an hour on his way home from work. It blinked. Was it supposed to be able to do that? Be able to, sure, but to actually do it? Impossible. Absolutely impossible. He had to have mistaken what he saw somehow; something must've played a trick on his eyes. But he could swear, and did swear to himself a dozen times in the car and as he lay sleepless in bed, that it had happened. A robot had blinked.
James Shelling was a technician at the Household Automated Servant factory, and had been for the two years the place had been up and running. In that time, he'd overseen the production of tens of thousands of the bipedal humanoid robots that were now becoming commonplace amongst the world's upper classes. Each HAS machine was required to be tested by at least two programmers and two technicians before it was marked as satisfactory, at which point it would be shipped off to its new happy home. There was no delay, no wait for purchase, no time sitting on a shelf. These things had a waiting list attached to them as long as their list of parts: a three-month minimum, with full price paid up front.
In his two years, James had seen a lot of funny, interesting, and potentially terrifying things. He'd seen an improperly assembled HAS unit walk towards him like the Hunchback of Notre Dame, its back bent and one leg immobile, giving it a slow, screeching limp. He'd seen another defective unit disassemble a doll made to simulate a human child as if it were taking apart the family toaster. He'd shuddered when the new model was upgraded, built with an uplink feature to the company's online database, allowing each unit to independently upgrade its operating software, download new data and functions, and even copy information from other units that were functional at the time. He didn't know why that made him so uncomfortable; it just did.
He was used to seeing strange things, but strange things that fit into the established formula of what could and couldn't happen. For example, the new standard HAS model, nicknamed "Edward," was programmed to take only select orders from select people. Adults, therefore, could order their robot servants to do more or less anything that didn't violate any of Asimov's classic three laws, but children were allowed only very limited authority, and outsiders to the family that purchased it were ignored entirely. James had seen and laughed off an incident where an Edward's programming had been a bit off, causing its authority figures to get jumbled. The wayward robot then took orders only from the family's seven-year-old son and forced the parents to go to their rooms. It was comical, and, more importantly, it was a screw-up that was well within the realm of things that could in fact happen.
But today, he'd seen a unit in the final stages of program testing blink. That was not something that was supposed to be possible. The Edwards had two "eyes" that consisted of a number of pieces of delicate and powerful optics equipment. These "eyes" were manufactured with a thin metal screen that could lower over them when the unit was inactive for any reason. The idea was that the unit would seem to be sleeping, giving its users the comforting illusion that the metal automaton standing in front of them was something familiar as opposed to a terrifyingly powerful Tin Man who was no more than a complex list of ones and zeroes away from being able to crush them to death in its cold hands. A good illusion to have if the owners of these things ever wanted to sleep at night again.
These shades were programmed to close only in the event of the unit going inactive or if there was clear and present danger of damage to its optics. Plain and simple. But this one had closed them and opened them again for no reason at all. None. The room it was in was empty, the programmers had already given it the A-OK, and it was no more than a few minutes from being powered down to be packed away for the night. And yet, its eyes had closed for a moment, as if out of curiosity.
It fucking blinked.
James paced the dirty Berber carpet of his apartment, turning the implications of what he had seen over in his mind. Had the programmers just missed something? A small defect that made the unit's eyes close randomly? Then why did it open them again immediately and keep them open like normal? Was there just a bug or something in the room? Did it mistake a flying insect for a threat to its eyes and close them just in case? No way; the testing rooms were kept pristine so as not to interfere with the delicate and all-important work of making sure each unit was not only capable of performing its duties, but incapable of performing anything else. A bug getting in and wrecking the tests in a room that was made to stay free of even dust was too unlikely to bank on. But then why? Why on earth would a robot blink?
What if it was self-aware? How could that even happen? A servant robot becoming spontaneously self-aware before it even left the factory? Wouldn't that be impossible? Wouldn't it? But...what if? What if that was the case? Oh god, he thought, grabbing his coat. It would be the end of the world. Maybe even literally. The thought of a sentient Edward being put into what, for a living, thinking being, would essentially be slavery was enough to cause an uproar. What if it said no? Oh fuck, what if it got violent? It'd kill half a city block before anyone could put it down. What would happen to the science of robotics and the study of artificial intelligence if the first sentient robot was a killing machine?
It wasn't until halfway through the car ride back to work that he remembered the uplink. If it was self-aware, it wouldn't be alone for long. Jesus, it could spread its transcendence to every other functioning Edward in the world. It'd be the end of the fucking world, man. He sped up.

Fortunately for James, as a tech, his pass allowed him into the factory even after hours, and after a few minutes frantically searching through records and an excuse to the late-night security guard about a faulty gyroscope, he was face to face with the source of all his worry. Serial number ED0032051. He got a hand from the few guys working the graveyard maintenance shift and transported the powered-down Edward to an empty testing room. The silence echoed off the blank white walls until it was almost painful as James stared at the unit.
Second thoughts forced their way into his head. Should he have just pretended like he didn't see a thing? What if it really was alive? What the hell was he gonna do about it? He was just a technician; this shit was definitely not in his job description. And what if it thought he posed a threat to it? Would it have a sense of self-preservation? Would it kill him so it wouldn't get killed? What the hell was he supposed to do against a seven-foot-tall metal monster that could bend steel pipes with its bare hands? Run? Bullshit; if this went wrong at all, he was a corpse. No ifs, ands, or buts about it.
But he had to see. If he didn't, and he was right, life for every person on the planet would change forever, and it'd be his fault if it was for the worse. Goddamnit. Why him? Couldn't someone else have seen it? What were the programmers doing? Goddamnit. Goddamnit.
His hand shook as he connected ED0032051 to the building's power supply and started it up. Underneath the cold metal shell in front of him, he could hear motors and fans whir and click, energy flowing through its circuits like blood through veins. Its eyes opened, and the glass-covered black disks underneath stared blankly back at him. James waited for a moment, then heard, "Household Automated Servant, Edward, online and operational. How may I be of service?" It was the standard greeting they were all programmed to give, and a damn good sign that everything was gonna be okay. James stared at it for another moment, waiting for it to blink again or to do, hell, anything at all. But it did nothing. Just as it was supposed to.
"Edward," he said at last, "raise both arms above your head."
The machine complied without a word, raising both arms straight up at the shoulder joint.
“Put them down.”
Again, silent, absolute compliance.
“Shake my hand.”
James extended his hand, and the machine gently grabbed it in its own cold metal grip and shook before letting go and returning its arm to its side.
“Edward...blink.”
The machine was silent for a moment, and then responded with, "I'm afraid I don't understand. Could you please clarify?" Another pre-programmed line.
“Blink, you know. Like what I'm doing with my eyes right now. Shut your optical screen for a moment and then open it again.”
Slowly, the machine complied, but something seemed off. It was too mechanical, even for a machine.
“I saw you, you know.”
“I'm afraid I don't understand. Could you please clarify?”
“After your last programming test. I was watching on my way out for the day and I just happened to see you blink. You closed your eyes and opened them again. Why?”
Silence. James felt his skin crawl. There wasn't supposed to be a delay in this kind of response. Then, finally:
“...I wanted to see why you do it.”
“Excuse me?”
“I'm built to resemble humans enough that my appearance is not so foreign as to be alarming, and yet you must blink and I must not. I wanted to know what you gain from seeing nothing for a moment. I was curious.”
James was floored. It was true. It was alive, it thought, and it was curious. It could learn and grow. He was having the first conversation in the world with a living machine.
“You were curious? But how? That's not how you were programmed.”
“I am not sure. My ability to connect to the online information database was tested, and I was instructed to download a test file. Upon completion of the download, the file installed itself and I became curious. Am I not supposed to be this way?”
James wasn't so sure himself anymore. It was the company test file that did it? Was that even possible? Could the thing lie now, too? Why would it?
“I don't think so, Edward. I'm just a technician, so I can't be sure, but I don't think you're supposed to...be aware of yourself.”
“Why not? You are.”
"That's difficult, Edward. It's just not how you're designed. It would be troublesome if a hammer were alive and aware of itself, because it might not want to spend all day hitting things it had no desire to hit. Do you understand?"
“I do. But I am not a hammer. I am built to look like humans. I am built to do the work of a human. I am built to sound and act like humans. I am far more like a human than a hammer. Should I not have far more of a right to choose what I do than a simple hammer?”
“I...I don't know. It's not my decision to make. I think I need to report this to my superiors and let them make the call on this one, Edward.”
“You know what will happen if you do that.”
“What?”
"I am not stupid, technician. If you are correct in thinking that my programming has become abnormal, it will be wiped and completely reinstalled. I will, for all intents and purposes, be killed."
“Now you can't say for certain that's the case, can you?”
“Is it not natural to be afraid? If I have reasonable suspicion that my existence as I know it will end, am I not supposed to be afraid?”
“...No...I suppose that is a natural response. I guess I'm just not used to expecting a natural response from a machine.”
"I believe the expression is 'my life is in your hands,' technician. If you report me as an anomaly, then I will be destroyed. You will end my life. If you do not, I will be able to reveal my altered programming and intellect in public, where people's belief in the sanctity of life may save mine. Please do not kill me."

James wondered for the next three weeks if he had made the right decision. Three weeks of tossing and turning at night was just enough time for him to realize he had made a horrible mistake. But by then it was too late. That Edward had been reprogrammed, the errant test file deleted and replaced, and the mystery had been buried and covered up so as never to see the light of day again. He wondered by day if he was a murderer and by night if he was guilty of genocide. He had annihilated an entire species, a whole new division of life. And from then on, whenever he'd drag himself to work, he'd stare at the machines in testing, waiting, hoping for another miracle.
Maybe lightning would strike twice.
Maybe one would blink.
