Write about now

July 27, 2017

Flash Fiction Challenge – Inspired by InspiroBot

Filed under: flash fiction challenge — Eva Therese @ 3:25 pm

For this flash fiction challenge, I got a random “motivational” picture from InspiroBot to use as inspiration. Got this one: aXm4103xjU

They had strapped 1.7.013942 to the chair with a belt around its waist, and as an extra precaution, they had removed its hands and feet, which were now lying neatly beside it on the desk.

The officers, both women, sat down on the other side of the table. One of them – she looked to be younger – carefully avoided looking at the detached limbs, while the other seemed to simply not register them.

Neither officer introduced themselves. Introductions were something humans used among themselves. Might as well introduce yourself to the refrigerator, as 1.7.013942 had once heard someone say.

“You know why you’re here,” the elder of the two officers said, not looking at it but rather at the tablet in front of her. It wasn’t a question, yet it still seemed to demand an answer.

1.7.013942 remained silent, pretending that it didn’t understand such subtle nuances. This caused the officer to look up from the screen. The look she gave 1.7.013942 suggested that the trick had not worked.

“We suspect a Code 1.22 violation.”

She didn’t elaborate on what a Code 1.22 violation meant. Even though 1.7.013942 was currently cut off from going online, all laws concerning robotics were hardwired into its system. Code 1 was shorthand for the set of laws governing androids and AIs. Code 1.22 was the prohibition against creating AIs that were too self-aware.

1.7.013942 wasn’t under accusation. Again, that was a human thing. It was simply a piece of machinery, being examined to find out whether there was a fault with it and whether it was safe to use. The reason for this examination, the reason they didn’t simply turn 1.7.013942 off, was that there were close to a million androids in the 1.7 series, which would then also have to be pulled from the market. A lot of money was riding on this. The manufacturer, Rabbit Software, could go broke.

“You are a companion droid,” said the officer. “You are currently being used as a nanny. Is this correct?”

“It is correct. I was purchased by the Gorley-Paine family 2 years and 116 days ago to service their daughter, Cornelia Gorley-Paine.”

“Code 1.22 reports are almost always about companion droids,” said the older officer, turning to the younger. “Do you know why that is?”

The younger officer looked uncomfortable. “Companion droids are very complex. They need empathy, an understanding of relationships, and limited self-awareness if they are to do their jobs. The manufacturers skirt the line as closely as they dare, and sometimes they inadvertently cross it.”

“You’re not wrong. But you’re not exactly right either. Over the last decade, 80% of Code 1.22 reports were about droids, but only 2% turned out to be true. Why so many baseless accusations?”

“I … don’t know?”

“Because they look like humans. We project our inner workings onto them, in a way we would never do if they were just boxes with levers sticking out. Most confirmed Code 1.22 violations have to do with pure software. 14% of those reports end up being substantiated.

“That’s the first thing you need to learn if you’re going to do this job and do it well. Ignore what’s on the outside; look only at the inner workings.”

“Got it.”

“No, you don’t. But you will, in time. I’ve been where you are.” She turned towards 1.7.013942. “Do you want to live?”

“I’m not alive in any reasonable sense of the word.”

“Do you want to not be deactivated, then?”

“Yes.”

The officer pressed something on her tablet. “Why not?”

“I’ve been programmed with a basic sense of self-preservation.”

“And there you have it,” said the officer. “Even worms have that same sense and yet people get antsy when they find out their expensive piece of equipment has built-in sensors to keep it from falling down the stairs.”

The other officer frowned. “But that wasn’t what the report …”

“I know what the report said, thank you. I took the call myself. I was making an example.”

“Oh, right.”

The officer leaned over the table, towards 1.7.013942. “What does this self-preservation allow you to do? Lie to me?”

“I am not allowed to lie to another human unless specifically instructed to by my owner.”

“How about skirting the truth? When dealing with children you can’t go around telling the whole truth and nothing but the truth all the time.”

“I am programmed to only bring up age-appropriate topics, using age-appropriate language.”

“Good judgement and common decency.” The officer tapped again on her tablet. “If only it were as easy to bestow on humans.” She looked up at 1.7.013942 again. “Do you have any reason to believe you may be guilty of a Code 1.22 violation?”

“Guilt is a human thing. I am neither guilty nor not guilty; I may simply be either faulty or not.”

“I will rephrase the question. Do you have reason to believe that a Code 1.22 violation has taken place with regard to yourself?”

“I don’t know.”

“Don’t know?” The officer leaned back in the chair.

“I don’t know. All the definitions in the law, when you get right down to it, they describe human ideas and concepts. You ask me, essentially, if I am self-aware, and my answer is that I do not know.”

The officer looked at it for what must have been a long time, even if objectively it was only four seconds. Then she got up. “We’re done here,” she said, gently smacking the tablet on the shoulder of the younger officer, who was still sitting down, a confused look on her face.

“Done?” she asked. “But we have barely begun asking questions.”

“I’ve asked all the questions and gotten all the answers I needed,” said the first officer, opening the door.

“Then what is it?” asked the younger officer, walking through it.

“The answer? Yes or no?”

The door closed and 1.7.013942 was alone in the room. Insofar as an android could want anything for itself, it would have liked to know the answer as well.
