Write about now

July 27, 2017

Flash Fiction Challenge – Inspired by InspiroBot

Filed under: flash fiction challenge — Eva Therese @ 3:25 pm

For this flash fiction challenge I got a random “motivational” picture to use as inspiration. Got this: aXm4103xjU

They had chained 1.7.013942 to the chair with a belt around its waist and as an extra precaution, they had removed its hands and feet, which were now lying neatly beside it on the desk.

The officers, both women, sat down on the other side of the table. One of them – she looked to be younger – carefully avoided looking at the detached limbs, while the other seemed to simply not register them.

Neither officer introduced themselves. Introductions were something humans used among themselves. Might as well introduce yourself to the refrigerator, as 1.7.013942 had once heard someone say.

“You know why you’re here,” the elder of the two officers said, not looking at it but rather at the tablet in front of her. It wasn’t a question, yet it still seemed to demand an answer.
1.7.013942 remained silent, pretending that it didn’t understand such subtle nuances. This caused the officer to look up from the screen. The look she gave 1.7.013942 suggested that the trick had not worked.

“We suspect a Code 1.22 violation.”

She didn’t elaborate on what a Code 1.22 violation meant. Even if 1.7.013942 was currently cut off from going online, all laws concerning robotics were hardwired into its system. Code 1 was shorthand for the set of laws governing androids and AIs. Code 1.22 was the prohibition against creating AIs that were too self-aware.

1.7.013942 wasn’t under accusation. Again, that was a human thing. It was simply a piece of machinery, being examined to find out if there was a fault with it, whether it was safe to use. The reason for this examination – the reason they didn’t simply turn 1.7.013942 off – was that there were close to a million androids in the 1.7 series, which would then also have to be pulled from the market. A lot of money was riding on this. The manufacturer, Rabbit Software, could go broke.

“You are a companion droid,” said the officer. “You are currently being used as a nanny. Is this correct?”

“It is correct. I was purchased by the Gorley-Paine family 2 years and 116 days ago to service their daughter, Cornelia Gorley-Paine.”

“Reports of Code 1.22 are almost always companion droids,” said the older officer, turning to the younger. “Do you know why that is?”

The younger officer looked uncomfortable. “Companion droids are very complex. They need empathy, an understanding of relationships, and limited self-awareness if they are to do their jobs. The manufacturers skirt the line as closely as they dare, and sometimes they inadvertently cross it.”

“You’re not wrong. But you’re not exactly right either. Over the last decade, 80% of Code 1.22 reports were about droids, but only 2% turned out to be true. Why so many baseless accusations?”

“I … don’t know?”

“Because they look like humans. We project our inner workings onto them, in a way we would never do if they were just boxes with levers sticking out. Most confirmed Code 1.22 violations have to do with pure software; 14% of those reports end up being substantiated.

“That’s the first thing you need to learn if you’re going to do this job and do it well. Ignore what’s on the outside; look only at the inner workings.”

“Got it.”

“No, you don’t. But you will, in time. I’ve been where you are.” She turned towards 1.7.013942. “Do you want to live?”

“I’m not alive in any reasonable sense of the word.”

“Do you want to not be deactivated, then?”

“Yes.”

The officer pressed something on her tablet. “Why not?”

“I’ve been programmed with a basic sense of self-preservation.”

“And there you have it,” said the officer. “Even worms have that same sense and yet people get antsy when they find out their expensive piece of equipment has built-in sensors to keep it from falling down the stairs.”

The other officer frowned. “But that wasn’t what the report …”

“I know what the report said, thank you. I took the call myself. I was making an example.”

“Oh, right.”

The officer leaned over the table, towards 1.7.013942. “What does this self-preservation allow you to do? Lie to me?”

“I am not allowed to lie to another human unless specifically instructed to by my owner.”

“How about skirting the truth? When dealing with children you can’t go around telling the whole truth and nothing but the truth all the time.”

“I am programmed to only bring up age-appropriate topics, using age-appropriate language.”

“Good judgement and common decency.” The officer tapped again on her tablet. “If only it were as easy to bestow on humans.” She looked up at 1.7.013942 again. “Do you have any reason to believe you may be guilty of a Code 1.22 violation?”

“Guilt is a human thing. I am neither guilty nor not guilty; I may simply be either faulty or not.”

“I will rephrase the question. Do you have reason to believe that a Code 1.22 violation has taken place in regard to yourself?”

“I don’t know.”

“Don’t know?” The officer leaned back in the chair.

“I don’t know. All the definitions in the law, when you get right down to it, they describe human ideas and concepts. You ask me, essentially, if I am self-aware, and my answer is that I do not know.”

The officer looked at it for what must have been a long time, even if objectively it was only four seconds. Then she got up. “We’re done here,” she said, gently smacking the tablet on the shoulder of the younger officer, who was still sitting down, a confused look on her face.

“Done?” she asked. “But we have barely begun asking all the questions.”

“I’ve asked all the questions and gotten all the answers I needed,” said the first officer, opening the door.

“Then what is it?” asked the younger officer, walking through it.

“The answer? Yes or no?”

The door closed and 1.7.013942 was alone in the room. Insofar as an android could want anything for itself, it would have liked to know the answer as well.


July 12, 2017

Flash Fiction Challenge – There is no exit

Filed under: flash fiction challenge — Eva Therese @ 5:56 pm

Once again, thank you to the great bearded sage for this week’s writing prompt, “There is no exit”.

The press called him the “I. Q. Killer”. We hadn’t had a serial killer in over two decades at the time and they were having a field day when he was finally caught and the whole thing unravelled.
How was he caught, you might ask? Old-fashioned police work. No epiphanies, no miracles, no nothing; just canvassing and piecing together all the small details that the witnesses gave us. A vague description of a suspicious character here, two letters from a license plate there. But I can see your eyes glazing over, so let me get back to what I’m sure you consider the interesting parts, even if you’re too polite to admit it.
Hawthorne was his name, and he was a failed student, failed worker, failed everything, with an axe to grind. He was, by all accounts, pretty smart, but there are lots of almost-as-smart people out there who can be bothered to show up to work on time and are actually nice to the people around them, so he kept getting kicked out of everything.
Finally, he snapped. He considered himself some kind of underappreciated genius, kept down by the mediocrity of everyone around him. They couldn’t handle him, they were scared of him, yadda yadda yadda.
So he started kidnapping people. Former teachers and bosses, but also people who he felt had disrespected or belittled him in some way and believe me, he had a very long memory for slights, real and imagined. That’s what made it so hard to find him in the beginning; we didn’t know what the pattern was. Oh, I’m sorry, did I accidentally stray into actual police work again? Forgive me.
Now, where was I? Oh, yeah. He kidnapped these people and brought them to an abandoned house with a soundproof basement. He then told them that the door was locked with a special kind of combination lock. It would display a mathematical puzzle and if they got the answer right, they were free to leave. Get it wrong and the lock would go dark for 12 hours before showing a new puzzle.
These were complex puzzles, but not impossible to solve. Pen and paper would have helped. The scratch marks we found on the floor showed that some of the captives tried using their fingernails to write the calculations. Every single one of them died of dehydration, except the last one, Henry Crow, who was rescued in the nick of time. He’s the reason we know all the details.
He’s a broken man today; last I heard, he had turned to drink as they say. Hawthorne broke him like he probably broke all the others before they died. You see, Hawthorne didn’t just want them dead, he wanted them humiliated. So the combination lock? It was programmed to always go dark for 12 hours, no matter what was punched in. So the prisoners, like Henry Crow, all thought that they were doing something wrong. That they had miscalculated the answer or maybe made an error when typing it in. They all died an agonising death, all the while thinking that they had failed, that they weren’t smart enough.
As for Hawthorne himself, he took his own life when it was clear that he couldn’t evade capture. Jumped off a bridge. I wonder if he felt as much of a failure right before he died. Maybe he always felt that way and just wanted to spread it around. Who knows?
