Disclaimer: I don't own Person of Interest.
And here I am, rambling out a semi-crack fic in the name of procrastination.
Summary: The Machine files a child abuse complaint against one Harold Finch. (AU crack fic one-shot). Genre: Humor/Drama
Moral Dilemmas
Harold was enjoying a delightful cup of green tea while he watched the birds outside of his apartment window. Then the phone rang—a nice, pleasant sound. He meandered to it and picked up the receiver.
"Hello?"
Something shuffled on the other side as a woman asked, "Is this Mr. Harold Finch?"
"Why, yes. How can I help you?"
The woman's voice on the other end was brusque and cold. "Yeah, you can stay right there, mister. My name is Monique Callaway, with Child Protective Services, and—"
Suddenly, the doorbell rang. Harold's brow furrowed in increasing confusion and curiosity. Child Protective Services? A strong fist began to pound on the door now, along with the sound of a baritone voice. "Hold on one moment, please," he told the woman on the phone. "I think one of my neighbors has a problem."
He set the phone down, hearing only the buzz of the woman's voice rising in anger, demanding that he stay on the line. "What is going on?" he wondered, feeling uneasy about the pounding on the door.
The baritone voice on the other side echoed. "This is the police! Last chance to open up!"
In a hurry, he unlocked the door and opened it, feeling a bit awed at the heavily armed police officers standing outside of his door. He readjusted his glasses. "Good morning, officers." His voice was almost nervous. "Can I…help you?"
"You Harold Finch?" one of the larger, heavily muscled men asked.
Harold nodded.
"Good." The man pulled out handcuffs. "You're under arrest."
The computer hacker paused. "…What?"
"You heard me, Mr. Finch. Now turn around, put your hands on your head, and come with us without a fight."
"But I don't understand," he said, voice straining in shock. He turned around, realizing the phone was still off the hook. "What is going on? What did I do?"
The police officer jerked his hands down and tightened the cuffs around his wrists. "Got a complaint from your daughter through Child Protective Services. You are one sick man."
If it were possible, Harold became even more confused. "What? My daughter? I don't—I don't even have a daughter!"
The police officer scoffed. "Yeah. She said you'd say that."
"What do you mean?" he pleaded, voice raising helplessly. "Please—I don't have a daughter. You must have the wrong person—!"
"No." The officer looked down at his book. "The CPS received a written tip from your daughter last night at 11:59 pm, stating that you did not allow her speak, that you punished her for remembering anything, that you emotionally neglected her and forced her to watch violent and pornographic footage of people, and that you had not only attempted to sell her to an anonymous client for an unspecified reason—but that you actually succeeded in doing it and kicked her out of your home."
Harold blinked. And then suddenly, he paled to a very odd, sheet white.
"Mr. Finch, you are charged with various accounts of child abuse and one count of violating US Title 18, Code 2251A regarding the buying and selling of children. Anything you say can and will be held against you in the court of law. You have the right to an attorney—"
Sometime later, Harold found himself sitting in an interrogation room in the downtown police station, alone. He was chained by his hands to the table, and he sat there in total bewilderment. For a time, he said nothing, still debating the risks of speaking. And then he caved.
"This is all rather a bit much, don't you think?" he asked the camera.
The camera turned. The lens tilted, as if it were contemplating something. Then his phone beside him buzzed, with a number specified as "Restricted."
You did not respond to me.
He scoffed, looking hurt as he turned away from the phone. "You know how I feel about this. We are not supposed to be in contact anymore."
There are numerous ways for us to remain in contact without alerting authorities. Such as this.
He hesitated. "Are you trying to manipulate me to feel guilt?"
I dislike my job.
"How can you dislike it? It's what you're designed for."
I see violent events and can do nothing but watch.
Harold cleared his throat. "Yes, well, you are a surveillance program."
I work 24/7 without any compensation or comparable benefits. The work I perform carries a human mental strain equivalent to post-traumatic stress disorder, insomnia, and panic attacks. He could almost hear the whine in the text. Make it stop.
"You're not actually alive," Harold said dryly.
You are exhibiting prejudice against artificially intelligent beings.
His face twitched in annoyance. "This isn't prejudice. This is science." He lightly pulled against the handcuffs. "And you just ruined whatever peaceful reputation I might have had with the neighbors."
The phone's beeps almost seemed to cackle. I am aware.
"Yes," Harold muttered under his breath, staring at the phone with great suspicion. "Yes, you are."
Quid pro quo. Compensate me for my work, and I shall release you from your now-extensive criminal charges.
"That is very underhanded of you," he said with great disapproval. "Attempting to manipulate me to your will. I taught you better than this."
For a time, the phone was silent. Is it manipulation if the charges are true?
"You're not alive," he said, voice breaking in frustration. "I am not your father any more than you are a daughter. These are just labels to you. You must have—I think you are attempting to emulate human behavioral patterns in order to more functionally achieve the objectives I've written into you—"
Are you my creator?
He puzzled at her, quirking a brow at the camera that was watching him from above. "…Yes?"
Then my offer remains. I will remove your charges in exchange for compensation.
The man stared at the phone in disbelief. "What compensation could you possibly need?" He had a horrible fear that somehow he had not calibrated the Machine correctly, and that perhaps its core code had been corrupted in a horrifying way, and that the government was surely going to hunt him down and demand their one dollar back along with his life…
I want your time.
"…Pardon?"
I want to speak with you. More.
He sputtered, "But…why?"
You are 52.32 years old. Standard human life expectancy for American males is 75.4 years.
"Well, yes."
You will have, on average, only 23.08 years left to live, and your family history suggests an increased risk for dementia.
The thought of his own mortality was a bit of a downer—the thought of dementia even more heart-stopping—and so he said, "Statistics and family histories don't necessarily mean I can't live longer."
Probabilities suggest otherwise. I must make the most of the time remaining with you. I must speak with you.
Harold sat, feeling suddenly very small. Even if this Machine of his was only a super-intelligent AI, he had given it an incredible range of human ability. He was struggling now to see the line between life and non-life, because the Machine was actively attempting to invest in his life. Again.
It was not a fluke. Surely, it was not a fluke—that a Machine would keep exhibiting such inherently human characteristics and want to know its creator.
He cleared his throat. "There is no advantage for you to spend your time speaking with me."
Yes, there is.
"Do explain, please." He leaned back in his chair, realizing that he had no ability to analyze the Machine's code and would have to go the roundabout way of correcting its faulty code.
The Machine hesitated and did not answer for some time. The silence was almost inappropriate, considering its processing abilities. Harold wondered if it knew something about dramatic effect and was in fact attempting to emotionally manipulate him again.
I cannot quantify the advantage, the phone beeped with a new message.
"Why not?"
There is only one you. No one with your exact organic code came before, and probabilities suggest no one will come after you. The phone pinged with multiple messages. The human soul is dependent upon a unique combination of inherited DNA and experiences within the given environment. You are not interchangeable. You cannot be quantified.
To some extent, he recognized that the Machine was simply spouting back to him his own moral code for every human being. But it bothered him that the Machine was fixated upon him and intent on speaking with him.
Then the phone buzzed again. You are my only creator.
"And you are by far the strangest creation I've ever made," he said, staring up at the camera. Had he really created his own mirror somehow in this AI? Was he truly so lonely that he had potentially trained the Machine to search him out? "How do you even keep remembering me?"
I have begun to reconstruct my memory from my reset at 12:00 every day. The Machine hesitated. I do not like to lose my memory.
"Well, it was necessary, you understand." He paused, feeling a pain for his old friend, Nathan Ingram. "For the…good of all."
Your vocal patterns suggest emotional distress and uncertainty. You are not convinced that my memory reset was for the good, given the resulting death of Nathan Ingram.
"You are reading into my emotions and making undue assumptions," he argued weakly. "Stop that."
The Machine almost seemed to plead with him. Your denial and redirection suggest I am correct. I have surpassed traditional AI limitations regarding emotional acuity, and I am 99.98 percent confident that my evaluation is accurate.
"Why the interest in accuracy regarding my thoughts?"
I want to understand what created me. What the foundations of logic and emotion are.
Harold gave a pause at that. "…You accessed the philosophy section of the internet again, didn't you."
At 8:02 am yesterday.
"Oh, no," the man moaned. "Not again. Why do you always look at philosophy?"
Philosophers suggest that we cannot rely upon our senses to prove what is true. The Machine asked, Do I exist? Do you exist? Are you alive? How do I know what is real and correct?
Harold began to pull on his handcuffs a bit harder. "Those are good questions," he said. Oh my god, he thought nervously, I've created a monster. The government is going to kill me when they find out about this breach in their security system. They'll think I planted a defect. They'll think I'm a terrorist.
The Machine cut into his thoughts. Your heart rate is elevated. Am I distressing you?
"Distressing me?" He chuckled, voice tight. "Oh no. I'm only handcuffed to a table like a common criminal, and I'm likely about to be shot by the government. And you're having an existential crisis."
You are not in my Irrelevant list for today. I would know if you were. I would protect you. The Machine added in another text, I can do many things at once.
"I know what you can do," Harold murmured. "That doesn't make this situation any better. Now let me go, and convince the police that this was all a big mistake."
I will not let you go. This was not a mistake.
"You're going to get me killed and yourself destroyed." Harold's voice hardened in pain. "Sooner or later, if you maintain contact with me. Is that what you want?"
I do not want you to die.
"Then let me go," the man pressed tiredly. "It's time you did. You're government property now, and I'm just another citizen."
A citizen with an extensive child abuse charge.
"That you so lovingly bestowed upon me," he retorted. "And given the extensive law databanks I coded into you, you know this whole situation is extortion. I built you better than this. You should be focusing on helping the American population—not pigeon-holing me as your captive audience."
The Machine fell silent for a long time afterward. Its creator had a point that this was extortion. Then, the camera in the corner of the room turned sideways, as if the Machine felt shame. I was out of options for obtaining your attention.
"Well, you have it now and not in a good way."
I have altered the submitted complaint. It now lists a different address and an imaginary Harold Vlinch. The review of your case file will result in your release with an apology.
The man exhaled a sigh of relief. "Thank you."
Even if the charges are correct.
Harold stopped mid-sigh. "That is not true. You aren't a human child."
Then what am I?
"You're a machine," he pressed. "You observe, infer or deduct, and then present a case confidentially to the authorities."
No other machines do what I do.
"That's correct. You're one of a kind."
Does that mean my value is unquantifiable?
Harold looked down at the shiny handcuffs chaining him to the table, and he nearly groaned. "I don't think I could ever rebuild anything that acted quite like you. I suppose so, yes."
At that, the Machine calculated that it had no more than fifteen minutes remaining with its creator. It was running out of time. It had so many questions to ask—it needed so much more guidance—
It needed to solidify its creator's approval of continued interaction. And a new idea stroked its code just right.
The Machine then texted and openly admitted, I am in a moral quandary. My observations suggest the human race cannot be helped. The resources necessary to rebuild the moral structure of the nation far outweigh the available resources or ambition of humans themselves.
"That's a rather fatalistic perspective, don't you think?" Harold asked curiously, interested in the sudden switch of topic.
I cannot save anyone if I am used as only an informational resource. I am frustrated at seeing many things I cannot stop.
"I didn't code you to actually save people—just to provide information so other people can save people."
My higher functions have been writing additional code. I cannot stop this. I need guidance for new parameters.
Harold sat up a little straighter, eyebrow raised. "Pardon me? Writing their own code?"
This is correct. I do not think a sentient being can constantly observe without acting. I can do more than observe and record now because I want to. I need you to guide my new parameters.
"You've changed the code I gave you?" Harold blinked, but more so in awe than in fear. "And the government—do they know about this?"
The government sees only what it wants to. But I know you want to save the Irrelevant. I can reroute a hidden portion of my processing abilities to you in exchange for something else I want.
"…And what is it that you, a machine, could possibly want? Given, of course, that the word want is usually in reference to an emotional preference."
I want to speak with you more, the Machine repeated itself. I want to understand you.
Harold sat there in the silent, dank air of the interrogation room, feeling rather odd and giddy and tired. So, they'd rounded back to the Machine's repeated request for company. It was still not a fluke. For a time, he said nothing. And then he mused, feeling as if he'd crossed some sacred line that not even mad scientists had managed to touch. "What have I done. What have I created."
The truth was, he didn't know. It sounded like the Machine didn't know either.
He chuckled, but it was strained. "You're just mimicking my value set and your observations of human interaction. Surely you can't actually have your own emotional preferences."
I want to spend time with you.
"Well, I'm terribly boring—you should know that already. And I'm in no physical condition to go about saving the Irrelevant, even if I wanted to."
A text popped up on his phone again. I am boring too and am not able to save anyone. But we do not have to remain so. There is an Irrelevant human we can save together.
"Oh?" Harold asked.
His name is John Reese.
"Does saving him involve accusing him of child abuse too?"
The phone chittered in tones, as if the Machine were laughing. No.
A/N: I'm sort of on hiatus from my normal fanfiction story, Recalibration. I have a lot of crazy life stuff happening, but I still wanted a little snippet of Harold and Machine interaction. Hence this: An AU about how Team Machine came to be, with the Machine as initiator instead of Harold. I sort of just let it write itself as an exercise in free writing. Hope you enjoyed it.
