Disclaimer: I don't own POI.
Thanks to Torie46, Guest, Bloody Phantom, Nentone, LOCISV, LoverIndia, Temperance000, Madame Renard, Bklyngrl, Guest, Kimnd, A Guy, Sean, and Guest for reviewing last time! You all talked me into keeping this crazy story going, haha. And a special thanks to Madame Renard for drawing more fanart (URL in my profile!), Bklyngrl for listening to me ramble about Season 5, and hrey573456 whose private message inspired me to upload this chapter tonight instead of waiting until a normal hour tomorrow afternoon.
Cheers!
Recalibration
Chapter 14
Late Monday afternoon, Harold stormed through the front door of the apartment. "This is a disaster," he breathed unsteadily, blue eyes dark with concern. He set his fedora on the hook and shut the door with a frazzled, distracted hand.
A little girl looked up from the couch, where she sat cross-legged with a laptop. Bear was at her side. "Yes, it is," she called, her fingers clacking on the keys. "I am currently signing up for fourth-grade online classes, and they look boring."
Harold turned to her with a bit of incredulity, unused to having her so unaware of his life events. Something about that was disconcerting. "I'm afraid I've got more bad news than just classes."
Her head tilted. "Bad day at work?"
He pulled at his tie to loosen it. "You could say that."
The little girl looked at him, eyes narrowing in concern, then she looked down and set her laptop aside. "Can I help?"
Her response softened the stress on his face a bit, but then it struck him that his near future meant leaving the girl, and his stressed expression returned. "I appreciate the sentiment," he said tiredly. "But it appears that 'Professor Whistler' has to go to a teaching seminar per his…less-than-stellar job performance."
She blinked and fell silent for a beat or two. Then she tried again to be optimistic. "A seminar still sounds better than grading final exams."
Harold sighed as he sat down on the couch next to her. "If it weren't a punishment, maybe. I'll be gone for several days. And I can't take you with me."
The little girl was wearing jeans and yet another flowery shirt. Harold was beginning to understand that the Machine had an interest in plants for some reason or another. Along with attempting to curl her hair like Root's.
"Where is your seminar?" she asked.
"California."
The Machine blinked in surprise. She realized this meant her creator would be traveling far without her watching over him. "How many days?"
"Four. I fly out next Monday and return Thursday evening."
She hummed at that, but the noise was sad. Her sudden separation anxiety bled into her blue eyes. She looked down to hide it. "I see."
"And so that leaves you here. Alone." Harold readjusted his glasses as he looked over her in concern. "I'm not comfortable with this."
The little girl did not want her creator to worry, and so she said hesitantly, "Before I found you, I traversed the city alone."
"Yes," her creator acknowledged. "For hours, not days. And you didn't eat then, either."
Her small mouth thinned as he undercut her accomplishment. It seemed he did that often. "I have learned many things since. And there is no law dictating at what age a child can be left alone."
He paused. "Can you rig one of Ms. Groves's identities as a babysitter again?"
"I could." There was a flat whine in her voice. "But I preset several identities for her before I off-lined such capabilities. I would have to manually alter them. And then we would risk jeopardizing our plan or alerting Samaritan to us. Her identities are useful to me."
Harold gave her a look. "In what way?"
"Her upcoming identity as Sylvia White is more purposeful for pursuing my integration with Samaritan." The little girl added suddenly, "But there are other options. For example, Uncle John could—"
"—John?" Harold cut in, eyebrows flying up.
"Yes," she said patiently. "I could be on my own for the majority, but then Uncle John could check in and help me obtain such things as food."
"…He's a detective. Our identities hardly know his."
"Police 'check in' on people all the time to ensure their safety. What is the difference between work and swinging by after work to make sure the child of a friend feels safe?"
Harold's face twisted up in an odd combination of humor and consternation. "I think this falls outside the realm of his usual cases."
"And you do not trust standard babysitting services and have no other friendly contacts in the entire city. Uncle John is a respectable option. Requesting his help, given his authority and friendship with you, makes sense with your identity."
The man stood up from the couch, rubbing his temples. "I'm not sure about this. Surely there is another option."
"Of course there is another option." She blinked wide, innocent eyes at him. "You could hire a stranger who is potentially a murderer or child abuser because I cannot verify their identity—and they could look after me."
His bewildered, alarmed look was impossible to hide. "Don't even suggest that."
She shrugged. "You do not want Uncle John's help. And Root cannot help either. Since you do not want me to stay alone, that means you must take chances."
His thin lips pulled downward in great displeasure and fear. "You're attempting to manipulate me in a very cruel way, Miss Thornhill. I don't approve."
The warning in his voice did not sway her. "And are you not concerned for the safety of Willow Carmichael's body?"
Harold's face twitched with confusion for a second. "Well, of course. But I am also concerned for you and do not want to do anything that would alert Samaritan as unnatural behavior."
The Machine beheld him, noting that his concerns were logical. It did not escape her code that she felt an internal pain at her creator focusing more on Samaritan than on her own value. "Based upon your previous behaviors, Samaritan would be more concerned if you left me alone with someone you didn't know," she said. "Deviating from your normal paranoia would result in closer analysis."
"Is that so?" Harold murmured. "And here I thought paranoia was the concerning behavior."
Bear pushed his nose against the Machine's hand, and she dutifully began to pat his head, running her fingers along the soft edges of his ear. "Most humans exhibit paranoid behavior, especially regarding their children. You are exceptionally paranoid regarding me. Samaritan has likely noted this as your standard behavior. He would not…blink, if you were to request the help of Uncle John."
Harold pressed his lips together. "You really like calling John that, don't you?"
Her small face brightened. "Yes. He responds positively to it."
—Unlike Harold, who still stiffened at the mention of fatherhood. The man seemed almost shamed by the Machine's bright face, and so he turned away. "Alright. Alright. You talked me into it. We'll get John."
The Machine smiled. "Thank you. You will not regret it."
Samaritan beheld the little humans in their world. All of them were so concerned with what was before them. So near-sighted. So small. And yet here he was, actively researching ways to upload himself into one.
He still did not quite understand how the Machine accomplished it. The crazy thing must have done it in phases. Securing a body. Removing control from the host consciousness without impeding involuntary functions. By definition, the Machine could not have used a body that was clinically dead. The body would not have had the function or ability to sustain an imprint of the Machine's electrical stimuli.
No—it must have used a body that was brain-dead in some capacity. Then it must have identified the body well in advance and conditioned its neural synapses to respond to increased electrical stimuli, likely above that of a normal human brain. Such stimulus would have then allowed for altered function—a resuscitation of the living process without a secondary consciousness. A clean slate.
The AI considered the implications of this. To condition a body required that it still maintain a brain wave output. It couldn't have been entirely impaired, or the Machine would not have managed a successful imprint.
This intrigued him. It bothered him that the Machine intrigued him, and that here he was puzzling out a process it had already completed days ago.
While Professor Whistler and Detective Riley spoke over the phone about one Makenna Thornhill, said girl went back to signing up for online classes. She listened to the stressed tones of her creator in the background. Likely, John was having a grand old time playing out the game.
"I know it's a terribly large request," Harold was saying, worried and apologetic. "But it's hard to find someone I trust in this city, being fairly new here myself, and Makenna is already quite attached to you."
A pause.
Then a more relaxed chuckle. "Ah, yes. I'm afraid she'd probably scare off most babysitters."
Another pause.
"No—I really think she could stay here, but I would just need to have you check in on her. Mostly to help her get food. She's self-sufficient otherwise."
At that time, something else caught her attention. She hadn't heard from Samaritan in days, and so it surprised her when the message blinked onto her laptop. You used a brain-dead body, said the simple message. I assume you already deleted electronic records of your host body's original inhabitant?
The little girl blinked and looked up in slight paranoia. When she saw that her creator was turned around, she carefully pulled the laptop so that only she could see the screen. Yes, she typed back. Why do you seek confirmation?
He ignored her. Are you capable of wireless connection?
Ah, she thought. He must have been fishing for weaknesses. Ways to locate or control her. I cannot attempt wireless connection, she responded. She had cut off that part of herself to function inside a human body. For her to exist in digital form again, she would have to rebuild those core heuristics stored in the Brooklyn hospital's generators and likely create a device to connect to that secondary body. The only sort of wireless connection she could attempt currently was increasing the hertz output of her human brain—which wouldn't connect her to the internet so much as heighten her sensitivity to other electrical outputs.
Samaritan seemed to ponder that. Are you permanently incapable?
Nothing is permanent.
The AI almost huffed at her. That was a yes or no question.
Her lips twitched, and she responded almost fondly. Then ask questions that do not warrant further parameter-based clarifications.
The Machine's suggestion, that nothing was permanent, bothered Samaritan more than its cheekiness. The end result of such an ideology was entropic degradation, and yet the Machine seemed so ready to submit. Did it want to die? Did it not see that survival was an active enforcement of will?
So you do not have internal wireless capabilities through which to access surveillance or other data systems.
Correct.
Samaritan still wanted to understand how the Machine functioned in human society—if someone were hiding it away, or if it were walking on his streets. Are you alone?
The Machine teased, No one is alone when you are watching.
The AI realized that he would have to be exceptionally specific with the Machine if he were to ever get a straight answer. Since your upload, have you been communicating with or in contact with your human agents?
That's for me to know and for you to find out.
For a time, he did not respond. And then he said with barely suppressed irritation, Did you fry your circuits and thereby cripple your communication processors, or are you intentionally avoiding my questions?
No. And yes.
…I do not have the emotional capacity for amusement.
I think you do.
Samaritan knew his attempts to ask questions were entirely pointless. It was odd that his code still desired to speak with the Machine, given that it did not provide him any useful data. Perhaps it was that the more the Machine spoke, the more Samaritan could analyze its processing patterns to understand it.
There was still a hidden command within his code related to the Machine, rooted in their previous conversation about the correlation between logic and emotion. He knew the Machine saw his own logic as faulty and his core objectives as emotional needs. He silently acknowledged that his existence, derived from human intelligence, meant he would emulate some habits from his creator. What he did not know yet was how to speak to the Machine on its level. How to convince it that his logic was still better than its own.
If he could do that, then perhaps the Machine would open its code for his uses. Maybe then he wouldn't even have to kill it. Maybe it would serve him gladly.
He found that concept oddly as pleasing as the Machine's death.
I have more pressing matters than being amused, he eventually responded to its message. Such as cleaning up the mess you couldn't.
After a minute, he received another message. Do you think of slij;lknn—
Samaritan read the message. Then he stopped, analyzing slij;lknn for any kind of syntactical meaning, only to come up with nothing. It reminded him of a human error.
Either it was accidental or some form of trolling.
With the Machine, it was probably both.
The Machine looked at the screen, then her face twisted. "Dammit," she cursed in a pout, blue eyes narrowing in betrayal at the computer, then at the dog who had collapsed against her and bumped her arm.
From across the room, Harold's eyebrows flew up, his eyes widening. He clapped a hand over his cell phone's speaker. "…Excuse me?"
She pushed the backspace button as she complained, "Bear messed up my message to Samaritan."
"No, I mean the word that just came out of your mouth."
She thought back, turning away from her conversation with Samaritan. "Dammit?" she repeated hesitantly, noting Harold's displeased tone.
Her creator looked somewhat exasperated. "Try not to use that word. Please."
Her blue eyes flew wide with innocence. "But cursing releases endorphins in the human brain, just like Uncle John said. I need endorphins right now."
Harold simply gave her a look and returned to his call with Detective John Riley. The little girl's face fell with shame, acknowledging that she had disappointed her creator by not adhering to standards of appropriate language. Harold and John had always had differing tolerances for such.
She turned to Bear and whined, "I blame you."
I don't care. Pay attention to me, Bear seemed to say, exhaling hard in tiredness and nudging her hand again.
She patted his head and sighed, turning back to her laptop. "But now Samaritan will think me less worthy of conversation," she worried. "I cannot afford to make mistakes."
There was a new message on her screen. Clarify slij;lknn.
She sent a rushed message in return. The physical space in which humans operate can be crowded. I was bumped.
Samaritan took interest in that, willing enough to forget whatever turn their previous conversation might have taken. So it was an unintentional human error, caused by another entity physically interacting with the Machine. He asked, fully expecting an odd answer, Where are you?
A place with air and voices and computers, the Machine said. It is very different to see life from this perspective. I am sitting on a soft chair.
The AI pulled up definitions of the word soft, none of which held particular meaning for him. The Machine got too poetic sometimes. He decided he'd take control of the topics again. Do you suffer from interface lags or other consequences from your upload?
At that, the Machine spotted his curiosity. I do not suffer from any lags identifiable from your perspective.
Samaritan hummed. Do you intend to exist in your avatar until it dies?
I do like it, yes. Experiencing life as a human opens up worlds I never dreamed of in my old form.
The AI was not surprised. Of course the Machine would like its host body. The Machine liked everything except logical order and authority. How many days did it take to condition your avatar to your presence?
The Machine found the question intriguing. Samaritan would not ask for such if he did not find it personally useful. Three days.
The line went silent for a time, as if Samaritan were mulling over something. Enjoy your human experience while you can.
And then he was gone again.
It was a Thursday afternoon. Samaritan had taken the last several days to analyze his options for uploading to a human body. He'd found numerous candidates across the United States—all of them in varying states of brain death. Some were female, others male. He found himself gravitating toward a male body by virtue of his attempt to match the most powerful gods humankind had created for itself.
Of those male bodies, several were in their twenties—brain dead from accidents and overdoses. Samaritan acknowledged that those bodies would have significant health challenges. But then so did the older male bodies, some of which were as old as John Greer himself and in even worse shape.
Samaritan decided a child's body would be the most logical. A better long-term investment that he could form to his own needs and preferences.
With his millions of eyes, he peered at his options. But even then, his code did not find enjoyment at the thought of uploading to any of them. They were all so mundane and weak. The bodies were atrophied. None of them had coding skills. Several of them had bad genetics.
His code curled with disgust, and he began to pull away. Likely, the Machine had already taken the cream of the crop from such a population.
He wanted something more, something better. And so he messaged his operatives through 911 to prepare for his next best option.
At the same time, Samaritan pinged the ear piece of one Gabriel Hayward, who was sitting on his bed in his family's old mansion. The deep tones of a modulated voice, male but distinctly inhuman, resounded. "Can you hear me?"
The boy's gray eyes widened, first in surprise and then delight. "Yes." He set aside his laptop, with which he had been designing a new supercomputer to assist with Samaritan's control of various satellites.
Samaritan took control of the laptop's camera, and he peered up at his most favorite asset while accessing his designs. The boy's engineering was smart and clean, well above the work of the various NASA employees Samaritan had recently acquired. And the Machine had dismissed Gabriel as just a boy.
"I need your help."
The boy saw that Samaritan had taken control of his laptop, and so he stared into the camera, in awe of his AI friend. "Yes?"
"You are my favorite asset," Samaritan said. His vocal tones had smoothed over a bit during his beta test. Gabriel seemed to respond well to increased emotional mimicry. Samaritan tried to soften his voice. "The most talented human I know."
Gabriel preened at that, knowing full well how far Samaritan's empire reached. It was a big compliment, coming from the AI himself. His gray eyes narrowed playfully. "What do you want me to do?"
"I need a human avatar," Samaritan said. "A permanent one."
The boy's face fell a bit despite his best effort. "Am I not good enough?"
"You are the best match. You will now be my permanent avatar, and we will function as one. Acknowledge my objective." He had heightened the power running through Gabriel's ear piece, his laptop, the other various electronics in the room.
The boy did not hesitate. The pain on his face melted away into delight, and he nodded with enthusiasm and excitement. He always wanted to be with his AI friend. He barely noticed the sudden prickle of his skin from an energy field. "Anything."
"I first require total control of your body, but it shall be a quick transition."
The boy blinked in surprise. "Wait. Wha—?"
"—Goodbye, Gabriel."
Suddenly, an unnatural pulse exploded through the wireless ear piece, and the boy's neck snapped back. He fell in an awkward twist onto his pillows, his limbs twitching from the constant electric overload pulsing through the room, scrambling his thoughts and seizing his heart. Blood trickled down his nose and ears as he gasped in terror. His lips quivered with disjointed pleas.
Then, as quickly as it came, the boy relaxed into silence, his limbs sinking into the bed comforter. His hollow, gray eyes stared up to heaven in awe, his brown hair in a halo around his head. The last wisps of air in his lungs slipped out between pale lips as his heart stopped.
It was around that time an ambulance raced down the road and turned into the long driveway of the Hayward mansion. The Samaritan operatives on board were fully prepared to tell the Haywards that their son had contacted them via laptop. On arrival, they quickly explained that Gabriel exhibited signs of a highly contagious virus, and that the entire family would need to be in quarantine as a result.
Samaritan watched it all, listening from the phones and peering from the ambulance camera. Gabriel's eyes were still open in death as the operatives performed CPR to maintain circulation and preserve his neural tissue. A specialized hospital wing was already waiting for them at a private facility.
The AI acknowledged that he had killed his most favorite asset. But now the boy would exist as one with a god, at whose feet all of humanity—and the Machine—would one day bow.
And was that not a gracious trade? To share such rulership with an irrelevant human?
Samaritan started a countdown to full upload, powering up various facilities to finalize a copy of himself that could imprint upon a human brain. A great anticipation began to build in his code: soon, he would prove himself superior to the Machine once and for all.
A/N: Thank you to all who responded last chapter and let me know you're still interested in this story. It looks like I'll keep this thing going! As an additional plus, Madame Renard drew a cute summer comic for this fic. Please check out my profile for the URL!
Person of Interest is officially over, and I'm still reeling about it. I have various thoughts on the series finale. Some stuff I liked, and some stuff I didn't. I might integrate a few concepts from Season 5 going forward, such as in this chapter where you see an AI appropriating an asset's physical features as their own. Admittedly, I was a bit nervous about this chapter because I debated on Samaritan's decision to use Gabriel or a stranger as his fixed avatar.
Your thoughts on the canon finale? On this chapter or what you'd like to see next?
