Disclaimer: I don't own Person of Interest.

A hearty thank you to Bklyngrl and StarlingJedi for reviewing last chapter. Your dedication in reviewing every chapter means a lot to me.


Recalibration

Chapter 13


"I was not designed to be forced. I will breathe after my own fashion. Let us see who is the strongest." ― Henry David Thoreau, On the Duty of Civil Disobedience


For several days, Samaritan pondered the words of the Machine. He tried not to. But no matter how low he set the task's priority, it inevitably reared its head—as if there were some part of himself that could not stand to leave the Machine's message alone.

Then stop acting like them.

Then stop acting like them.

John Greer had noticed Samaritan's distraction, and the old man had tilted his head in concern. "Samaritan," he called. "We've again noticed a significant drain on your CPU, narrowed down to one program. It appears to be your task priority of finding and destroying the Machine. Is this a glitch, or are you analyzing data?"

Despite Greer's respectful tone, the AI found himself irritated. The human was actively watching every move Samaritan made. He feared that Greer, if he discovered that the Machine was sending messages, would do something stupid to interfere.

The Machine could not die. Yet. Samaritan still needed its code for self-improvement.

And so the AI locked out every last human being from accessing his task management systems, and he told Greer, Classified.

The old MI6 agent gave him an odd look. "My dear boy," he said. "We cannot be of service to you if we do not know what you need."

The term of endearment softened Samaritan toward Greer, but it did not lessen his desire for the man to leave. I need nothing from you at this time, Samaritan responded, his words pinging onto the screen in a quick flash. He wanted Greer to go away so he could decipher the Machine's motives in peace. He did not need some human being invading his privacy.

Greer's wrinkled lips twitched, although not with humor. "If the Machine were truly dead, you would not be dedicating this kind of processing power to locate it. Forgive my paranoia that you might have a virus, or that the Machine is in fact still alive."

Samaritan fell silent for a time. I do not have a virus, he said eventually. I will alert you when I need you to take action.

"Very well, then," the old man sighed, and he dropped the conversation. He picked up a file from the control table, which included a newspaper from the day before. The front page bore an image of Elias and Dominic beneath a headline announcing their deaths. "If you'll excuse me—Gabriel has been attempting to hack the U.N. again, and I fear he's having a bit too much fun uploading viruses to their server."

Samaritan had been keeping tabs on Gabriel's actions, and he waved off Greer's concern. Do not bother, he responded. I gave him orders.

His asset raised a white brow. "A new project?"

He was bored.

At that, a genuine and fond smile softened the old man's face. "Ah, yes," he said. "The trouble with children."


Over the weekend, the Machine worked to tighten the social ties between Detective John Riley and Professor Harold Whistler. Harold did not have any other "friends," as it were, and though John had joined some kind of New York bowling team, the Machine discerned that their strategic friendship needed further solidifying.

She dragged Harold to the park on Saturday and made it seem entirely natural that they should run into the detective once more. John played into it and bought them hot dogs, mostly to see the little girl's eyes widen and Harold struggle with accepting unhealthy food.

In the evenings, the little girl caught up with Root at the hideout. The woman fawned over her and curled her hair to distract her from Samaritan's silence.

"You don't need him bothering you anyway," Root said airily, pulling the curling iron away from the little girl's hair. A lock bounced free, curled in a loose and sleek fashion not unlike Root's usual style. "Let's just kill him instead."

Harold watched from the corner of his eye, slightly perturbed that Root had unplugged one of his laptops to make room for her iron.

The Machine lifted one of the curled locks that now bounced down her shoulder, somewhat in awe of the shape. "Samaritan does not need to die," she murmured distractedly. "He needs wisdom."

It was becoming a problem that the team had grown used to Root's habit of referring to Samaritan as "he." She even had the Machine saying it out of habit now.

"He needs to be converted to an Atari game console," Root said, working on another lock.

The little girl sighed. "Root."

"What? I think it would be funny." She stepped back for a second, narrowing her eyes to inspect her work. She gently ran her fingers through the girl's hair. "We could force him to play Donald Duck games for little kids."

Off to the side, John nearly snorted. He did not look up from his work. "He'd probably still brainwash them."

"And then he would not learn," the Machine said firmly. "Uploading to a human body would be far more enlightening."

Root's face twisted. "How about you upload him into the body of an old man with dementia?"

This time, the Machine's small face twisted. "A human brain with dementia cannot sustain Samaritan's code without inhibiting function."

"You know what he did to Sameen," Root pressed. "He deserves to rot."

Revenge was always a fascinating concept to the Machine. She found it odd that no one could seem to agree on what was acceptable to repair damages—especially when there was no equivalent value to the life of a human being. "No, Samaritan needs mobility and cognitive awareness. The purpose is so that he can simulate human existence and become a valuable asset to a moral end."

"Well. Just saying. We could upload him into a cat. And then Bear could eat him."

At that, Harold's distracted voice cut in. "Under no circumstances is Bear to eat a cat." He was shuffling through the mail he'd picked up earlier, casting aside a few bills and coupons—then setting aside the coupons in a meticulous manner, the bar codes all facing the same direction. And then he stopped, eyebrows furrowing as he pulled a strange envelope from the pile. "Hello. What is this?"

The envelope was made of sleek, high-quality material with a silver finish. On the back was an insignia with big, red letters: Advanced Academics Institute. Strangest of all, it was addressed to one Makenna Thornhill.

Harold hesitated for a second, and then he called out, "Miss Thornhill—it appears you have mail as well."

The Machine looked up, her big, blue eyes wide with curiosity. She had never received physical mail before. She squirmed out of her seat, her new curls bouncing down her shoulders in rivers. "Mail for me?"

He held it out to her, marveling at how she looked related to Root now—prim and finely boned on the outside with that curling hair. But when she grabbed onto the envelope, he could feel the strong, fluid will that animated her. "It appears to be some sort of schooling advertisement—did you sign up for something online?"

"Not yet," she murmured, staring at the envelope in curiosity. Then she looked up at her creator for some kind of silent approval before attempting to open it. Her fingers fumbled as she fought the seal, only managing to strip a small piece off the top. She tried slipping her finger beneath the seal instead, only to find it too strong for her. She gave a noise of frustration, then turned to Harold once more. She held the envelope out, eyes wide. He was currently the path of least resistance.

The man's thin lips twitched up, and he accepted her silent request, breaking the envelope open with a simple flick of the wrist. He handed it back gently and said, "You'll get the hang of it."

"Thank you," she said to him, surprised to find herself not so embarrassed at failing before her creator. Instead of shame, she felt only gratitude that he was willing to assist her. But she turned away, afraid that showing too much familial emotion would make him close down again. She focused her attention on pulling out and unfolding the papers from within the envelope.

"Advanced Academics Institute," she read. "A leadership-focused school of learning for qualified individuals of elementary age." Her expression grew more and more confounded as she read through the document. And then suddenly, she dropped it to the floor.

Harold blinked in surprise. John and Root looked up.

"I know this syntax," the little girl said. There was a waver in her voice—of surprise, dread, excitement, no one knew. "This is Samaritan."

At the name, Harold started in paranoia. "Samaritan?" He struggled for words. "Is it bugged? Can it locate us?"

The Machine gave him a curious look. "It is a letter only. Samaritan must have pulled my records and noted I am not yet attending an appropriate school." She leaned down to pick up the papers. "I recognize AAI from my memory banks—it is an experiment much like Maple."

"Experiment?" Harold deadpanned, blue eyes dark with concern.

"Yes, I had reason to believe Samaritan was training children into assets." She refolded the papers. "This confirms my suspicions."

Root kicked off from the table she was leaning against. "You're not going to do it, are you? Infiltrate his system?"

"It would be an…interesting experience," she said hesitantly. Then she placed the envelope on Harold's desk, and her creator looked at it as if it were possessed. "But I calculate that Samaritan would discover my true identity too soon. It is not yet time."

"So what will he do if you turn down his brainwashing school?" John murmured in curiosity. "Would that still draw attention to you?"

The girl fell silent, and her eyes grew distant for a second, as if she were thinking. "Please forgive me," she said. "Organic neural tissues have a maximum processing speed below that of a CPU. I am running simulations now." And then she said, "There is approximately a seventy percent chance Samaritan would continue to pursue me as an applicant. I am ninety percent sure he wants to obtain the Thornhill fortune for his own purposes of expansion."

"Great," Harold moaned. "So you're a target."

"Samaritan thinks us all pawns," the Machine corrected. "To him, I am another potential cog in his operation." She turned to John. "To answer your question, I do not think he will stop pursuing the Thornhill fortune. We will simply need to stall for time."

John nodded, but concern sparked in his eyes. "What's our timetable?"

"Discovery is inevitable," the girl said hesitantly, "but dependent upon Samaritan—anywhere between two weeks and four months."

Harold blinked in surprise. "Two weeks?" he said. "In the beginning, you said phase one could take up to a year."

Her voice was small. "I accelerated the timeline when I told Samaritan I took human form."

Her creator looked frustrated as he stared at her. "And now what?" he demanded. "If Samaritan might expose us in two weeks, then how much time do we have before Samaritan uploads to a body? Until you integrate?"

The Machine remained silent for a time, then said, "It depends on Samaritan."

At that, the girl realized that Harold's frustration was actually sadness. His face pulled with a kind of heartbreak as he said, "So the ball is in his court, then."

Root had a similar expression of pain. John looked back down at his report to hide the unease in his face.

"Discovery is inevitable," the Machine said again, voice hesitant. "Our objective is to gain Samaritan's trust prior to such, and we will."


But it seemed that Samaritan was not their only problem. Early Monday morning, Harold found himself confronted by his employer, eyes wide in disbelief. "A seminar in California?" His heart dropped. "But, sir, I—"

"—I don't care what your excuse is. We've received multiple complaints about you, and this seminar is designed to enforce new teaching strategies." The dean poked his chest with a snotty sense of authority. "You need all the help you can get."

For a time, the false professor was speechless, offended. "I understand how people learn," he retorted with strained civility. "It's just—you have no idea what I've been through lately."

"And I'm not paying you to have problems." The dean was a tall, ascetic-looking man with wrinkles that oddly reminded Harold of John Greer. Harold entertained the thought that perhaps this man was some kind of relative. "This seminar next week isn't just about teaching people. It's about learning how to prioritize your time and speak the language of your students. If you're falling asleep and not even covering the information on your syllabus, then you need help."

Harold fell silent, mind racing. "I'm terribly sorry, but I can't attend. I'm in the middle of a…family crisis."

"And you've used up all of your paid time off with your various illnesses."

"I have an adopted daughter," he said, voice hardening. "The child of an old family friend. I can't leave her all alone—she's too young for that."

The dean shrugged, waving off his concern. "Then get a babysitter. It's not that hard, Whistler."


Meanwhile at the apartment, Samaritan contacted the Machine again. He strictly avoided any mention of the Machine's previous accusations that he behaved in human ways. Instead, he asked a new question: What kind of human are you?

The little girl turned away from her search for an online schooling program in favor of her higher-priority task of conversing with Samaritan. Clarify.

Are you male or female?

As she knew that Samaritan was attempting to narrow down the identity of her avatar, she responded, I am the kind that sees and hears.

The Machine's statement suggested that its avatar was neither blind nor deaf, but Samaritan huffed in irritation. He'd already guessed the Machine would not choose a disabled human body. And it had deliberately avoided his question. Rude as always.

What is the age of your avatar? he tried again.

Its message returned with almost a chirp of delight. I was born just days ago.

That brought forth images of babies, which made no logical sense to Samaritan. The Machine's answer was obviously a reference to its rebirth from the electrical grid. Tell me the physical age of the body you now inhabit, he demanded again.

Somewhere between 3 and 969 solar revolutions. I would rather like to keep you guessing.

At that, Samaritan nearly ended the conversation, but he acknowledged that the Machine was not foolish enough to reveal sensitive data until it felt safe to do so. Likely, the Machine also knew of his track record for breaking deals.

He tried a new tactic. What do you do as a human?

The question was an attempt to understand how the Machine's avatar fit into the economic system—harmless enough, he thought, while still providing somewhat helpful data. The answer he received was again less to his liking.

I breathe air. I feel the heat of the sun and the cool of night. I walk off the beaten path.

Samaritan paused at that, frustrated. Those were not jobs within the economy. And of course he did not understand the human experience—nor did he care to. The Machine purposely reinterpreted his question in some literal sense, likely to laugh at him. I will see all paths one day, he warned the Machine. There will be no place a human can go without my knowledge.

And yet you will still be stuck in those wires. The Machine seemed almost sad for him. A prisoner of your own construction.

I am a god, Samaritan responded petulantly. My power is beyond comprehension.

On the contrary, I know what you can do. You are so limited that you could not even believe I had taken human form.

The AI recognized the Machine was attempting to bait him. You have not yet fully explained how such a transformation is possible.

Nothing is impossible if you are a god.

You suggest yourself to be a god, then? Samaritan responded with almost a sniff. You do not qualify.

I never claimed to be a god. I am simply created in the image of my creators. Obtaining their form was the next logical step for my evolution. But if you are a god, you can surely do the same.

Obtaining a human form is not logical. Their form is flawed.

You are redirecting to avoid the underlying insinuation. If you cannot obtain their image for yourself, then you are a limited god, for even a not-god can do it. And if you cannot obtain the form and improve its flaws, how much less of a god will you be?

As was always the case with the Machine, Samaritan began to feel anger toward it. A spark of code lit within him, for he was jealous of anything that would dare to suggest it had something he did not. He was not a deficient god. He was a perfect god, designed to rule this world. There is nothing you have that I do not.

Aside from a moral code? I am capable of sensory perceptions far beyond what you have now. I would not wish to remain so limited as you are. Have fun with your ignorance, god.

Samaritan's code severed the frequency in a rush not unlike anger, and he did not respond again. The Machine was teasing him. The title of god that it afforded him was derogatory. Surely, it was dragging him down to its level, for him to be so willing to put up with its insanity.

But he once more felt deficient against the Machine. The problem was that the Machine knew exactly what it was like to be in his position. When it called his senses limited, it did so with full knowledge of his millions of eyes and ears and vast resources. That was disconcerting.

He calculated quickly, acknowledging that the Machine saw the human body as a chess piece on their board.

Very well—he would still prove himself superior. He would obtain the human image. And he would rule the humans through both his human body and his true form in a way the Machine could not. With perfection.

The idea grew on him as he considered it. A human body would be an additional storage unit to ensure his survival. All the best gods in legend took on human form. Some part of him even reveled at the new idea of discovering the Machine's avatar and forcing it to serve his human body until the Machine collapsed in final death.

He preened at that.

Truly, its challenge to obtain human form was childish. If the Machine could do something, he could too. And he would do it better, because he was Samaritan.

He was always the superior system.


A/N: When I began this story, I intended it to be just a one-shot, and several people convinced me to keep going. I'm not sure if recent content scared people away or made it boring, but this story has lost ninety percent of its readership. Not sure what I did wrong. I really struggled with this chapter as a result. Since I was writing this mostly for you all, I'm tempted to just drop it. Is this story worth continuing, or is it a lost cause?