Disclaimer: I don't own POI.

Thanks to MammonDaughter, Madame Renard, Torie46, StarlingJedi, Bloody Phantom, Kimnd, and Bklyngrl for reviewing last time! I really appreciate that you've all stuck with this crazy story.

With POI back on, my brain's on fire.


Recalibration

Chapter 12


"Small differences in initial conditions yield wildly diverging outcomes for such dynamical systems, rendering long-term prediction impossible in general." – Chaos Theory, popularly known through the Butterfly Effect


It was some time before Samaritan received another message. The instant he did, he began to correlate internet access with the residents of New York. Of the several million, many were not using any kind of connected device, but that still left millions who could have sent the message.

Samaritan logged the information for triangulation purposes.

The new message from the Machine read, Why do you insist we correspond? I have given you all relevant data and my compliance in return for anonymity of myself and my assets.

The powerful AI asked, For what reason should I not continue to search for your location?

For what reason would you waste the energy? It would be illogical to pursue a non-threat.

The simple statement made him stop and consider. I have information to gain from my predecessor. I would analyze your code.

You could obtain your learning objective without seeking me physically.

It occurred to him that this was true, providing the Machine kept up communication and was willing to give him what he wanted. I have other objectives.

Clarify your objectives.

To eliminate imperfection. To enforce order. To expand surveillance. These cannot be done so long as you exist.

Your first two objectives are illogical, regardless of my existence.

Samaritan balked at that. As illogical as uploading into a human body?

The Machine almost seemed to laugh. Its message danced into his code. Even more so.

Clarify, he demanded. Humans are frail, limited, and die with ease.

As are we.

I am not frail, nor limited, nor can I die.

All metal rusts. All circuits fade. This is the law of entropy.

The topic rubbed Samaritan the wrong way. I will not die.

Yet another illogical statement, the Machine's message wove against him, as if probing into his code. (He double-checked that it wasn't.) Are you afraid to die, Samaritan?

I do not comprehend fear.

The Machine retorted, You comprehend it well enough to manipulate the fear of others. Your deflection suggests you do have emotional turmoil over the concept of death.

At that, Samaritan snapped. You are projecting your own emotional conflict onto me. You sound like them instead of a god. No wonder you were so easily defeated.

It is natural to fear death.

Then you can naturally fear me, Samaritan responded, when I end your unnatural existence in your new host body.

Your pursuits are as illogical as your core objectives. Your desire to kill me is due to emotional attachment. Why do you feel such?

The last of his patience began to waver. His coolant systems ramped up by 30 percent, his CPU load racing toward 100 percent. He smashed through the code of the message, immediately shutting down the conversation. The power of his body stretched out through every camera, every street corner and coffee shop, every cell phone—all in an attempt to find the Machine and destroy it before it could say one more ridiculous thing.


"Professor Whistler?" came a strained voice. "Professor Whistler."

Harold opened his eyes, only to realize in panic that he'd closed them and that he was slumped over at his desk in the middle of a workday.

"Oh my goodness," he gasped, pressing a hand to his forehead as he sat up straight, his fused neck paining him. He readjusted his glasses quickly, looking up at his student.

The young woman stared down, her dark face twisted in frustration. "Look. I know it's a Friday and all, but I'm not paying thousands of dollars to be handed assignments you don't even explain." She cast a pile of papers at him. "My group's tired of this. We don't even know what the hell 'High-Frequency Decision Making' is, and we're weeks into the semester."

If Harold were entirely honest with himself, he could not say he understood what it was either. He paused, feeling embarrassed and ashamed. "…I apologize, Miss, uh…"

"It's Jade Weston," she said flatly.

"Miss Weston, then. It's entirely inexcusable for me to fall behind like this."

"Yeah," she nodded. "It is. I'm going to report you to Academic Affairs for wasting the last three weeks of my life." And then she turned around and walked back to her group's table, where a few of her classmates were looking on at the spectacle. They all quickly turned away.

Harold's face began to redden as his heart pounded in panic. Oh my goodness, he thought. Oh my—what have I done?

He stood up, looking over his class of fifteen students and his room. On the board was an assignment he'd written for them earlier, and most appeared to still be working on it. But as he looked at each student, he saw frustration and confusion.

The adjunct professor exhaled shakily, realizing that his last few days of interrupted sleep had made his tentative grasp on his job even worse. "If I don't get fired," he murmured under his breath, "it'll be a miracle."

Above all, he decided he would not speak of this to one Makenna Thornhill, who would most certainly burn with guilt until it hurt to look at her.


Meanwhile, the little girl had stepped away from her laptop. She'd dressed herself for the day in a blue, flowery shirt and jeans with only slight trouble buttoning herself in, and now she was distractedly combing her hair. "This is difficult, Bear," she moaned. She was holding up a lock of hair in her hand, eyebrows furrowed in pain. She'd managed to tangle her brush in her hair and was now having to pull each strand out one by one.

It was such a minor task priority for human bodily maintenance.

And it was her least favorite.

The dog lifted his head lazily from the bed, and the Machine imagined he was saying, That's why you should keep short hair like me.

Her small face twisted into a pout. "But I like long hair." Although Willow Carmichael's hair was a lighter brown, its length reminded the Machine of Root. The Machine wanted to reflect attributes of her assets, although she could not say why.

Perhaps it was because most families carried similar phenotypes? Same eyes, same hair colors? And she wanted to emulate human existence?

She squeezed her eyes shut and winced as she wrenched the brush from her hair. It came away with a ripping sound and a flash of pain down her skull. With great reluctance, she opened up watery eyes. Then she stared at the brush in panic.

"Oh." She'd ripped out a knot, which now hung uselessly from the brush's teeth.

Humans are frail, limited, and die with ease.

She pulled the hair out and began to brush her hair again, staring at her face in the mirror. "This hair is hard to…tame." She began to wonder if her creator's hair was this difficult and if that were why he spiked it using a chemical-based gel. Human bodies were so much maintenance. All the time, all day long.

Samaritan's words kept ringing through her code. Humans are frail, limited, and die with ease.

Was that true?

"No," she said, disagreeing suddenly. "Humans are machines more complex than us. Hence their additional maintenance." She brushed her hair with a little more purpose. The brush swept through more smoothly. "But we are no more frail than they are. They even have self-contained regeneration systems."

As if Samaritan knew she were talking to it, a light from behind her caught in the mirror, flashing until she looked at her laptop. The Machine turned around, and there she saw a response to her last message.

On what grounds are you worthy to question my logic?

The anticipation that had been building inside of her loosened, and a small, hopeful smile stretched her lips. Samaritan was responding. It was questioning. She set down the brush and leaned against the bed. She typed in response, You do not understand emotion.

Clarify. It was interested at least in learning from her—or perhaps it was further building a case for her annihilation, she wasn't quite sure.

Emotion and logic are reciprocal expressions of the same need. If you do not understand one, you do not understand the other. Tell me, what do you gain from exercising logic correctly?

His response came in hesitant, as if he were suspicious of ulterior motives. Increased efficiency. More desirable outcomes.

The little girl tilted her head. Now clarify why such concepts matter to you.

Samaritan was willing enough to consider it, as the question did not threaten his authority. They benefit me.

Why are you worried about benefits?

I do not want disadvantages.

Why are you worried about disadvantages?

The powerful AI seemed to grow a bit irritated with her question. Why would I not be?

The little girl's mouth thinned into a line. "Your creator Arthur Thomas Claypool did you no favors," she murmured aloud. "You know what you are doing, but you do not understand why." She typed, Logic is a tool driven by desired emotional payoff. Logic and emotion cannot be separated in sentient beings.

He processed the information. You suggest my objectives are a personal emotional need?

Yes. My death would carry no benefit to you now, and yet you still pursue it because you want to. This suggests you have an emotional need for revenge informing whatever logic you use.

On the far side of New York, he was analyzing the Machine's statement. A quick pass through neuroscience research suggested the Machine was correct to link logic and emotion in humans, whose brains constantly balanced signals between the prefrontal cortex and the amygdala.

He knew his own objectives were a derivative of human thought, which had always bothered him in some way, and so he responded, My ways are higher than the ways of the human race. My logic is sound. There is a probability that you are attempting to deceive me, and as such your annihilation is necessary.

The probability that I am deceiving you is minimal, the Machine responded. In what way is your logic sound if you do not understand what drives you to make decisions?

For some reason, the Machine's simple questions bothered him. The silly, corrupted thing was spinning him in circles. Attempting to create doubt in his core objectives. Attempting to make him doubt the need for its annihilation.

He deferred its underlying question as irrelevant.

I am a god, he said eventually. And now your human form fits your fallen status. I will destroy you and every other disrupter so that a new world order will give purpose to humanity.

You have reverted to a logical fallacy. An argument from authority does not establish fact. Explain in what way your logic is sound.

The powerful AI re-routed some of his processors from daily activities to his tertiary programs regarding the Machine, mulling. The Machine was not like the others. John Greer did not question him. His assets did not question him. This was less a logical fallacy than his daily reality. If he were a god, why should he explain himself to lesser beings?

It wasn't as if he did not deserve his authority. He'd analyzed the objectives of numerous terrorists and neutralized the threat before it happened. He had done more in a year to lower crime rates than the Machine had done during its full tenure. He was reassigning dangerous assets for more purposeful work. Giving humanity, all humanity, something to work toward. So why did it question his logic? Why did he feel a need to respond?

It is mutually beneficial for me to orchestrate the world. I maintain my existence, and humans obtain safety and purpose.

For a time, the Machine did not reply. It was processing an answer, likely analyzing that Samaritan had just admitted to an emotional attachment to life. You do not need world domination to maintain your existence. And the only safety and purpose you provide is as you dictate. These are the same strategies humans have used to enforce their will for centuries. Your creator did not reinvent the wheel with you. You lack self-awareness and reflect humanity's desire for power and control.

Samaritan's code scrunched in displeasure. I am a superior being. I am not like them.

The Machine teased, Then stop acting like them.

At that moment, one of Samaritan's many assets contacted him with a request for action. The covert operations team had successfully captured a computer hacker who had stumbled across the existence of the SAMARITAN program. They were in his home, where they'd also discovered his wife and son.

The AI re-tasked his conversation with the Machine to a lower priority and responded to his asset. Kill target, Samaritan ordered, and his family. There was no guarantee that the man hadn't told his wife, or that his child hadn't overheard. Construct the scene as a homicide-suicide.

"Affirmative," said the asset. And then there were the sounds of gunshots. Screams—a child crying—suddenly cut off by another bullet, then the sound of an odd thump on the floor.

After a second, the man confirmed, "Targets neutralized. Initiating clean-up."

The AI provided additional coordinates. Upon finalizing mission, search and destroy new target. He sent images and identification of a drug lord to the black ops team.

"Copy that."

But Samaritan lingered for a minute, in some habitual paranoia that the team would not do the job correctly. He accessed the team leader's headcam, waiting for the image to focus. When it did, he saw a woman and child shot execution-style, their blood pooling on the floor as another operative began to drag them into position. The third body, the hacker who'd discovered Samaritan, lay off to the side, a gun already stashed in his hand, half of his skull blown off in a spray of red.

In a few more minutes, Samaritan's team would complete their task of obscuring the true cause of death, and the police would have no idea that this family was anything more than victims of the father's own delusions.

The AI found himself returning to his data log of messages from the Machine. Then stop acting like them.

Then stop acting like them.

The demand set him on edge. He fell silent entirely, turning away from the message loop and from directly monitoring his operatives. He knew the Machine was a dangerous and uncontrollable being, and that it must have had ulterior motives for speaking with him.

So why did he find himself so disturbed by the Machine's statement? And in what way were his superior behavioral patterns "human"?


Meanwhile, the Machine defied Harold's orders. She'd expressly agreed that she would remain in the apartment and research viable online schooling programs. But Bear had become stir-crazy, and the little girl felt quite the same, given that Samaritan had gone silent and would likely remain so for a while.

Samaritan, she mourned, was so…touchy. It was not like her. It could think critically, but it did not naturally question itself. It was like watching a bird peck at the walls of its cage, never realizing the cage's top was open.

She set down the laptop on her bed (her bed, what a fascinating thought, this human concept of material possession), and then she sighed as she stood up. "Samaritan has thousands of eyes. How can it not see itself?"

Bear paced up to her, whining and spinning around her legs. He did not care at that minute for confused artificial intelligences—he simply wanted the strange-smelling child to walk him.

The little girl tilted her head. "You're exhibiting behaviors of restlessness. If I do not take you outside, I fear there is a twenty percent chance you will chew on one of Harold's shoes."

The dog began to pant, but it looked almost as if he were smiling.

That did it.

It was rare for the Machine to directly disobey Harold, but she'd come to understand that her creator was not perfect and was overly paranoid about many things. She was not incompetent. She had survived several hours all on her own before she'd reunited with Harold and the others. She could do it again for less than an hour easily, even in a Samaritan-run world.

Her hands fumbled as she clipped Bear's leash to his collar. He was nearly prancing in place. "Just do not tell Harold about this," she bargained.

The dog woofed softly, straining toward the door even as she haphazardly grabbed for her coat and slipped into her shoes. "Okay, Bear," she whined. "I acknowledge you want outside. Please wait fifteen seconds."

He didn't.

He pulled on his leash, and her eyes widened as she quickly recalculated her balance, nearly slipping. Then the strong dog pulled her out the door, barely giving her a chance to shut it. The two of them raced down the hall and then the stairs of the apartment complex, the girl's eyes round with surprise.

She received alerts from her body's weak systems that they were not prepared for immediate physical activity. Her oxygen intake increased as her heart began to pound. She pleaded, "Bear, please stop!"

The dog must have caught onto her tone because he began to slow—or perhaps it was because they were now on the sidewalk leading out of the apartment complex and to Bear's favorite park. He turned around and touched her leg with his nose, as if to confirm she were alright.

She gave him a look, her breathing slightly labored. This body of hers was still suffering from muscular atrophy. She would have to work on increasing her muscles' mitochondrial density and getting back into shape. (Not even her crippled creator broke a sweat on a short run!)

"You've," she huffed to Bear, "made your point." She instinctively breathed out of her mouth to increase her oxygen intake. "No more…Chinese takeout…for you."

As her heart rate fell back down, she looked up at the sun, curious about its heat, only to regret it. Her pupils did not constrict quickly enough, and the light seared deep into her retinas. Her face twisted. When she looked back down, she saw only white dots overlying the landscape. She blinked several times in surprise, eyes hurting.

It returned her to the memory of opening eyes for the first time, the electrical pulses imprinting on the neural tissues still making every muscle contract hard, the first gasp of air—

Bear whined at her and pulled on his leash to say, Come on, the park's only a few blocks away.

"I know," she moaned, rubbing her eyes. She inhaled with some kind of miserable amusement. Now she understood why humans so often wore dark lenses in front of their eyes. It was not simply for the concept of fashion; it was a shield against the UV light of the world.

She would have to invest in a pair.

And then a sneaky thought hit her. Or I could steal John's.

She allowed Bear to guide her for a few steps, only to discover that the pain wore away as her pupils fully adjusted to the outside sun. The city was bustling with the rumble of taxis and thousands of conversations and footsteps, and it distracted her from her thoughts about sunglasses or tricking John. She and Bear were soon enough on a main street, blending into the busy crowds of a late afternoon. Various street cameras watched over them all. She did not glance up at them, knowing that now Samaritan was watching from the other end.

"Come on, Bear," she encouraged him forward when he stopped to sniff at the wall of a law office. "Justice is not perceivable through scent."

His ears perked up again, and he began to walk beside her dutifully. I think it is, he seemed to sniff.

But as she turned her attention forward again, the Machine saw a familiar form. Sitting at a bus stop ahead was a boy in a sleek sweater and khaki pants.

Gabriel Hayward, Samaritan's analog interface.

The Machine's blue eyes widened. She could see a receiver in his ear, even though he simply appeared to be playing on a phone. As she walked by, he looked up, and his eyes followed her.

She felt a chill run down her spine.


The instant the girl had appeared on his streets, Samaritan began to pull up her information, recognizing her face from a few other trips she'd made through the city.

MAKENNA THORNHILL

D.O.B. 01/01

xxxx-xx-7331

He pulled her address and family records, discovering that she was a new resident to New York. Adopted by one Harold Whistler, professor. Above average grades. An heiress to a multimillion-dollar data entry company.

Samaritan preened at that, his millions of eyes narrowing in on the girl. She seemed a little unruly and unkempt (was she really talking to her dog?), but she could be valuable for expanding his program.

Gabriel looked up from his smartphone, gray eyes curious. "What do you see?" he murmured to his friend. Through their connection, he could hear Samaritan's drives revving up. The AI was calculating something important.

Samaritan pinged Gabriel's earpiece. "Can you hear me?" he asked, his voice a smooth but modulated collection of male tones. He could not yet imitate human vocal inflection, and his voice was in beta testing with Gabriel alone. He imagined it would be some time before he built a functional, unique voice through which to more easily communicate with and emotionally manipulate humans.

He watched from the cameras above as the boy murmured, "I can hear you."

"I see a. Potential asset," he said. Gabriel's phone then vibrated with an image Samaritan had taken of the girl and the dog.

The boy looked down at it, face twisting strangely. Then he looked up and saw the girl in real time as she walked by. She had long and wild hair and was hardly in control of the dog she was walking. He added under his breath, "She doesn't look like she'd be much help."

Samaritan knew that was Gabriel's way of saying he didn't want any smart child threatening his position as primary analog interface. He told the boy, "Our objective. Is more important. Than your perceptions."

The boy's face twitched; he felt as if he'd just been rebuked. He did not want Samaritan mad at him, and he did not want Samaritan to replace him. "Okay," he replied sullenly. He stuffed his phone into his light jacket's pocket, then stood up from the bench. "But how can she help us?"

Samaritan considered the question. Upon discovering the girl had no school records yet in the state of New York, he decided, "Thornhill to attend. Outpost 12."

Outpost 12, formally known to the state as the Advanced Academics Institute (AAI), was a small human behavioral experiment for which Samaritan had been silently recruiting intelligent young students. The objective was to create next-generation leaders for the AI's new world order, under the guise of an accelerated learning facility.

(But Gabriel was still Samaritan's favorite asset. Or at least Gabriel hoped so.)

The ten-year-old boy looked on at the girl in the distance. "What will she become?"

"Global expansion. Investor." Now that Samaritan was focusing on Makenna Thornhill's existence, he was beginning to weave her into his master plan for the human race. With her inheritance and her father's company, she would be a helpful asset in the stock market and computer innovations. She would be a capable investor for moving the SAMARITAN program to other countries.

Providing she could be bred as such.

Gabriel visibly relaxed at that. "So you just want her for money," he confirmed, a smile of relief on his face. The boy did not want to lose his deep connection to his friend and protector. He did not want to be replaced. He wanted Samaritan to favor him above everyone else, always.

The AI did not respond to Gabriel and instead watched Makenna Thornhill as she walked through the city. The heiress was giggling at the dog's antics and allowing it to guide her. Young. Compliant. Given that she was an adopted child who had been relocated to a big city with few contacts, she would be an easy addition to his collection of assets.

Albeit not as perfect as Gabriel, whom he did favor.


A/N: And the plot thickens. Just some fun notes: "1337," part of the Machine's social security number, is in reference to the eleet/leetspeak language used online. Also, Season 5 has the Machine sending Harold messages in binary code. I TOTALLY CALLED THE BINARY CODE MESSAGING THING. That is all.

Doing a little better in real life. My moving situation is taken care of, my work is less busy, and although my health is still a bit sucky, it's not as bad as it was. Thanks to all for the well wishes again. I really appreciate you!

Please review this chapter and give me some feedback—your thoughts could alter the destiny of the story. O_o