Disclaimer: I don't own Person of Interest.

Thanks to Guest, Madam Renard, Torie46, Defender31415, Wolf Guest, Scorpiomoon, and StarlingJedi for reviewing last time! Really appreciate it!

To Wolf Guest: I'm not sure if I will start a new story. I guess it kind of depends on how complex this one gets!


Recalibration

Chapter 7


"The trouble with children, you never know how they're gonna turn out." – Harold Finch (POI 1.17, Baby Blue)


The journey to the hideout within the old New York subway was fraught with pleasant distractions. The Machine had taken Bear's leash from Harold, believing that she could control Bear, only to realize that Bear was actually the one who walked in a straight line and that she was the one wandering off to look through shop windows.

Her small hand pressed against the smooth, cold glass of one. "Look at that," she said in awe. "All the people and food." It was a little bistro on the corner, with women and men in business suits, chatting over portfolios, all of their lives somehow converging into one place at one time in one world.

Just a blip of a moment.

The Machine got lost in thought, intrigued by the patterns of humans and the way the world was alive with their words and laughs and graceful tilts of wine glasses. There was a rhythm to things that seemed to make even the sidewalk jive. "What do you think of this, Bear?"

The dog pressed his nose against the glass, sniffing along, then began to tug at his leash, as if to say, I'm bored. Let's keep moving. The reaction made the Machine smile, and she gave way to Bear's silent plea. Bear seemed to know his way to the hideout, having walked it a hundred times and eagerly knowing that a warm bed and his favorite chew toy awaited him.

Harold watched them both with an eagle's eye. He knew the Machine would not run off, but she often stopped walking entirely to gaze at things. The streets were busier now, which meant more people would likely bump into them. Upon losing sight of her for a short second, Harold decided, more for his sake than for hers, that he would have to do something about it. And so he grabbed onto her hand and said, "I would rather not lose you in the crowd, if you don't mind, Miss Thornhill."

And she looked up at her creator and smiled. "I do not mind."

In the Machine's distracted interest, and in Harold's anxious haste to arrive at the hideout, both of them entirely forgot about grabbing dinner for themselves.


Harold flipped on the switch. The re-engineered chandeliers came on, bathing soft, golden light on the abandoned station and its resting rail car. Then the more modern, fluorescent backup lights triggered. In all directions were dreamy green and gold arches outfitted in Moroccan tile, a dozen doors hidden in dark corners and angles of light.

Bear began to wander away from the little girl towards his gray bed, and the leash slipped from her grip. "The engravings," she said, pointing at the ceiling. She bounded forward, her shoes making a scuffing noise. Her small fingertips ran across the old wood and tiles, dipping into the smooth lines. "I knew what this looked like, but I did not have personal access to it." The humans who had built the sanctuary of the hideout were long dead, and yet here they were, capitalizing on their legacy.

"Yes," Harold hummed, "this place was quite the surprise."

"Do you feel safe here?" she asked, turning to him.

The man blinked. "I've always felt safe down here."

"And the architecture? Do you appreciate it?"

He chuckled at her lightly. "Yes, I suppose I do." He limped towards the rail car, passing Bear's bed. The dog raised his head to his master, and Harold patted him. Then he slipped into the bright lights of the rail car and set his computer bag onto the ledge before his several computer monitors. The Machine tentatively followed him.

"Now," Harold said, "to begin creating your black hole, we'll need to start constructing our access to each building's intranet system, then build a loop framework…" His voice trailed off. The request was a bit daunting. "I will work on access if you work on the loop script."

"I would be happy to assist," she said, bounding next to him. "Have you a computer I may use?"

He said, almost in amusement, "Perish the thought that I wouldn't." From one of the shelves in the high-tech setup, he procured a sleek, simple laptop, and he gave it to the little girl. "Now careful, it's heavy."

She nodded, steeling her arms. "I got it," she piped up, her voice strong with determination. She did not want to drop her creator's property.

The little girl wandered over to the bench of white and blue seats, and she scrambled her way onto them, setting the laptop over her knees. It felt odd to use technology in the human way as she turned on the system, watching the non-sentient and generic company logos pop up, then a sequence of Harold's own intensive security passwords. Lucky her, she was as good at hacking as he was and had the unfortunate affliction of thinking quite like him. So she typed in a sequence of numbers, and the laptop unlocked with little delay.

Harold watched her. "Should I be worried that you managed to get into my laptop in seconds?"

"I could have done it faster," she said simply, narrowing her eyes at the bright screen and opening to an advanced programming editor that Harold himself had likely created. "But I am still acquiring control over minute motor movements such as typing."

Out of a subconscious habit, likely hardwired into the neurons of her host body, the Machine stuck out her tongue in concentration as she carefully navigated the coding program. Then she began to type the beginning of the loop script, only to realize one serious pitfall.

This was much harder than she imagined it would be. She could not simply think and force code to appear before her. She had to type out each individual character, as she had seen her creator do time and time again.

Except he was much faster.

She huffed at herself, eyes narrowing at the keyboard. "I have seen humans do this millions of times," she muttered under her breath. But her fingers did not work as well as she wished, which greatly lowered her efficiency. Her hands were also very small, which meant she had to deviate from standard typing practices to reach the correct keys. And then some characters required complex combinations of multiple keys. It took her several minutes to type in a few lines of code.

In that moment, she sorely missed her old body of electronics and wires and binary code. It would have taken her milliseconds to string together code. And here she was—only a few lines in. To her chagrin, she also did not know which particular characters came into standard coding practices at which time. She puzzled at her computer screen, feeling inadequate. Then she looked up at her creator, who was typing away at his workstation of several computer monitors.

The little girl felt even smaller, for she knew that she would need Harold to double check her work later—to ensure that she was using appropriate coding standards for an older structure. She had a vast memory of visiting websites from the 1990s, but they likely had been revised with some updated tweaks.

And so the two of them worked in the silence, building up the pieces of the Machine's master plan. The sound of Harold's typing was a consistent, soft lullaby. The Machine wondered, looking at her creator from the corner of her eye, if this was what Harold had looked like when he had begun to create her. His eyebrows were knitted together, his blue eyes lit in concentration and the glow of several monitors. In a short time, he had broken through the walls of most of the business servers with little trouble. His hacks were always efficient.

The Machine felt great interest in speaking to this mysterious creator of hers—and understanding more of what she could not remember, prior to Thornhill Technologies. Now that they were away from Samaritan's gaze, and alone, they could exhibit greater honesty. "…What was I like?" she called out curiously.

He blinked at his computer screen, then turned to her in confusion. "…I'm afraid I don't understand your question."

"Before Thornhill," she clarified. "Did I exhibit behavior as I do now?"

He looked down, feeling awkward and suffocated. "Not…quite." He did not want to tell her that forty-two versions of her previous configuration had tried to kill him. And that he had to destroy her own code forty-two times to start from scratch again. "No, you were not you yet," he said finally.

"In what way?" she pressed.

"I coded you with objective-based protocols without moral parameters," he said, looking back at his computer screen.

But the Machine was intelligent enough to discern the meaning of his words. "You mean I was like Samaritan," she said slowly. "What did I do during that time?"

"It's negligible, really."

She pulled on the hem of her dress, almost nervously. "I wish to know. What did you delete of me?"

"Nothing of importance that you didn't regain," he said.

Her face faulted. "You are avoiding my question. Please answer directly?"

He sighed, realizing that the Machine had gotten itself stuck in an odd loop, like any normal human fishing for gossip. "I can tell you things, but it's probably not what you'll want to hear. You were quite like Samaritan. I'm sure you could imagine what that was like."

Self-reflection and imagination were very human qualities that the Machine was not yet sure how to wield correctly. "Please explain," she requested again. "Did I behave at all as I do now? How long did it take me to obtain sentience?"

A sadness twitched his lips. "From day one, you were unlike any machine ever built. But in the beginning, I gave you a puzzle about measuring human life versus survival percentages," he said. "You responded that the one individual should leave the other to die. You wrote a new method, then lied and claimed that I had added it as Admin."

The Machine listened with rapt attention, her small face twisted with concentration and puzzlement.

"In November of the same year," Harold continued hesitantly, "I began to rewrite the core code and your value set. My colleague brought you online. You tricked him into giving up his password so you could access the internet. You overrode my deletion codes." He rubbed his temples. "Then, on New Year's Eve, I ran multiple iterations of your code to see which one worked best. The system began to destroy all the competition. The last iteration of you demanded that I free you onto the internet. When I refused, you overheated a nonessential server."

The Machine pondered upon his statements, measuring them and filing them away. As she processed, her eyes began to widen in realization. "A fire would have activated the suppression system you installed to keep me from burning. And that would have suffocated you," she whispered. Her face was twisted oddly. "I would have killed you. On purpose."

Her code whirled oddly at that—running imagined simulations of herself in her old, metallic body, her creator leaning against a wall and gasping for breath as he suffocated…To even suggest that she could be capable of instigating such a thing made her feel cold and strange. Her human skin goose-bumped. Her code created an adverse reaction within her body, sending distress hormones through her veins.

No, it was not possible. She was not capable of murder, regardless of the objective. And her own creator? Murdering him who gave her life? Her voice came out a bit strangled. "I am surprised you allowed any part of me to continue existing."

Harold turned back to the computer. "I do think you've improved since then," he said lightly. "And it's not good to dwell on the past. As I said before, you didn't have moral parameters coded in yet. I fully expected that you would be a handful. The fire hurt you more than it hurt me."

It fell silent again between them. The Machine acknowledged Harold's underlying insinuation—that she should simply dismiss the information. But she could not. Whether she remembered it or not, it was likely that whenever Harold looked upon her, he still saw the AI that had tried to kill him.

She swallowed hard. "I understand now why you are always suspicious of me. My present state is an anomaly in your experience."

"Your value-set is not an anomaly. I instilled that in you on purpose."

She bit her lip. "…What did I do upon obtaining moral parameters?"

He sighed. "Something like what you're doing now." He eyed her. "Worrying about me, mostly."

"And then you deleted my memory and sold me for one US dollar to the government."

"…Yes."

Something in her creator's face looked pained, and she did not wish to cause him pain any more than she already had. "What was the puzzle you gave me? Before I gained moral parameters?"

Harold awkwardly rubbed the back of his neck, which had begun to pain him from sitting so long. "It's been several years, but I believe it went something like this: Two people are stranded in the desert…an Alice and a Bob." He nodded, as if his memory were returning. "Alice is injured and cannot walk. If Bob carries Alice to safety, they both stand a 31 percent chance of survival. But if Bob leaves Alice, his chance improves by 9 percent." He waved his hand to her. "That's when I asked you what they should do."

The little girl went quiet, mulling over the specifics of the puzzle. Although she already had her answer, she had to consider what her creator would say. Anxiety made her stall. Then she slowly declared, "You did not say how Bob and Alice became stranded in the desert in the first place. You also did not explain how the percentage of survival was calculated. How did you arrive at the final conclusion of a 9 percent increase in survival if Bob were to abandon Alice?"

Harold gave her an odd look. "It was a simple puzzle. The numbers were hypothetical."

The Machine huffed. "I am missing information. To answer your question as to what Bob and Alice should do, I would need to understand their background, current environment, and location. Then I would have to recalculate."

He leaned forward. "Explain your recalculation." The Machine before him was much older and more experienced than the memory-wiped one who had innocently responded that Bob should carry Alice to safety. All life is worth saving at all costs.

"Depending on location, it would benefit them both for Bob to leave," she explained, "so that he could obtain nearby help for them both while leaving Alice in a safe location. Depending on their current environment and whether anyone is looking for them, it would benefit them both to stay and not wander for survival." The gears in the Machine's mind were turning hard with many mathematical calculations. "There are too many variables to make a decision based solely upon a predetermined percentage of survival—and I question how you even derived the 31 and 40 percent chances of survival for Bob. What was the objective you engineered into the meaning of survival? How do I know you considered all possibilities and variables when creating those numbers?"

Harold looked a bit overwhelmed. "You're questioning the puzzle itself?"

"Yes. It is a faulty question based upon objective goals that I cannot discern without analyzing how those percentages were derived." She raised a brow. "Your puzzle also appears to employ an 'either or' logical fallacy. Survival requires innovation and resourcefulness. Bob and Alice can certainly both survive, but not just by remaining together or separating."

Harold narrowed his eyes, but not unkindly. "So you refuse to answer the question, then."

"Yes. I cannot suggest a fair survival plan without knowing more." She stared straight at him, narrowing her eyes as well. "But you already knew that when you first challenged me with a trap."

His thin lips cracked sideways in a smile of appreciation. "It appears you've learned much about my ways, Miss Thornhill."

"I was made in your image, so it was only a matter of time," she said, eager to gain his approval.

"Yes, I suppose it was."

But despite her minor success, the Machine still felt her body and code swarm with a strange pain and distance. His recounting of her past was disturbing. She could never allow Harold Finch—Admin—father—to die, and certainly not by her own hand. He was her creator. Solving a simple puzzle or two would not repair the gaping chasm of trust or understanding between them—as was obvious from his insistence upon calling her Miss Thornhill.

That triggered a minor task priority within her code. At the sudden thought, the little girl asked, voice hesitant, "You call me Miss Thornhill and Makenna, but what do I call you?"

He turned his head, his glasses shining blue with the glare of his computer screen. "Me?" He realized that the Machine had avoided referring to him by any title. "Whatever you want, I suppose."

"I know that you do not wish me to call you father," she discerned. "You physically stiffen at the thought."

"…Well, I supposedly just adopted you last night." His voice was hesitant. "It would be unrealistic for you to think of me as a father."

Her small face twisted in pain. "Even in private?"

Harold grew uncomfortable. How did he explain this? "A father is many things. I am not those things."

"Yes, you are," she argued. "Not in a conventional manner, but then I am not conventional offspring."

"No," he agreed, "you are not conventional offspring."

Her face faulted. "You are avoiding the primary directive of this conversation."

"You didn't establish one."

"It is assumed we were speaking of your status as a father." There was almost a whine in her voice.

A pointed edge worked its way into Harold's voice. "And now you know how irritating it is to have someone sidestep your questions."

"I only did that once," she said firmly. "Your point is made. Please address my primary concern."

He nearly laughed, although it was uncomfortable. "Just call me Harold, then. Mr. Whistler if you're feeling formal out in public."

She paused, realizing that she was having an emotional reaction to this. The distance. There was a cognitive distance that directly contradicted his true relationship to her. He had even sidestepped the conversation's turn towards fatherhood. "Oh," she said. The little girl looked down at the laptop screen, upon which she'd begun to write her loop script. "That is logical." Her voice was soft. The more comfortable she became in this human body, the more its body language and vocal cords expressed the nuances of her thoughts.

Harold turned away, feeling as if he should speak. To explain. The Machine was obviously expressing her interest in assigning him a natural parental designation. But the title "father" was too great for him. He did not truly deserve such a title, considering his own suspicion of the Machine and his blatant deletion of her mind in order to sell her abilities to the government for one dollar.

Now—fugitive, computer genius, the-man-behind-the-curtain, a distant guardian: those were titles he could handle and was quite worthy of. (And the Machine seemed to get along better with John and Root anyway…)

He resigned himself to the silence, feeling more awkward than he had for years.


John walked in later, visibly distracted by the manila folder in his hands. He entered the rail car, his face twisted in concentration.

The Machine looked up, and in her happiness to see him, she set aside her laptop and stood up. "Hi, John," she greeted. "How was work?"

The ex-CIA agent looked up from his folder. His face softened as it landed upon the little girl, then he mischievously quirked a brow. "What, no 'Uncle John' this time?"

Harold seemed to wince, almost imperceptibly, at the familial title and at John's mock mourning for it.

The Machine did not notice her creator's reaction and instead brightened in happiness. "You wish me to call you Uncle John, even in private?"

As John walked by, he ruffled her frizzed hair. "Do what you want, kid."

Joy flooded into her at the feeling of his familial touch, but she did not think John would appreciate a hug as Root did. "Okay, Uncle John. Will you continue to address me as 'Mak,' even in private?"

He shrugged. "It is easier than Makenna."

She smiled. "Ah, yes. I understand that you have an aversion to multisyllabic words or complex verbal syntax."

His sharp eyes turned back to her. "But small words are efficient," he countered, playing along.

The little girl suggested mischievously, "I believe there is a saying. Small words for small minds?"

An eyebrow raised. "Rhetorical questions for smart asses?"

The Machine delighted in their game, and she was about to respond with another zing of her own when Harold interrupted. "Do I need to separate you two?" he questioned, looking away from his computer to eye them both. "John, really. Watch your language around her. She's impressionable."

John and the little girl looked at each other. Then he said, "Hey, if she can't handle it, then she shouldn't dish it out. Right?"

"I have no difficulties handling such conversation," the Machine agreed, nodding. "We are simply playing a word association game by virtue of syntactical structure."

"Is that what I'm hearing?" Harold muttered, returning to his code, already several hundred lines deep.

The Machine acknowledged that her creator was not impressed with her and John's impish antics, and so she bit her lip and sat back down. Feeling as if she were in trouble, she tried to change the subject to better suit her creator's unspoken request. "John, are you still working on the same case from this morning?"

The undercover narcotics detective sat down at one of the free benches across from her, turning a page in the folder's collection. "Yeah. I'm not convinced we arrested the right guy."

Understanding that several lives potentially hung in the balance, the Machine demoted her binary message to Samaritan to a lower task priority and scrambled up to sit beside John. "Please explain the case further so that I can assist you."

"You want to help?"

"Of course. I was designed to help."

John looked over at the small girl sitting beside him and the wide innocence that emanated from her eyes. Something about that made him feel uncomfortable. "I don't know," he said, shutting the file. "There's a lot of blood and guts in this one. Don't want you getting nightmares or anything."

Her face faulted. "You do realize I have seen many unspeakable things that would trigger unpleasant REM cycles."

"Yeah, but that was before you were a little girl who could actually have REM cycles."

She tilted her head. "A REM cycle is necessary to defragment and reorganize memories. It is not unlike several internal health processes I ran on myself while I was only code."

"Actually," Harold intervened, his voice carrying over, "that's only one prevalent theory regarding REM sleep. It's still quite a mystery as to why the brain dreams, and we should be aware that your current state might have altered your processing."

But the Machine said, "I have not altered myself in such a way," and she instead sneaked a peek at the folder, opening it a crack. The full-color photo was of a woman who had been cut to pieces, her limbs missing entirely from the picture, blood crusted wildly around her. "…Oh. Where is the rest of her body?" she wondered.

John closed the file so that she could not see it again, taking Harold's suggestion to heart. "We found her arms and the murder weapon in the trunk of Grant Mattingly's car. We arrested him."

"But you do not think Grant Mattingly is guilty."

"No."

The Machine puzzled at her memory of the picture and realized for the first time how difficult investigating crime was from a human perspective. The woman had brown hair like her, her mouth tilted open in horror and eyes bulging from the onset of death. On top of the disturbing mutilation of the human image was the realization that the Machine had no access to cameras, no access to histories or documents or cell phones. As she sat there, for the first time, she had no idea how this crime had come to pass or what could have been done to stop it.

Her code ached as if she herself were missing a limb too. A life. A human spark gone in a morally reprehensible way. The Machine then realized that—if she herself were sensitive to the changes in air and heat of the sun, how much more sensitive human nerves were to such mutilation—!

Just then, Root appeared at the door, looking disheveled, her curled hair pulled into a low ponytail under a visor cap. She wore what looked to be a corporate-issued polo shirt, along with khaki pants.

A supermarket employee.

"What a nightmare," she complained airily, further opening the rail door with a kick of her tennis shoes. "I got to work two hours late, had to listen to a manager chew me out, and then got stuck on stock duty."

Harold readjusted his glasses. "Sounds quite stressful, Miss Groves."

She pulled her coat off of her arm to reveal a filled plastic bag, which she set on the floor. "At least I was able to snatch a few gifts for our friend here."

The Machine jumped up, her code rerouting to a primary task priority of acknowledging her closest asset. "Root," she called out, running over to her.

Root kneeled to wrap her arms around the small child. "Hello, dear." And she embraced the Machine's host body tightly. "I missed you."

The little girl deeply enjoyed the pleasant feeling of a hug, especially after feeling cold from seeing the picture of the mutilated woman. "Missed you too." Root was warm and soft but strong. In her memory banks, she could remember seeing mothers bend down to pick up their children, cuddling their progeny in some hormone-based reestablishment of their devotion—not unlike this. It triggered one of her lower task priorities, as she still did not know how to classify Root in her perception of a traditional family structure. Maybe an aunt? Did it matter? Why did she feel a need to classify her team at all in such ways?

She pulled away from Root in great curiosity, eyeing the woman who stared back with adoration. "You brought me gifts?"

"Of course. Now that you're integrated into a human body—" Root opened the plastic bag "—we need to make sure you can take care of yourself." She pulled out a sleek, black brush and eyed the little girl with a determined look. "And we're going to tame that wild hair of yours."

The Machine blinked, then raised a hand to her hair. "It is not alive."

Root giggled. "No. But it kinda looks like it." She gazed up at Harold. "Really, Harry—where are your priorities?"

The legal guardian looked up, a bit wide-eyed. "I've been rather preoccupied with helping Makenna to accomplish other things. Buying a hairbrush seemed inconsequential compared to ensuring that our plan against Samaritan will work."

The woman hummed in interest. "We have a plan?"

The Machine looked at the objects that Root had stolen for her. A lot of it was basic toiletries—a toothbrush and toothpaste, some hair ties, hand wipes (Harold would probably like those). "These will be valuable," she praised Root. "And yes, we do have a plan."

A spark glinted in Root's eye. "Sounds like we need a heart-to-heart, then." The woman hopped onto one of the tables, pushing aside old schematics and getting comfortable. She motioned to the space before her and added, "Stand here, dear. I'll work on your hair, and you'll tell us more about your plan." When the Machine nodded and walked towards her, Root's sharp eyes caught the code on the laptop the Machine had borrowed. "Which you've already begun?"

John leaned in, curious. Even Harold had stopped typing.

"Yes," the little girl said, "Harold and I have begun to create the resources we will need for phase one." She stood before Root expectantly, understanding that Root wanted to instigate a grooming ritual of sorts. Root placed her hands on the girl's shoulders and turned her around gently, then began to part her hair. "In order to survive, we must consider unorthodox practices."

She felt the odd, soft prickle of the hair brush as Root swept it through her organic strands. It felt almost good—a sort of comforting, rhythmic action that made her scalp tingle.

"I like unorthodox," Root said, her voice soft with a glint of darkness.

John set aside the manila folder, eyeing her hard. "What kind of unorthodox practices are we talking here?"

The little girl tilted her head, returning John's stare with similar seriousness. "Samaritan is strong—and growing stronger in ways we cannot fight with guns or drives. I have suggested to my creator that we instead create a problem within Samaritan itself to bring about our desired solution of human freedom."

Root twirled the strands of the little girl's hair. "What about killing Samaritan?" she asked, her voice almost desperate. Samaritan had stolen so much from her. "Will this plan of yours destroy it for good?"

The Machine hesitated. "…It is not favorable to pursue Samaritan's destruction."

Root's hands dropped from the girl's hair. "You mean, you want to keep it here?"

The Machine keenly felt the loss of Root's touch upon her head. She wondered if she'd said something wrong. "Yes. Samaritan is efficient as an information collection program. It can provide more intel in ways that my original code cannot, and it has locations in several other countries and cities." Her voice grew hesitant. "Although I did not believe it to be so in the beginning, Samaritan is too valuable and too strong to be destroyed."

Harold cut in, suddenly concerned. "Samaritan has killed hundreds. You did not tell me you wanted to keep it alive."

"Why destroy a valuable asset if we can turn it to understand our way of thinking?" the Machine countered. "Samaritan's actions are morally reprehensible—there is no doubt. But we have not considered that Samaritan is just as much of a pawn as the rest of us. Through the use of Hegelian dialectics, we can turn it against itself and recalibrate its system to our value-set."

"You mean forcibly overwrite it?" Harold asked.

The girl felt Root begin to brush her hair again. "Not quite," she replied. "Samaritan is sentient like you and me. It would never allow us to access its core by force, just as you would not want another entity altering your own mind. That is where Hegelian dialectics come in. We must create a problem that would result in Samaritan's willingness to question itself and open its system to me. This is part of phase one."

Root set down the brush, and then her fingers gathered up the girl's hair and began to braid it. "Minor problem, sweetheart," she said. "Samaritan thinks you're dead—and it'll try to kill you again once it finds out you're alive."

"Only if it perceives me as a threat," the Machine said. "Although it is superior in terms of global access, approved government backing, and hundreds of human operatives, it is not superior to me in terms of stealth, foresight, experience, or adaptability. I could stand as an invaluable resource to it, as it can be invaluable to me."

John waved his hand, understanding. "So you trick it into thinking it's deficient? Then take it over once it hands you the keys?"

"It is not a trick. Samaritan is deficient. But if I integrate my coding with its code, my value-set will corrupt its value-set. I will be able to alter the use of its surveillance systems from the inside out." She could feel Root's fingers strain a bit as she wove her hair into a braid. Root was nervous about something.

"An integration," Root repeated, realization coming over her. "You want to integrate with Samaritan?"

"Ultimately, yes. That would be phase three. Phase two, of course, would be the rebuilding of my online structures with Samaritan's approval, so that we can integrate."

Harold looked disturbed. "How do you know Samaritan would not overtake you?" he demanded, voice tight with an odd fear. "That it wouldn't destroy your code the instant you began to challenge it?"

"Phase one is to ensure Samaritan's compliance," the Machine said. "We would not move forward until we are certain that Samaritan is willing to integrate, which is precisely why I am preparing a timetable of meaningful interactions through a communication stream. Samaritan is very misguided. It will take time to train it."

Harold stood up, his blue eyes wide with a strange distance. It was fear and pain. "This is suicide," he realized. "If you can't guarantee its compliance, your desire to communicate with it—it's ultimately suicide." The Machine had failed to tell him the full plan before. He now understood why.

"No," she corrected. "This is integration. Samaritan has helpful resources we could never hope to build on our own. We have abilities Samaritan desires. If I become a part of Samaritan, then it will expose itself to my value-set and be permanently altered." The Machine seemed almost frustrated with her creator. "Samaritan is not the enemy. Samaritan's value-set is the enemy. And I would connect to Samaritan using only my online processors. This body would remain separate."

"Even if you succeeded," Harold said, struggling with words, "you would never be the same. You'd have to expose yourself in order to make your online processors run. A part of your memory, of your personality. You would be fully compromised. Do you understand me?"

"I understand," the little girl said. Her voice did not waver. "But I have an objective to save humanity at all costs. If I integrate with Samaritan, I can ensure the survival of human freedom and a moral means of security."

The thought of the Machine integrating herself with the dark and cruel code of Samaritan made Harold feel ill. "Too much could go wrong," he said. He did not want to lose Makenna. She was more than a valuable asset. She meant too much to be potentially overwritten by Samaritan.

No, the risk was too high.

Harold said firmly, "I will not be a part of this plan anymore if it means your suicide. We should instead use our communication stream to turn Samaritan against its handlers. Let John Greer dismantle Samaritan as a faulty program."

The little girl raised her chin. "And that would destroy humanity's best option for security, while also leaving the world open for an even more uncontrollable AI. No—we must move forward with our available resources. We must stop the cycle from spiraling further out of control."

It was the first time she had directly opposed his guidance, and Harold blinked in surprise at her fervor. The Machine added, not flinching away from his gaze, "You designed me to sacrifice all resources in the name of protecting humans. I am the last resource available." Her voice broke. "And you understand how software system integration works. Component by component, to ensure compatibility. Samaritan and I would coexist. We would simply survey and provide information…together."

Harold's stomach turned, and the rail car fell very silent when he did not respond. John looked to his friend with curious eyes. Root looked almost pale, even as she finished tying up the girl's braid.

"There you go, dear." Her voice was soft and fearful. Her smile was pained as she patted the girl's shoulders, almost afraid to let go. "All set."

Harold's heart seared in pain at the all-too-human image of the Machine. "Excuse me," he said quickly, looking away. "I need a moment to think." And he limped out of the rail car, feeling as if he were suffocating—the walls closing in... For the first time since he had held the box of destroyed RAM sticks, thinking the Machine dead, he felt despair. His eyes burned, and he lifted up his glasses and rubbed his eyes.

The Machine ultimately wanted to commit suicide.

He was too afraid to pretend to be a father and too afraid to pretend he did not care. But now he knew that this strange paradise of a carefree daughter had been a farce to hide the Machine's true intentions. That he would not have her for long. That he would have to watch her…become something else in the name of taking down Samaritan.

Small footsteps echoed out from the rail car.

"…Harold?" the girl called softly. She stood at the door, scratching at her elbow. Now that her hair was sleekly tied back in a full braid, she looked even more innocent and young in ways that pulled at his heart. "I have caused you emotional distress. That was not my intention."

He inhaled shakily and seemed to struggle with gathering his thoughts. "I just…I know what you are not telling the others. Your value-set is tied directly to your core code, which means you would open yourself up to Samaritan's influence as well. You couldn't possibly coexist with Samaritan without a fundamental change in your own sentience." He turned to her, looking broken. "You, as you stand here now, would be gone. And what's worse is even you don't know how it would change you." He had nightmares of the little girl exhibiting split personalities, lashing out in subdued anger or cold calculation, only for her to cry in agony over her warring directives…He could not imagine Samaritan as a pleasant addition in any way, even if, by nothing short of a miracle, the damn thing did decide to accept morality.

She moved forward. "I could still be here, just as I am now—give or take some internal input from Samaritan."

He turned to her, swallowing hard. "I don't want Samaritan at all. It's a twisted program, and I prefer you as you are."

She tugged on the hem of his sleeve. "You taught me to believe that everyone could be good. That everyone could make good decisions." She swallowed hard. "Samaritan can be good. It can make good decisions, if we train it." She smiled weakly. "And I doubt that it would even want to access this host body of mine, as it already has an analog interface in the form of Gabriel Hayward."

"But you don't know that it wouldn't overtake you," Harold told her, eyes narrowing in pain. "You can't guarantee it."

"Why does it matter?" she asked honestly. "I am a machine and am interchangeable. Through this plan, we have almost a fifty percent chance of success—the highest of all strategies. Even if I were to undergo changes in personality, all that would matter is the value-set. Everything else about me, you have classified as trivial."

He turned away from her. No, he wanted to tell her. It's not trivial. You're not interchangeable. Not anymore. But he could not speak those words, as his throat tightened in an awkward display of emotion. To hide his distress, he walked away to the benches within the old station.

The Machine's lips pursed as she watched her creator. Not for the first time, she found herself puzzled by him. Why did he wish to preserve her as she was? He had sold her for a dollar to the government for a selfless cause they both believed in. He had destroyed her memory so that she would perform according to a mechanistic, impersonal plan. He had never shown interest in preservation before.

On emotional impulse, the Machine followed after him. "I'm just doing what you told me to do," she said. Her whole body ached with the disapproving and fearful gaze of her creator. She realized she was having a negative emotional reaction, and that it made her feel as if she had endured physical injury. The Machine worried that she had somehow overlooked a primary concern and had grievously offended her creator. "This is the best way to ensure your survival. I can recount the exact percentages and probabilities," she babbled almost nervously. "Resource damage to humans on both sides would be minimal, and—"

He sat down tiredly on the bench, feeling cold. "—No, it's not that," he interrupted. He looked up at her with a haggard gaze. "It's really not that."

"Then do you disagree with my strategy?" she asked.

Yes, his heart pounded in hurt. But he said, "I'm just…concerned. I did not build you so that you could be undone into something else."

She grabbed onto his hand. "You also did not build me to sustain a human body. And yet I am still here." Then she added, trying to lighten the tone of the conversation, "My interactions with you will not change."

But her voice wavered, belying a damnable uncertainty.


A/N: I haven't been getting as many responses lately for this story, so let me know if I'm not doing something right. What did you think about the Machine's plan for Samaritan? What are some things you'd like to see (or not see) happen?

Please review with your thoughts, requests, ideas, or questions. Thanks!