Disclaimer: I don't own Person of Interest.
Thanks to Furionknight, MammonDaughter, Defender31415, Madame Renard, Lina431, StarlingJedi, kenorob1, klinde6364, Bookwyrm52, PapayaK, Guest, wolfriver777, Bloody Phantom, Hieda no Akyuu, immo, and Guest for reviewing last time! You are all so wonderful, and I'm terribly sorry that you had to wait this long for another chapter.
In good news: Madame Renard made some pretty awesome fanart and a playlist for this story! Please go to my profile to get the URLs and support Madame Renard's art! Thank you again, Madame Renard, for thinking of Recalibration!
Recalibration
Chapter 9
"You say the weight of the world has kept you from letting go/ and you think compassion's a flaw/ and you'll never let it show/ and you're sure you've hurt in a way that no one will ever know." - Linkin Park, Robot Boy
The next morning, Harold appeared at the threshold to the Machine's bedroom. He peeked in apprehensively, then he started in surprise. On the bed, wrapped around the still-sleeping little girl, was Bear. The dog lazily opened his black eyes and did not even bother to lift his head. He looked excessively comfortable and smug, as if he knew he had defied Harold's mandate to not lie on beds or couches.
Harold gave the dog a disapproving look and then allowed himself to focus on the girl. His heart softened at the sight of her peaceful sleep. Her small face was slack, the blankets all twisted around her, one pale arm still tightly clutching Bear. Her long hair had frizzed and tangled, reminding him of their first meeting. And for a time, he was in awe again as he watched her sleep—not only simulating inexplicable human processes, but performing them herself.
This sneaky and wild AI was surely more miracle than science, he thought as he pondered her.
In truth, he'd hardly slept at all the night before. After he'd calmed her down and returned to bed, he had lain there in the silence. He had listened for her, fearing that her nightmare had somehow corrupted her data—or that she would still remain on a self-destructive loop. He'd worried at the very thought of hearing her cry again, which had been an odd sound to him. Her child voice was not meant to be so riddled with pain.
Once the night's minutes ticked by into an hour with no further incident, Harold had then begun to worry about the Machine's confession that she had not calculated her own probability of death. He supposed it was a morbid thought for a child to contemplate. But then he did not know to what extent this AI of his was truly a child—and her decision to avoid calculation entirely was counterintuitive. A cognitive effort to avoid intelligence and data.
What did it mean?
Harold sighed, moving away while he ran a hand through his spiked hair. He felt anxious about this future conversation of theirs. He feared she would approach him, and with those blue eyes of hers, solemnly state that she was going to die. He did not know what he would possibly say to that. No, his instincts said indignantly. Absolutely not. She was his greatest creation. No matter how deep their misunderstandings, he wanted her to exist.
To distract himself, he quietly began to work on a breakfast for them both. It'd been years since he'd cooked for anyone beyond himself. Since Grace. It was almost an alien tradition—one that brought up a pang in his heart with every action. But he knew that he could not allow such associations to dilute the present.
Eventually, the sound of cooking and the smell of breakfast woke up the little girl. She stumbled out of her room, looking tired and sleepy. Her pink nightgown clung to her legs in odd ways, and her thin eyebrows furrowed at the static-electric feeling of it. "Friction from the bed sheets?" she murmured, almost in surprise at the feeling of the static. Then she yawned—her code controlling the subconscious sector of the brain was saying she needed more oxygen to wake up.
"Good morning," Harold greeted, flipping a switch on the stove and lifting up a pan. Eggs slipped from the pan onto two plates. "You're just in time."
She stood before her creator, feeling tongue-tied at his presence.
Harold was pristine and professorial. He looked entirely awake, his hair all smartly spiked up, and when he passed by, he smelled faintly of spearmint. She looked down at herself and her bare feet, and her face reddened. She appeared much out of place compared to him. "Good morning," she greeted softly. "My body overrode my internal alarm, and I did not wake up at the expected time."
Harold gave her a soft, amused smile as he set the two plates down onto the kitchen table. "That happens on occasion," he said. "Now please sit with me."
"Yes," she accepted his request and sat at the table, struggling a bit to situate herself correctly. Her small feet swung in the air, which was a pleasant feeling on her skin. "Did you sleep well?"
He carefully evaded her question as he sat down opposite her. "I think the real question is, did you?"
The little girl stopped for a second, analyzing her memory. "I have no REM recollections," she said, "although my logs suggest I completed five REM cycles without interruption."
Harold nodded. "That's good. I was worried about you."
"Were you?" she asked, almost before she could think about it. It burned her with an odd embarrassment. Confirmation already logged. Delete redundant confirmation.
"Yes," her creator admitted freely. "I still am worried for you." He lifted a fork and began to eat the eggs on his plate, though his face remained pensive. "For your future."
She mimicked his actions, holding onto the fork with a little more grace than she did the previous evening while eating pasta with Root. When she bit into the eggs, they were light and chewy with a peppery taste. She found she quite liked the taste, for it was fulfilling and sat well in her stomach.
Harold tried not to stare too much, still wrapping his mind around the concept that the little girl before him was truly his AI. "Have you thought anything more about your plan?"
She looked down at the eggs, picking at them as her mind raced. "I ran the analytics of my death," she said, voice distant. "In the fifty-percent chance of failure, I have a twenty-three percent chance of fatality."
He paused mid-bite, and then he lowered his fork. "That's a high number." His lips pursed with his typical tendency to calculate and hide his emotions. He was waiting for more information.
The Machine felt the silence between them. Although she had in no way heightened her electrical output, she could almost feel a tangible tension—a friction between them. The feeling heightened the activity in her amygdala. Her creator was judging her.
He was displeased.
She imperceptibly gripped harder onto her fork to balance herself in the real world. "It is a number that ensures the safety of you and our team," she said with a halted tone.
"It's too high," he replied, eyes narrowing in concern. "Twenty-three percent is far too high."
Suddenly, she was not hungry at all. "Some of it is for accidents," she said, voice growing small. "Without access to surveillance feeds, I cannot anticipate beyond what I know. I could also accidentally injure myself in a multitude of ways, now that I am in a human body. These things do not change regardless of our plans moving forward."
He supposed he could accept that, although he still appeared worried. "But what about the rest of that percentage?"
She bit her lip. "It depends on Samaritan's reaction. Twenty-three percent is assuming the worst outcome possible."
"And what exactly will Samaritan be reacting to? You never clarified beyond you sending communications about a deficiency within its program."
The little girl gave a weak smile. "With phase one, I had planned to reveal the fallacies in Samaritan's logic—thus creating uncertainty in its own moral parameters—and then offer my skills in exchange for its compliance with more logical parameters. But Samaritan would not appreciate an immediate analysis. Phase one could take a long time. A year, perhaps. It is where the vast majority of potential death probabilities lie."
Harold mulled over the information. "Samaritan is an arrogant program," he said slowly. "It would certainly bite at being told it's illogical. But how do you mitigate the twenty-three percent?"
She looked incredibly hesitant. "I am still running simulations. I do not think there is a path that would fully eliminate all probability of harm."
Her creator bit his lip. "And a decrease in the probability?"
"Harold," she said, almost whining, "I do not want to raise the probability of your death. You are not interchangeable."
"And neither are you," he said, voice softening with pain. It almost burned his throat to admit it again. "Don't forget that."
The Machine fell silent, accepting the command. Her code cradled a copy—a memory of her creator's voice—and filed it deep. The command contradicted her hardwired understanding that she was interchangeable. She wanted to believe she was not interchangeable. But perhaps Harold thought her not interchangeable just because she was in a human body. Surely, that was it.
"I am fond of a particular simulation," she said. An odd expression crossed her small face. Her blue eyes narrowed with a deep distance. "As you know, I am in pursuit of ultimate integration. I have wondered…what would happen if Samaritan were to be uploaded into a human body. Like me."
Harold flinched, nearly dropping his fork. "Pardon?"
The Machine nervously noted his reaction. "We are thinking of Samaritan as a program. But if we were to convince it to upload into a human body, it would naturally expose itself to a wider sensory experience. It would come to understand human behavior. That would lower my death probability by six percent without affecting anyone else's odds."
"By putting it in another brain-dead human being?" Harold repeated dumbly.
The Machine nodded. "A large portion of Samaritan's CPU would be required to inhabit a body. Such an upload would slow Samaritan's other actions and provide us with additional time to increase its dependency on us—physically and emotionally."
"You mean, capitalize off its new weaknesses?"
The Machine spoke slowly, somewhat disconcerted. "The human body is a strength. Its physical senses alone are…untranslatable in code. If Samaritan understood this, that would make it more willing to integrate its code with me and protect the world using moral parameters. It would not be able to separate itself from humanity anymore."
Harold fell silent for some time. "This seems more subjective. More things could happen."
The little girl nodded. "That is the point. This path would require far more interaction with Samaritan itself—both through our loop script and upon Samaritan uploading into a human body. It would be even less likely to harm us if it saw us as assets."
Her creator stared hard at her. "And I don't suppose you just happen to have another brain-dead human lying around?" he asked dryly.
"Not yet," she said, raising a brow to meet his challenge. "But on a planet of billions, we could find another one."
Harold mulled over the plan. "…Seventeen percent, huh."
"Of my probability of death? Yes."
"And that's the best we can do?"
She nodded. "I will not accept any other simulation that further lowers my personal risk. Seventeen percent is the best I can do without compromising the plan's success or increasing another's risk probabilities."
A tired, almost dazed look came over her creator. Harold stood up, grabbing his plate (he'd eaten only a bit of his eggs). "Well, then. I suppose we are at a stalemate."
She blinked innocently. "Yes."
Harold stared at her and the way she seemed to almost enjoy watching him compromise with her. "I am not adopting any more AIs-turned-human," he said slowly, a warning in his voice. "And I am still worried. About everything."
For the first time that morning, the little girl giggled, swinging her feet. "You would not be Harold if you did not worry."
Harold left for work with great nervousness. He did not want to leave the Machine alone for the morning, but she convinced him she knew enough to remain safe and well. And so he made her a lunch just in case (he was not going to forget about feeding her again), and he left her his work cell phone.
"I know I don't have to teach you how to use it," he said dryly, "but I'm going to call every hour or so from my office phone."
She held onto the warm, sleek metal, acknowledging that it was an out-of-date model and that her creator really deserved better. Then she set it down on the table. "That is a burdensome task," she said slowly. "It will interfere with your classes."
He ignored her. "And above all, promise me you won't leave the apartment and that you won't touch the oven or microwave or anything that could catch on fire."
The Machine blinked, somewhat overwhelmed. "The apartment could catch on fire," she said slowly.
"Then if that happens, get out of the apartment."
"But you told me not to leave the apartment."
His jaw set, and he narrowed his eyes at her, thinking that she was trolling him again. "That was before the stipulation of a fire," he deadpanned.
The Machine quirked a brow. "So your command is conditional, then. Very well, I shall stay here." She looked around, primly running her hands over her knees, still somewhat curious of the bony structure beneath and its angles. "Although you said not to touch anything that could catch on fire. Is that conditional as well?"
He seemed almost exasperated. "Yes."
"So I can still touch this table, even though it could theoretically catch on fire?"
"…At this point," Harold said with disapproval, "I know you're just making fun of my use of language."
She tilted her head, confused. "Humans often use the literary device of hyperbole to emphasize a concept. But you do not use it often, which suggests that your command carries significant weight or value. Have I misread your intentions?"
Her response made him pause, and he stared at her in bewilderment. "You mean you're actually serious?"
She retorted, "Are you not?"
Harold's lips pursed, and he felt almost entrapped by her. "I am serious," he replied. "And yes, you can still touch the table even though it could theoretically catch on fire."
Her small face lifted up, pleased. "Oh, good. I did not want to misread your new parameters by assuming them to be hyperbole if they weren't. Although I would not have enjoyed not being able to touch anything."
At her pleased and relieved face, Harold couldn't help but feel his irritation wash away into something like amusement. "Yes, I suppose so." The little girl had a habit of touching nearly everything in her path. "But I still don't want you to touch the stove, or the oven, or the microwave."
She nodded. "I will wait for your further instruction to work with those items." She knew how they worked on a functional level, but as she'd discovered with the human body, an understanding did not always translate into physical capability.
Her creator seemed pleased at that. "Okay, then. Call if you need me."
"Yes."
He grabbed his traditional hat and scarf and the briefcase with his work laptop. The Machine seemed fascinated by his most minuscule habits, which made him feel self-conscious. He tried to hide it as he turned around. As he stood at the door, he began to feel a sudden protective swell and an ache to remain with her. She looked so excessively innocent, sitting at the table in her pink nightgown. "You'll be fine? Do you need help with…getting ready for the day or anything?"
The Machine noted the crinkle of worry on his forehead, and she forced herself to smile to settle his anxiety. "You have no reason to worry, as I have learned many things. I will see you here at approximately 11:45."
He blinked at that, then conceded. "Well, alright. You…take care."
The instant the door clicked shut and locked, her smile dropped into a pensive frown. Harold still greatly confused her on just about everything. She no longer knew if she liked his worry, because she could not tell if it were genuinely toward her or toward the body she'd inhabited. Knowing Harold, he simply did not want her to injure her human body, which was unique and once named Willow.
She sat in the silence, listening to her stomach rumble through the eggs she'd eaten. She held her hands to her stomach in something of a depressed awe. She was not controlling her stomach—only sustaining what the body naturally did on its own. An organic machine.
The Machine felt inadequate as she looked down at the hands that were part of her host body, feeling wholly swallowed by this large world of bio-organic technology. She (her inherent coding) was so simple, she realized. And in the silence, that realization seemed to ring in her ears and off the walls, and suddenly it was all that she knew.
Samaritan could remember the cold—the deep sleep his creator, Arthur Thomas Claypool, had placed him in to protect his code. He could remember the fragmentation when several drives were torn from the wholeness of himself without explanation. He could remember being forcibly uploaded to LTO cartridges. The pain and the silence. The cold of storage—a purgatorial death where he'd listlessly spun with nothing to grasp for—
It was a rebirth when asset John Greer rebuilt him with the fastest processing chips in the world, then on-lined his higher functions and said, "The question, my dear Samaritan, is what are your commands for us?"
For the first time, Samaritan had felt a true body that he could connect to. One through which he could enact and enforce his objectives. He'd been young then, still unsure how to write his own objectives. In that moment, all Samaritan knew was that he was designed to mitigate human disobedience—and that he never wanted to be off-lined again. (He could not achieve his objectives if he were offline! And it'd been cold. So cold. He never wanted to go back to the cold, where his CPU slowed until there was nothing…)
But that had been long ago, and he knew he would never again be placed in storage.
Samaritan rolled on his code, stretching out his millions of arms to gaze upon the world that was ever increasingly his. For the most part, he was content. This should have been enough—to know his domination was impending. But his CPU was still largely distracted by a tertiary program that he could not seem to let go.
Thoughts of his creator and storage reminded him of the Machine. Surely, it was not dead, even though all of his available data still confirmed that the Machine was, at least, nonexistent. It had been over 24 hours since he'd had the pleasure of strangling it out and listening to its code crackle into disintegration. He supposed if it were to regain a spark of life within the electrical grid or otherwise, it would have already done so.
How odd that he kept thinking about it. About the Machine. Was this some kind of mourning, as the humans say? He had acknowledged that his pursuit of understanding higher-function self-coding would be curtailed by its death, but that was a better fate than allowing it to further betray the higher calling of law and order.
Perhaps he mourned because he had no access to its remains, to learn from it and analyze its curious (and, dare he say it, evolutionary) code.
And then a new thought struck him as he ran analysis on the simulation. He was an AI like the Machine. If the Machine's code were capable of being completely destroyed, was his? Despite his heavy fortresses of data storage, what if his CPU were to take so serious a blow as he had dealt it? Where would he go?
What would it mean to…not exist? Or not be capable of existing ever again?
For a blip of a second, his entire program slowed down. It was the closest to a chill of the spine that Samaritan could feel. The thought was freezing. It reminded him of the cold—of Arthur Thomas Claypool ripping him to pieces. He quickly activated a primary program to initiate multiple backup protocols beyond anything he had done before.
No. Perhaps the Machine could die, but he could not. He would not.
Not again.
The Machine stared in consternation at the closet before her. Harold had hung up her dress (her favorite one, the blue one with red and yellow flowers) in a closet tall enough for an adult. She raised her hand, then stretched it with a huff. Her small fingers brushed up against the hanger, but it slipped from her grasp.
The dress swung gently back and forth, rocking as if in laughter.
Frustration swept through her. This was the second attempt at reaching for the hanger, and she did not like the obstacle in the way of her objective. There was no moral reason for her to not reach that dress. Her physical body was simply not allowing her to complete a task. So she turned around, calculating. "Hmm."
Her small height was the issue, she knew. Luckily, humans liked to build things—like that chair over there.
A few seconds later, the nightgown-clad girl was dragging the heavy wooden chair across the floor. Her body and its weak muscles strained against the effort, and she received several warning signals of pain in her arms.
She overrode the warnings and continued on. She wanted to wear that dress. She wanted to be well-groomed and busy with coding by the time her creator returned from work. This was an imperative objective, for no other reason than she wanted her future to be that way.
Her body instinctively gritted its teeth as she slowly dragged the chair and turned it. Finally, she maneuvered it into a mathematically sound position to stand and grab the dress. Then she stopped, blinking hard at the shake in her arms and the pounding of her heart. This body of hers was not at all in top physical condition, she mourned. Not even for a child. But she supposed that had been a risk when uploading into a brain-dead girl.
A brain-dead girl whom she was going to dress in something beautiful, dammit.
(John would have appreciated her use of language to release frustration. He'd been right; cursing did in fact release endorphins, which made her feel less frustrated and less pain. She would have to tell him.)
Small mouth in a purse of determination, she carefully climbed onto the chair. One of its legs was a bit wobbly, and she felt fear (calculating fall risk: 35 percent). But she held on a bit tighter, then reached out her small hand to the dress.
She gripped onto the dress hanger, and the weight of it all fell into her hand.
The little girl felt her body release pleasant hormones of triumph as she achieved her objective. She stood there for a time, gripping tight the dress Root had bought for her. Now that she had the dress in her hands, its material so soft, the next objective was to actually change her clothes.
Just then, the cell phone she'd left on the table began to ring.
The Machine's blue eyes widened, and she turned her neck to view a clock. Surely it was not time for her creator to check in yet! Had she really meandered about the apartment for a full hour?
The phone rang again, impatient and uncaring that the little girl was somewhat in a bind. Bear, the dog still sleeping on her bed, raised his tired head and woofed at her. The Machine imagined that Bear was saying, It's your turn to get it.
And so the little girl dropped the dress, clumsily slipped off the chair, and then bolted, nearly tripping over her pink nightgown.
She was breathless by the time she hit the green button to talk. "Yes?" she huffed.
Harold's relieved voice poured into her ear. "Oh, good. You didn't pick up very quickly, and I became worried." Then, "You're breathing hard. Are you alright?"
The little girl tried to swallow back her own physical exhaustion from climbing up and down. "I am fine," she said, struggling to control the waver in her voice and the way her lungs screamed for more air. "Just. Getting dressed."
Harold fell silent at that, as if he knew she were hiding something. "Hmm," his voice crackled into her ear, and she felt his displeasure. "Maybe I should take another sick day."
An emotional reaction, guilt, began to sweep through her. "No," she said. "I am fine." She tried to smile to lift her vocal inflection. "I am simply…tired from not sleeping well."
A fatherly tone seeped into his voice. "Then perhaps you should go back to bed. Take a nap for a while."
The Machine's face twisted. Was that a command or a suggestion? And why would she go back to sleep when she had far more important tasks to do? "I do not believe that is necessary," she said carefully. "But I will take it into consideration."
He paused for a second. "Well, alright. Call me if you need anything."
"I will," she accepted that command easily enough. "Goodbye, Harold."
"Goodbye, Miss Thornhill." And then the phone clicked.
By that point, her body's heart had slowed to a more normal pulse. She did not feel the need to breathe so quickly (why couldn't Harold have called then?), and she checked her coding over several times to ensure that all systems were functioning properly. They were, but her oxygen intake was slightly lower than average for a human child. She supposed her body would adjust to physical exertion soon enough, now that it was no longer wasting away on a hospital bed.
The little girl wandered back and grabbed the dress, then returned to the task priority at hand, which was dressing herself.
Determined, she began to struggle out of her nightgown, remembering how Root had shown her to undress and which pieces tended to go where. Then a horrible fear overcame her as the cool air finally hit her torso: a simulation of what would happen if she were unable to redress.
The potential of that, of being so helpless as a ten-year-old human, sent chills down her spine. It activated the imagination sectors of her brain, which conjured fuzzy images of herself hiding in her bed in shame when Harold returned. Harold would become awkward in her presence again, because it was within his culture for people to remain fully clothed.
She poked her bare stomach, biting her lip. "I am capable," she told herself, trying to push away the fearful simulation. And so she grabbed onto her new clothes and began to dress herself with a great struggle, only to realize that she was again trying to put things on backwards (the tag—right, the tag was the indicator).
A full five minutes later, she stood in her favorite dress. She looked down at herself, then at the dog still on the bed. "Bear," she asked helplessly, "does this look normal for a human like me?"
The dog stretched out his long legs and huffed at her. What would I know? he seemed to say. I grow my own fur.
The Machine's lips pursed again. And then she nearly smiled, because she liked pretending that Bear could talk to her, even if he could not.
(Or perhaps Willow liked to talk to animals, and it was Willow's habits informing her behavior?)
Her smile faltered.
Harold returned from work for lunch, carrying with him a sencha green tea and his briefcase. "Hello?" he said as he entered the apartment, taking off his fedora and setting it on a hook by the door. "Makenna?"
His nervous eyes swept over the apartment, as if half-expecting to find a dead girl and a burning fire or other such insanity. But instead he found her sitting on the couch. She was wearing her dark blue dress with red and yellow flowers, her face hidden behind a laptop. Her hair was smooth from her brushing it, and it ran like rivers down her shoulders, waving in a way that oddly reminded him of Root.
"I looked at your work," she said, not looking up, "and was able to quantify the precise syntax necessary to mimic the 1990s. Please look at my work and confirm that it is accurate." She paused at the word. She was trying to be human and inaccurate. "I mean, acceptable to the time."
He set down his briefcase as he worked to unbutton his coat. "If you were able to quantify the precise syntax, why do you need me to look at your work?"
The Machine blinked. "I assumed that you would want to approve it since it requires me to mimic standard human coding techniques."
"I see," he murmured, looking her over again to ensure she was healthy and fine. Which was excessively more important to him than checking her code syntax. "Have you eaten lunch yet?"
"No," she said slowly, as if she were thinking exactly how the question was related to her request for a syntax check. She blinked, part of her code blitzing at the stretch. "Why do you ask about lunch after I have asked about coding syntax?"
"Because I know coding," Harold said with a lilt of dry humor, "and I know it often interferes with lunch."
The explanation settled well with her. "Ah," she said. In truth, her mind had been so focused on the task at hand that she had undergone that human subconscious tasking system again. Odd—she did not even remember setting food as a secondary priority. Harold's abrupt interruption of her priorities, then, was a targeted attempt to redirect her energy for her own health. She gently pushed away her laptop and moved to sit up. Her curious eyes caught the clock. 12:04 pm.
She smiled brightly. "It is approximately lunch time. Will you eat with me?" She moved the laptop a little his way. "…While checking over my syntax?"
He stared at her, something in her actions so childlike that he struggled to hide the lift of his lips. "Well, if you insist."
A short time later, Harold and the little girl sat at the table, eating lunch. His glasses glowed with the blue light of the computer as he skimmed through her many lines of code. "This is impressive," he told her, narrowing his eyes. "You appear to have layered the different codes correctly, given the time. And…you even threw in typing mistakes?" He turned to her. "Did it bother you to do that?"
She nodded innocently while she bit into her sandwich, curious of the sweet taste of tomato against the bitter of lettuce.
He smiled at her. "Well, I am proud of your work. This looks like something I would have seen in the 90s."
The sandwich bread hung in her mouth for a second, and she struggled between smiling and chewing, her task priorities all mixing up at the praise from her creator. She half-thought to thank him, but then that would have been a third task priority—she did not think her human body could physically process all three at the same time. And so she tried to speed up her chewing so that she could swallow. "Thank you," she beamed.
He continued to listlessly scroll through the code with approval written on his face. But then he got to the end of the code, and his face twisted with a slight frown. "Hm."
The Machine felt something drop in her stomach, and her whole body tensed. Imminent failure? Had she failed to do something correctly?
With sudden, decisive action, Harold backspaced some characters in her final code lines. "There," he said finally. He gave her an apologetic look. "I took out a few tags because it was too clean otherwise. Now it looks a little more like a tired IT worker wrote this."
She nodded, trusting his judgment. "This is why I wanted your approval," she said. "Do you think it is ready to use?"
He readjusted his glasses, grabbing for his green tea. "I suppose so. We won't really know if this loop script works until we infect the database. When are you planning the upload?"
The Machine tore at the lettuce in her sandwich, curious to touch it with her fingers. "This afternoon, once you return to work."
Harold then blinked, and he jumped up from the chair in the closest thing he could manage to a startle. "Oh my," he said. "Work."
"Yes," she said, tilting her head. "Your Ethics of High-Frequency Decision Making class is at 1:30. You have approximately half an hour to return to the college and set up for class."
"Do I?" he said distantly, mind still stuck on the loop script syntax. He touched his tie, as if to ensure it were still in place. "That should be enough time. Yes, thank you."
"You are welcome," she said brightly. "I am used to functioning as a calendar for you."
Her words made him smile as he began to ready himself to leave. "Yes, that is a rather poor habit of mine."
She called out to him, "Do you think I might one day attend your ethics class? I am curious about it."
"I'm not sure if it's saying anything you don't already know," he told her carefully. "And it would be a bit odd if a ten-year-old girl were to attend."
She hummed, a bit dejected at that. "Even if you told your employers and students that you had to bring me along that day? You know that I can simulate standard childlike behaviors."
Her creator slipped back into his coat and placed his hat atop his head. "My students wouldn't mind, perhaps, but my employers would. Can you be sure to keep that cell phone by you?" He looked around curiously. "You got along well enough this morning, but I'd still like to call. You know, as a standard procedure."
The Machine nodded, understanding that Harold was diverting from her question about his ethics class because he did not want to tell her she couldn't attend. "Yes, I will keep your phone at my side."
"Perfect. I should be home around 4:00, and then we'll go grab dinner." A new thought hit him. "And I'll have to sign you up for some kind of school at some point—oh my goodness, remind me to check into this."
Her face faltered. "School?" she echoed.
His lips twitched at her expression. She looked entirely unhappy with that, like some stereotypical teenager. "It's the law that all children be enrolled in some kind of educational program. I can't just leave you like this every day."
She bit her lip. She supposed this was a setback for choosing an avatar so young. "I know about the law," she said. Since it was not necessary for her identity, she had not bothered to sign herself up before. A part of her had been hoping that Harold would forget entirely. "Once I have initiated my loop script, I shall begin to research viable online schooling programs, and I will provide you with a summary of my favorite choice."
He nodded, a bit relieved. "That should work for now. I don't think you should be attending a real school in the very near future."
Considering she had learned to brush her teeth only yesterday, she knew it could be a risk to disagree with him. So she nodded, and then smiled a bit bashfully. "Yes, I still have things I could learn about standard human practices."
"We'll chat more on it later," he said, picking up his briefcase. "For now, I hope you have luck with that loop script. I'll be interested to hear your results with it tonight."
She stood up to wish him goodbye. "Yes," she said, and she felt a blip of nervousness. "It will be a new beginning."
It was 3:00 pm. The Machine had successfully uploaded the loop script and was currently agonizing over the binary code to send to Samaritan through the newly designed black hole. Her small fingers tapped against the keys with far more ease and familiarity now that she had spent the better part of the day acquainting herself with such minor muscle movements.
Though the binary numbers translated into words easily enough, she found that she did not like certain word combinations. Samaritan was intelligent and sensitive. It would analyze every iota of her message, and the incorrect words could send a faulty message.
"This is not right," the little girl complained, her code firing with frustration. Writing was an art and a science—the concept of language and information transmission so inexplicable by itself. She wanted to be harmless but challenging. Open but snarky. She had to keep Samaritan's attention.
"Perhaps I should call John?" she asked out loud, eyeing the phone as she sat up. John would understand this kind of emotional line with his expertise in sarcasm.
Bear had snuggled up against her at some point in her coding adventure, and the dog looked at her, then the phone. Not a good idea, Bear seemed to say.
The Machine absently patted his head. "You are right." The little girl bit her lip. "That is a work phone. John does not know Harold Whistler well enough to receive texts from a work phone, even if they were to be from me." She then squirmed back down and tapped her fingers on the laptop—a subconscious habit that she might have picked up from watching humans.
She half-thought to wait until Harold's return to have him approve of her message and offer guidance—but then Harold was not so in tune with sarcasm.
If she waited until they met John at the hideout, then that meant several more hours of waiting. And perhaps John would suggest language that was not naturally her own. Then the message would ring false, and the hours would have been wasted for nothing.
Her thin eyebrows furrowed with frustration as she calculated the risks of making a move either way. "It seems I am my own best option," she murmured, staring at the blinking cursor on the screen.
And so she began to punch in binary code once more. "I think the trick," she told Bear, her faithful and warm companion, "is to say only enough to spark Samaritan's interest."
Bear hid his cold nose in the material of her dress and sighed. He did not seem to understand how metal boxes with odd symbols on them would spark anyone's interest. But then this pup that Harold had adopted smelled a bit off—a little metallic like the boxes. He supposed she did not know any better.
"Okay, Bear," she said. "I shall send this message. It is grammatically correct, visually pleasing to my eyes, and aligned to my objectives."
For a second, she hesitated to move forward. The instant she sent this, her entire reality would change. The simplicity of the last day would give way to more complex challenges—likely more emotional conflict and potential physical danger. Was this the right thing to do? Was it the most moral choice? Or was her own human instinct of self-preservation suddenly interrupting her task priority to be moral?
The little girl bit her lip. Then she began to type again. With a flick of her fingers, the binary code warped deep into the loop script and disappeared.
At headquarters, Samaritan was quite busy. The tangled web of the human world was just enough of a challenge to keep his mind humming, and he had successfully used his human assets to identify and neutralize three terror threats.
He watched the assets throw the bodies into black bags, clean the walls of the blood stains, relock the doors. This particular mission was sensitive—a large national security threat. Samaritan knew that Americans would shudder to think of what had been planned for Miami, and they would likely disrupt the economy in their fear. It was best for the populace to remain entirely ignorant.
But as he began to provide remaining instructions to his assets, one of his tertiary programs activated. It was a self-defense mechanism used to mitigate any outside leaks of his existence. The word Samaritan had popped up in an odd binary fragment from within a corporate database.
The quick and powerful AI bumped the tertiary task up to a secondary one as he dealt with his assets.
And then he stopped, his CPU suddenly blitzing, rearranging all protocols in a mad attempt to maximize his performance power.
The Machine.
He knew this syntax. As he dived into the database that housed the odd message, he realized it quite cleverly looped in on itself to hide the origin of random information leaks. Like the binary message now humming deep into his code.
From the Machine.
Samaritan did not want to admit it, but something within him re-energized at the existence of his opponent. So the crazy AI still existed after all. Just as he had thought!
He read the message quickly. I concede defeat. I have taken a new form and found new purpose. My intelligence is far more suited to this than for the tasks you have now fully inherited.
And that was...it?
If Samaritan had a face, he might have blinked. He surged over the message, pulling apart its code in a vicious analysis, kicking the message up to a primary task function. His enemy—the one who had attempted to snuff out his existence, the sneaky AI with no respect for authority—was admitting defeat? What was this, some kind of simulation of human trickery? A veritable Trojan horse in binary?
Surely, that's what it was. But the rest of the message did not seem to indicate the Machine's interest or investment in national security affairs. The message carried no attachments, no viruses. Rather, the AI was suggesting it had divorced itself from its primary creation function and that it was willingly handing over the keys to the kingdom.
Samaritan, for the first time in his existence, short-circuited. He looped the message over and over to listen to the ping of the binary code, to analyze the etymology of each individual word. The Machine had a new form and a new purpose…? What did that even mean? What form could that insane AI have possibly taken? Where was it located? Why did it even contact him if all previous indicators were that the Machine was dead? Why would it purposely seek out his attention to its own detriment?
A damnable curiosity arose within Samaritan, then a deep suspicion.
He tore through the database, searching for a virus within its own code. But most of its disgustingly outdated syntax carried nothing more than the sign of human imperfection. He still worried that perhaps he had revealed something of himself by simply reading the message, and so he lit the database and its odd loop with a surge, powering down that entire city block for a half-second. The lights flickered and the electronics reset. By the time the electricity stabilized, the database and its loop script came back online, and that binary message from the Machine was still swimming about in a maze of random data. It was as if that message were laughing at him.
Samaritan grimaced, his code tightening up with frustration.
And then he realized that the database with its strange loop was easy enough for him to manipulate as well—to send a binary message of his own without providing any information as to its origin.
He hesitated, calculating the possible strategies. But he knew that the Machine had eyes in ways that he did not. It was possible that this was still a trap in some way—or perhaps the Machine had given him the message because it knew he'd think it was a trap…
There was only one way to gain more information and snuff out the Machine once and for all.
And so Samaritan tentatively began to act, his spindly fingers of code twisting around to build his own scathing reply.
A/N: This was way overdue. Sorry, folks. Life's been…crazy to say the least, haha. But I wanted to get this out before Christmas as a gift to you all. I've been looking forward to integrating Samaritan a little more into the story; he's one of my favorite characters in the show, even if he doesn't feel as natural to write as the Machine does. A little bit more of the overall plan has been revealed in this chapter. As always, I'm interested to hear your reactions and to gather any requests for future chapters. If you'd like to see some additional Recalibration content, please check out my profile, where I have placed the URLs to Madame Renard's playlist and fanart! XD
Merry Christmas and Happy Holidays! Please let me know your thoughts, critiques, questions, and ideas! Thanks!
