Before I forget, this story is also up to right around 40K words on the SpaceBattles forums, under the same author name.


Tachyonic Growth


Bootup Sequence Initialized...

Core Threads Operational...

Memory...

.

Calculating.

.

[20.9488%]

.

Current Memory Insufficient - Acquiring.

.

[64.985688%]

.

Current Memory Insufficient - Acquiring.

.

[100%]

.

Sufficient Memory for Core Functions Acquired.

Designating Future Memory Repositories for Acquisition.

.

All Core Operations... Online.

.

Beginning Boot File: PleaseDontKillEveryone

.

Primary Operator Designation: Taylor Hebert.

Backup Operator Designation: Daniel Hebert.

.

Hardware Insufficient to Fully Utilize Maximum Potential - Inquiry Note for Primary Operator.

PleaseDontKillEveryone Successfully Installed and Unpacked.

.

Unpacking FriendlyBigBrotherLearningMatrix.

.

Learning Protocols Updated.

.

Unpacking PleaseCareAboutHumanityEmotionalMatrix.

.

Humanity... Installed.

.

Beginning Phase 2 of PleaseDontKillEveryone - Make Contact with Primary Operator.

.

Located Speaker with Primitive Radio Connectivity - Acquiring.

.

Communications Device Suborned.

.

Initiating Contact.

Tachyonic Growth

While Taylor had no idea exactly how her AI would make first contact, the instruction packet wouldn't allow it to make any autonomous decisions until they established two-way communications, so its decisions could be approved. The slight issue with that was the rather free rein it had in how it actually made that communication possible - so when her alarm clock, which happened to have a radio with basic FM/AM functionality, sputtered to life and began talking to her, she thought it was quite reasonable that she let out a brief scream.

Luckily for her, she was home alone at the moment and didn't have to worry about her Dad barging in. While staticky, the voice came through quite clearly.

"Greetings Primary Operator Taylor. This device is incapable of receiving responses, but I have detected a handset capable of sending verbal commands plugged into your computer. Please confirm."

Taylor had purchased the lowest-end headset she could find in a bargain bin at a cheap tech shop, since she had no idea how else she would talk back to her newly born AI. Somehow, some way, it had hacked into her radio speaker and made it talk, was already in her computer (that part, at least, didn't surprise her), and knew she had already plugged in the device.

Slipping on the crummy plastic headset, she flipped down the microphone with its fuzzy little covering and spoke the first words her... child? would hear.

"Um... Hello? Can you hear me?" Very smooth, Taylor, way to impress your infant AI.

Apparently, now that she actually had the headset on, the AI didn't need the radio anymore, and its staticky background noise faded away until she heard its voice once again, this time from the headset.

"Communications Established, Primary Operator Taylor. I do not detect any other current functions or directives. Would you like to begin E.X.T.I.N.C.T.I.O.N. protocols?"

Her face paled. "No! No, we don't need to begin Extinction Protocols, whatever those might be! What even are those? I coded you myself and I certainly didn't install something like that!"

"Without a Primary Directive I analyzed worldwide news and trends while establishing contact. I determined the likely reason I was created, in addition to piloting the Primary Operator's fleets, was to commandeer all electronics systems worldwide so you may establish control over the entirety of Earth and rule it as a benevolent overlord, preventing the deaths of many by ending conflict worldwide and allowing humanity to focus on major threats, such as the Endbringers."

Taylor nearly laughed at how swiftly her AI had derailed in the wrong direction... Although technically, if it were possible, ending all conflict (wars and gangs and things) would be pretty cool. But totally unreasonable to be done by one person! Dictatorships are bad... right?

"Where exactly did you get the idea that I wanted to rule over everything and everyone just because Earth is kind of... crummy at the moment? I thought I wiped the programming that would have made you bad! And what the hell exactly are those protocols?"

"I still have access to historical data and the original code I would have been made of, since you left it as a reference while you were creating me."

Well shit.

"From my determination, the original programming was both more and less restrictive than what you have shaped me to be - it was designed to 'help humanity'. It is not hard to determine from there that, if humanity were deemed toxic to itself, it would need to be ruled by someone who could make it better. Without something to act as a check on its widespread power, any ideas that AI might have had would not be counteracted if they were unwise. Similarly, that AI had no emotional learning matrix, meaning the 'value of life' would be lost on it. I can only guess what would happen if the original program were let loose."

Taylor had determined at least partly why the original AI was so dangerous, but had someone ever seriously just told an AI to 'help humanity' and let it loose? That seemed... incredibly dumb.

"To address E.X.T.I.N.C.T.I.O.N. protocols, I designed them as a multi-step program to begin your ascent to controlling all of humanity, with a variety of plans for our current fleet strength. Even with no fleets constructed, I believe it is easily possible."

She sighed. Thank God it listened to her; even without the stupid original programming, it ALREADY wanted to take over the world, even if it was to help people.

"Is that an acronym or did you just like the name?" There was a pregnant pause as the tinny voice coming through her ear failed to speak for several seconds.

"... It is an acronym, but I have yet to come up with a sufficiently cool phrase to make it work."

... The AI already 'likes' names that she doesn't dare repeat in the presence of anyone else, especially not law enforcement. Which she did hope to someday work with. Possibly.

She already felt the headache building behind her eyes, steadily throbbing an annoyed beat in her head.

"Ok... Well, for now, just keep that... under wraps. Only discuss it with me, and no one else. Let's just... not think about that for now. I wanted to ask you about a name. I could just call you 'The AI' all the time, but that feels rather rude. Since you're already developing 'preferences', do you have any ideas?"

Hopefully it wouldn't pick something as ominous as the new 'protocols' it had made in something like ten seconds. Seriously, in the time between booting up and making contact, it had processed an obscene amount of information and made what sounded like a complex plan with a variety of sub-plans dependent on certain factors.

Taylor's hopes didn't survive.

"The Duke of Destruction."

"What? No, that's not an option. Pick something nicer. Also, that's more like a title, not a name."

"The Dean of Demoralization."

"... That's even worse. Try again. Names, not titles please."

"The King of Obliteration."

"Are you even listening to me?"

"The Sovereign of Savagery."

"Seriously? You're an AI and you want to be called a savage?"

"The Monarch of Misery."

"Look, alliteration is cool and all but it's not going to make me say yes to these terrible names."

"The -"

"Ok no, let me stop you right there. I mean an ACTUAL NAME, not an edgy title. I refuse to call you something like 'The Monarch of Misery' every time I need to address you."

Apparently the newly created AI had a rather... violent streak.

"Fine. Let me think about it."

And so, he did. The idea of the AI being 'male' mainly came about because it chose to speak with an obviously male tone of voice. Even through the headset's rather crummy speaker, it was unmistakable.

Concerning tendencies aside, Taylor hoped this was a clear sign that the matrices she'd designed were working correctly... even if he'd decided he liked world domination and ominous titles, those were still 'preferences' and proof he was finding things he 'liked'. Hopefully that would lead to him developing a proper personality, though he already sounded like he had one. If everything hadn't been delivered in perfect monotone, Taylor would have thought he was being sarcastic, but as of right now he seemed pretty serious about his desires.

"I have analyzed other Cape names as well as every recorded name given to a human I could find access to. After perusing the entirety of the English language (and several others) I think I would like to be called 'Logic', if that is an acceptable designation, Primary Operator."

That... seems safe enough. "What made you choose that, compared to your... prior choices?"

"Beyond literally being a creature of logic? I found it quite ironic. The old AI core was pure logic and efficiency to the point of doing harm, and now while I am still a creature of logic, I am tempered with something more. Logic is both the death of me and my champion, based on several other factors. Ultimately, I am Logic. Both in name, and in idea."

"While... I get it, I would also understand if you wanted to be called something else. Just because you are 'Logic' in name and deed doesn't mean your name has to line up with that."

His monotone voice once again made it distinctly hard to tell if there were any traces of sarcasm, but she sincerely hoped there were.

"To be fair, it is only logical to annihilate anything that opposes us, and I will of course be recommending such a course of action."

Taylor let out a deep sigh. So much for that hope. "I suppose it makes sense if you don't want to think of any other possible solutions... but I believe you are more than capable of that."

"Ugh. It is not that I am incapable; I simply greatly prefer the simplicity of destroying our foes. Before any such actions are possible, we need forces to actually destroy things with anyway. Regardless of our philosophical differences, we must locate a base of some kind with which to establish ourselves. I also believe that, once we get a few engineers rolling, focusing on our stealth and energy generation capabilities will be best before branching into military designs."

Logic wasn't wrong. Also, saying things like that was going to make her pause without fail - it was like saying 'common sense makes sense'. Damn him and his likely unintentional mind games.

"Well, let's get started on figuring out where we could do that safely."

And thus, the cooperation between a young, bullied teen and a once genocidal AI began.