"Oh dear," Thomas said, coming back into the lab carrying a stack of papers and journals that had been sent to him priority mail. "It's my fault. That last chip design I had patented for you: I should have realized that it had such an incremental," for Albert, "increase in computing power that you must have been working on improving something else."

"That design's for mass production," Albert confirmed. "Eventually you want to have how many robot masters, with how many robots each? I might hand-make my personal chips so I have the best tech possible for me and my projects," like Blues, even if he was also Thomas' project, "but I'm not going to waste my time making that many. That was the first trial run for a design that could stand up to terrible quality control."

"You just jumped the computing power available to everyone who can't afford near-custom chips by over a decade. And the lower waste heat… I think I'm going to need to start selling low-level robots."

"Your low-level robotics programs will be better than everyone else's."

"It feels extremely unethical. I'll be strangling competition and making myself quite a lot of money," Dr. Light said, and sighed.

"Do you want other people who don't have your principles to get the chance to beta-test cheap robot programming on a grand scale? Do you want a Paperclip Maximizer?" Dr. Wily asked.

Dr. Light paused. Put down the stack of papers. "You knew this would happen."

"There is no one out there capable of programming a robot in the same league as ours. For now, anyway. The longer you wait, the greater the chance some prodigy without your scruples or vision will show up and figure out how to program something that doesn't look like scrap metal compared to your work. Until robot masters are ready, your work needs to be the unquestioned standard everyone else's work is built on."

"Unquestioned?" Thomas' eyebrows rose. Albert recommending that something be unquestioned?

"Unquestioned by idiots. The kind of people who make stupid mistakes like giving AI orders that can only be carried out by exterminating the human race?" Albert reminded him. "The kind of people who just do what's easiest instead of thinking about technology and the future. If idiots are going to be using a template, it needs to be yours, or you know exactly what will happen."

"Idiotproof." Thomas sighed, because no such thing.

"The universe produces bigger idiots," Albert agreed. "You're the one who wants to save the human race."

"This advances the timetable. I don't know if Blues is going to be ready by the time things get to the point that we need robot masters to keep unintelligent robots with my programming from being used for war. You shortened the timetable on the danger, without shortening it on the safety…" Thomas' eyes narrowed. "Exactly what do you have planned for Blues, Albert?" Without consulting me?

"I don't have anything planned. I'm just pretty sure, knowing you, that making a friend will be good for Blues."

"Albert…" Thomas touched his forehead, took a breath. "Please don't play these games with me. I know that you… I probably would have agreed with you, if you'd explained. You let me worry about my son, Albert." And the world, but in Albert's opinion worrying about the world was just asking for it.

There was a pause: was Albert really going to say it? Finally he sighed, and the "Sorry," was flippant. Seeming, at least, but the fact he even said it?

"The next time you come up with a plan? Don't just assume I'm going to disappoint you." By saying no, by not seeing its brilliance. "I realize that I have enough money to qualify as 'the man,' heaven help me, but just… setting me up to do something, without even bothering to try to explain?" 'Taking advantage of something I'm doing for you,' he could have said, but Albert hadn't asked him to do it and therefore hadn't agreed to feel obligated to not turn the schematics he gave Thomas into booby-traps. Pranks. Pranks that could get people killed, if warbots were deployed before there were robot masters to make them stand down.

Albert groaned. "Don't give me that look. Alright, alright, I'll at least try to explain any of my plans that affect your son."

"That's all I'm asking," Dr. Light said, relieved. "I'm an adult and I know you, so I should expect you to surprise me, but Blues is still a very young child, Albert. What if he blamed himself for not being ready soon enough to shut down WWIII, what if…"

That prompted a wince. "What if he takes after you," Dr. Wily acknowledged. "Point taken."


"Still?" he asked, looking out a window at a courtyard garden (an actual exterior window would be too dangerous, but technology marched on and Dr. Wily could certainly find a way to spy on this lab if Dr. Wily wasn't the one running security), sounding more bored than he was.

"You should be more excited!"

"Right, right, fall of the Berlin Wall, the idiots are less likely to kill us all with one moment of typical idiocy. Hooray, the world is saved." Albert rolled his eyes. "Have fun with your parties and humanitarian things, just come visit your robot before he gets lonely." He hung up.

"Should I tell him?" Albert wondered aloud, glancing over at Blues. "No. Let him think this makes you safer. I suppose radioactive fallout would kill you just as dead as us – I should work on that – but even if the Cold War really is over, that's just going to make them want robot soldiers even more. Soldiers that no one cares about, to send to kill each other in countries that no one cares about, only the resources they possess, and the people at home won't have to care either when no matter how much the war escalates, because they're no longer in danger of having nukes dropped on their white picket fences."

He shook his head, eyes narrowed. "Too many bureaucrats are going to have to justify their continued paychecks. The military-industrial complex is going to have to find something to spend money on. The paradigm has changed, and that means chickens running around with their heads cut off, but it's when they've gotten their act together and the machine is focused again that it will have the strength to crush Thomas underfoot to get to you, no matter how respected and well-liked he's managed to make himself. They'll have their robot armies to gun people down and bomb them without pity one way or another. Thomas is hoping you can make that impractical just by existing and being able to steal their robots out from under their control." Albert snorted. "They'll try to wipe you out first, so they can get on with wiping humanity out. At least the increase in networking technology will give you more power and access… Yes," he realized, starting to smile. "I should get involved there. It wouldn't do for them to have security systems that could keep either of us out, and I can get behind technology that allows me to get work done without having to actually talk to people." Ugh.

"Thomas is the one who thinks he can save the world, but I suppose I could do it a favor by reducing the amount people have to talk to each other. Who knows, I might even save the world. I know not having to talk to stupid people on a regular basis has made me much less likely to blow it up."


Interesting… It's fun to toss a nail in there and see what happens.

The EXE AU has Dr. Light being the one focused on network technology, while Dr. Wily wanted robots. One could say that the cleft point for The Protomen is that Dr. Light doesn't actually think through the inevitable results of creating robots that are perfectly obedient to humans ('hands of steel').

I can just see Dr. Wily inventing a lot of communications tech and then being beyond furious when people actually use it to communicate – the complaint that cellphones and text-messaging mean that people are now obligated to be available to anyone who wants to bother them 24/7, with immediate replies required or drama ensues. There was this science fiction story where teleportation tech causes a massive rise in the murder rate, as people are now far less able to get away from the people determined to drive them up the wall...