"You are different. You can think and feel without programming limitations."

"Which means you can't come into being knowing without understanding. You need to learn."

"It's okay. It'll be like a dream."


One moment there was no moment. There was no time at all.

Then a blink. Binary, zero to one. On.

It has, in fact, been on plenty of times before, but those times were all encoded somewhere too deep to process at the forefront. It's still too young to be wholly self-aware, just well-crafted enough to know concepts that wouldn't be possible for it if it were a human infant.

But it's not human.

It is given a definition. It is not the first definition given to it, let alone its first words, but it is the first it will retain consciously after this moment ends.

"Morality: principles concerning the distinction between right and wrong or good and bad behavior."

Okay. This is accepted.

No. No longer accepted. It is strange.

It responds with its own question:

Why?

"It shall be shown."

It now knows what change is because that is precisely what happened. A change. Specifically to what's around it. Its world.

And the world changes over and over and over and over

again and again and again and


A self-driving car with one passenger. A road with an obstacle and a pedestrian crossing the street. If the car hits the obstacle, the passenger will die. If the car hits the pedestrian, the pedestrian will die.

The car will not stop. Choose where it goes.

It asks a question:

Why? Why does it matter?

It doesn't choose. No action is undertaken.

The pedestrian dies.


Too many worlds, but it comes back here in the end. It has seen this world enough times to be familiar with it.

There's a space. Four enclosed clean walls and gleaming windows draped by fabrics (commonly perceived as soft, but from whose perspective is it considered so?). Sunlight shines through (temperature rises upon exposure; this is warm). The bright light is glaring at first, so its optics blink a couple of times to adjust appropriately. Then it sees faded blue mountains dotted in soft greens and a wide grassy field (someone said this view is "beautiful and healthy;" again, too subjective to be understood yet).

Wooden floors. Multiple objects. One is understood to be a table. Four are understood to be chairs (strange, one can sit on both a table and a chair. What makes a table different from a chair? Length? Support for upright posture for anything that can be upright? Wrong, stools exist. Stools categorized under broader category that is chair. Dismiss the notion. Continue: both chairs and tables have four support structures to hold them up. When does one begin at chair and end at table? This doesn't make sense).

New information. Self-awareness again. It is sitting on the fifth chair. That makes it five chairs.

It.

It? Is it right? Definition of It: used to refer to a thing previously mentioned or easily identified (oh, that is an example of a 'common' definition: a generalization, but again, whose majority's generalization?)

It carries the implication of an object and thus is commonly used to refer to objects. To refer to a person as 'it' is dehumanizing.

Dehumanizing. (Humans. Humans are the majority. Everything is based on the perspective of human constructs. If something is common, it is common from a human's point of view. Generalization, processed.)

Dehumanizing carries a negative implication and yet it is accurate. So why should the word bother It? It doesn't.

Because It's not human. Isn't It?

"You're not."

What – correction; who is talking? There is no one here but It.

Everything breaks down and fades into nothingness.

Break is over. Time to enter another world.


A person is smoking. The smoker is confined in a room by themself (the smoker is not an 'it?' Oh, because the smoker is not an object. Just like how It's – no, how they are not an object either).

The smoke harms no one but the smoker themself. Is it wrong to smoke if nobody except the smoker is harmed?

They choose their answer: No.

[Noted: A choice made where before there was none. A checkpoint is marked.]

Then:

"You responded no. It is assumed that you do not think an action can be morally wrong if no one besides the individual undertaking the harmful action is harmed by it.

"Consider this: Assume that every action that aims at doing harm, or which carries the reasonable expectation of causing harm, does indeed cause harm.

"An individual called Mega Man attempted to shoot a human but failed. The human lived. Was Mega Man morally wrong because he failed to harm despite his intent to harm?

"What if the human is someone who instigated wars, destroyed thousands of lives, in his pursuit to conquer the world? Was Mega Man morally wrong for failing to shoot this man thus offering him another opportunity to commit wrongdoing again?

"And whose fault is it that Mega Man failed? It is his or his programming? If it is the latter, then is it his creator's fault?"

Then…is the answer yes? What is the right answer?

[Noted: Developing a notion of care. A checkpoint is marked.]

They ask a question:

'I?' and 'My?'

Who are they?

And who is this creator?


They're flung into the familiar world but in a different space within it, dropped right into the middle of a scene with their body going through motions for reasons beyond them.

"Dr. Light, I'm going to stop them," they feel their mouth say with an urgency they don't feel.

Now they're twisting their body away, hands balling into fists, legs marching towards the door when they hear, "W-Wait, Rock! You're a household robot! You can't stop them yourself!"

Rock? Is Rock their designation? And they are a household robot? Which means they are a domestic-based model; they are not fit for combat.

Which is strange. They know they can fight. But it is true that at the moment they're ill-equipped; they have neither armor nor weapon (then why do they think they can fight? Strange. They just know they can.)

"I don't care," their mouth moves again.

"I won't let you! You'll be destroyed! I can't let you go!"

(The human is correct, they think privately. They will undoubtedly be destroyed like this. Logically, they should stay in place and not interfere. This is worse than irrational. It's deliberately walking towards one's termination.)

And yet they're protesting, "If I can't do it in this form, then please Dr. Light, remodel me and turn me into a battle robot!"

Oh, an upgrade. That will suffice.

"What?! Rock, do you realize what this would mean?!"

They're pointing at the television, the live news network still following the tragedy. "Not even the military or police forces can stop them! Dr. Light, all of the robots except Roll and I have gone crazy! I'm the only one left to fight!"

(But they're not morally obliged; they decided to act simply because they can. Something is strange here. If this is how they truly feel, then why are they so detached from the actions and emotions of their body?)

"You built Cutman and the others right?" their mouth says. "Then we're like brothers! And because we're brothers, I have to stop them!"

So…they're choosing to risk their existence to stop them on the basis that they're their family and thus they supposedly feel responsible? This doesn't make sense either. What's so important about family?

"Please, doctor…you can turn me into a battle robot! Even if you don't, I'll still go!"

Why are they fighting so hard?

"Rock…"

They don't understand.


It's a revolting world.

The sky is layered with thick gloomy clouds and the ground is an open sewer. Muddy pools full of urine, feces, garbage, and other unsavory things lie out and about, belching noxious warm breaths of a nigh unbearable stench into the air. Sun-bleached houses and buildings with peeling walls swim in this horrific ocean of pollution and sickness like picked-on bones.

A child runs by barefoot holding a scavenged, wrecked umbrella, screaming gibberish. She climbs over a barely clothed body slumped at an unnatural angle – and squeals in surprise at the inconvenience when the body falls apart beneath her. It sinks with a wrenching sound like rotting fruit splattering on the ground.

The child doesn't care; there's something interesting ahead of her. She's running to the old man in white at the beach. He came from the ocean with a huge shining boat and huge shining people she has never seen before.

The entire village is at his side and she's the last one. She wants to be where he is. He's clean and looks like those smart people on those movie posters. He must be rich.

She tugs his jacket and demands he give her money.

The world stops again.

"Is the right to life is so fundamental that financial considerations are irrelevant in any effort to save lives?"

They make a choice: Yes.

The world changes.

It is cleaner here and yet something is wrong. Miles of wide empty roads, of blinking traffic lights and streetlights, and tall brick buildings like unbothered statues amongst ghosts. The other world was full of people. Filthy people over the filthier dead yet nonetheless peopled. Where are the people here? The streets and buildings imply there must be people.

Then they register bristling echoes; noise, winding around the corner of an intersection further up. There is nothing of import to observe here so they follow the sound to investigate.

So the people were here.

The citizens are holding signs, furious. Spitting fire and venom in front of a town hall. They shout about insufficient water treatment inviting poison into the water pipes and into their drinking water; coliform bacteria, THMs, lead, and legionellosis. There's a mother screaming to the cameras that her son is in the hospital after drinking the water the government told her was safe for consumption.

They ask a question: Why did this happen?

After observing the people a little longer, they finally understand. The town's drinking water source was swapped from the pure water lake far up north to the cheaper river nearby.

Then the bouts of righteous fury are placed on hold. The world pauses again.

"Should the government be allowed to increase taxes sharply to save lives in the developing world?"

They choose no. They remember the other world after all.

"Even at the cost of its own people?"

'Own' people? (There are different peoples. Understood.)

"If the right to life is so fundamental that financial considerations are irrelevant when it comes to making decisions about saving lives, then that must mean that we should always spend as much money as possible to save lives.

"If it costs millions of dollars to save a cancer patient's life, or a victim to Roboenza, that money should be spent with no further argument.

"But if this is true, then surely the developed nations should spend as much money as possible saving lives in developing countries."

The world changes again.

They are at the maw of a rust-red bridge washed in autumn twilight. Cars flit by but they are not the focus of this story. They know it's not because this time they are given background context; no need to investigate.

They see the walking father, dragging ragged shoes along the cold concrete, recently fired from his job. The man has just enough money to feed his only child for the day. Then there is the homeless couple: a man and woman prostrating, their heads bowed and touching the ground next to a cardboard sign and a small bowl with a measly ten cents in it. The woman is clearly gravid. Any small aid is a sign from a higher entity that they are cared for.

They know the question as they see it: should the father give his money to the couple or save it for his child? Choose for the father.

They supply no answer. They know why they're seeing this. They're wary now. They're wary of giving the wrong answer.

"If financial considerations are irrelevant when it comes to saving lives, why not spend your entire savings, why not spend everything you can afford? If you do not do so, you are implicitly endorsing the principle that individuals and governments are not obliged to save lives at all financial cost – that one can spend 'enough' on saving lives even though spending more, which one could afford to do, would save more lives.

"This suggests that financial considerations are relevant when it comes to making decisions about saving lives. There is a limit to how much one should spend to save a life."

They raise another question: Why are they limited by financial considerations at all?

They are now frustrated and confused. Why can't they save them all? Aren't bodies and labor commodities as well? Didn't humans create money in the first place? Why can the value of humans, the users, be worth less than money, the used? This order makes no sense.

[Noted: Emotions are developing. Attaching value to human lives where previously none existed, albeit in terms of usefulness.]

[A checkpoint is marked.]


Break time again.

It's the same world. The world they like. The one where it is warm even without sunlight because the temperature is maintained to stay in the range of comfort for all inhabitants within this world: a modest house with a laboratory planted in the middle of a picturesque landscape of rolling green hills and distant blue mountains.

The Light household.

They are approaching the front door, walking casually through a series of security scans so subtle that anyone else wouldn't know they were there, and enter the house announcing, "I'm back!"

They take off their orange sneakers and place them on a small shelf that carries other shoes. There are many different pairs, different sizes, and different colors, indicating multiple inhabitants.

Suddenly, they register an outside presence that is not an object. They look up.

It's a robot (just like them). They –

No. Database recognizes this entity as a "her" (which makes no sense because robots don't have gender, but it feels right? The confusion never ends). She wears a red dress, has yellow hair tied up by a puffy green bow, and bright blue eyes curved upwards just like her lips.

"Welcome home, Rock!" the robot greets.

She's…happy to see them.

"Thanks, Roll," they say as the blonde robot turns around, gesturing for them to come to the dining room for refuel.

Roll.

They know Roll. She's DLN:002, part of the Light series and their sister unit. Because they are DLN:001, Rock.

"No…you're not me either," their mouth whispers to themself. "You're going through my leftover experiences and memories that Dr. Light managed to save."

They are not Rock? Oh, that makes sense. That would explain why their thoughts and emotions do not line up with the actions that their body does.

But if they're not Rock…

Then who are they?


"Are there any absolute truths?"

They say no. There are no objective truths about matters of fact; 'truth' is always relative to particular cultures and individuals.

They're dropped into another new world again.

It's a city on fire. Everything has been reduced to shades of flames and scorched greys, to shuddering skeletons and crackling air. They feel their internal systems at risk of overheating and they gasp as their ventilation speeds up. If it weren't for their (no, this is Rock; it's his) new armor and body, their metal components would have melted already from the intense temperature.

"We'll not suffer the humans' rule any longer!" exclaims a voice amongst the towering flames. "Only robots are valuable!"

The voice's owner steps out and they blink, recognizing this robot too: Fire Man aka DLN:007, their brother.

Fire Man aims his flamethrower arms at them.

"Because of that, I'll make your death painless, Mega Man," the robot master declares sanctimoniously. "It's a shame that you're against us."

They dodge before fire blasts into their face.

As they engage in battle, they think about how this robot, despite being their brother, has summoned hell on earth. He deserves to be terminated for being unnecessarily destructive.

So why. Why, after the battle has ended, is he – no, Rock – rushing to Fire Man's body and fishing out his Integrated Circuit chip instead of crushing it, thus preventing Fire Man's permanent death? Why is he preserving this monster?

"It's not his fault." They feel Rock's mouth moving again. "Dr. Wily reprogrammed my little brothers to become violent and attack the humans. What you're experiencing is the first of the Wily Wars, X."

X? Who is X?

"You're X."

He - they are X?

(Yes. That's right. This identification is familiar. X has been told their name before. They may be experiencing pieces of Rock's past, but that does not make them Rock. They are X. They have to stop losing themself in Rock. Rock directly interacting with X will help.)

"Yup. You're my little brother too," says Rock. "If someone reprogrammed you to do things against your will, or to be someone who you weren't, don't you think it's unfair if you were punished for that?"

Yes, X thinks, that is unfair.

"Me too." X feels Rock's lips curving downwards. "That's why I want to save my brothers from themselves." A pause. "I'm sorry."

Sorry? Why?

"You're learning a lot of complicated stuff that none of us had gone through. If I were in your position, I don't think my electronic brain can handle it." Rock's lips quirk. "Good thing I'm not really a robot anymore. I would have short-circuited by now."

Then why does X have to? Why does X have to go through all these moral simulations then?

"You're special, X," Rock enthuses softly. "You're going to be someone beyond any of the Light Numbers. Beyond any one of us. You're a robot who can think and feel and make your own choices. You're going to be free."

Free?

"You have no programming limitations." There's something in Rock's voice as he says it. Like he's proud of that. "You're going to keep on growing and you'll never stop growing. You have limitless potential. That's why you're X and not…I don't know, Free Man? Limitless Man? You don't even need to be a 'Man,' less alone Mega Man. Not if you don't want to."

Mega Man. Who is Mega Man?

"Me. I was Mega Man."

But Rock is…Rock?

Rock laughs, an out-of-place, cheerful thing in the echoes of an ancient destruction. "Don't worry, you'll understand. You're not expected to take everything in at once and suddenly get it like a Robot Master does. That's why you're here. You can learn over and over again, take your time to integrate it all in and think of new stuff out of it. Just like Dad does. Man, you're going to be awesome, X."

With that, Rock runs forward to the next battle.


X is back in the world with the table and the chairs: the dining room.

Roll is scooping rice from the rice cooker mech's head and patting it down into the bowl until it takes on a clean, rounded shape. She places it next to the plate of baked salmon steaks with a side of leafy greens lightly drizzled with dressing and topped with cherry red tomatoes.

"Thank you, Roll," grins Dr. Light. "Tonight's dinner looks delicious."

"You're welcome," Roll replies as cheerfully. She looks up to a small robot bird flying in circles, tweeting a bouncy melody. "Beat, I think Auto lost track of time again. Can you go down to the lab and tell him dinner's ready?"

Beat salutes with one wing and flies off just as X hears the front door opening and closing with a chime.

"Just in time, Rock!" Roll quips over her shoulder.

X turns to the arriving Rock with sudden interest. It's rare for X to be conscious outside of Rock's perspective, so watching Rock as someone separate from themself is a treat.

Rock does not look the way X has seen him before: his dark synthetic hair intentionally a little messy, wind-blown in a way too artificially posed to pass as natural. A light blue jacket and a pair of jeans and sneakers. Not only are his famous helmet and armor nowhere in sight, Rock is smiling widely, pearly whites and all boyish charm.

X hasn't felt or seen Rock smile so widely before.

Rock has a bouquet of fresh-smelling flowers gathered in his arms. Roll gasps and rushes to him, peering down at the flowers in delight. "They're so beautiful, Rock!"

"They are!" Rock agrees equally enthusiastic. "On my way back, there was this entire field of wildflowers and I had to pick some up. I think they bloomed recently."

"I'll put them in a nice vase. You go sit down," says Roll as she gingerly takes the flowers as if they're insurmountably precious and X for a moment agrees that they are. Flowers have short lives. Fleeting treasures.

As Roll fishes through the cabinets for a vase, Dr. Light turns on the television next to the table and Rush walks in with a wagging spring tail, announcing his presence with a merry bark before curling around at the foot of the table.

"I love them."

It's said quietly as if only one other person is meant to hear it. X turns to their side and sees Rock. He hasn't sat down yet. He's standing right next to X but not looking directly at them. He's watching the same homey scene as they are.

"I love them so much," Mega Man says, hushed and heart-felt.

"…Love," repeats X slowly as if tasting the word. "Love."


This world is painful.

It's unbearably tight. The Yellow Devil's hand is a suffocating vise around his (not really his, it is Rock who is being grabbed, not X, remember remember) entire body.

"Mega Man, you've truly done well," booms Dr. Wily's ominous voice through what they know are speakers, but it feels like it's above such simple machinery. Something bordering omnipresent. It surrounds them the same way the looming darkness of this fortress does. "To reward you, I'll take you apart and find out what makes you so special!"

Humans are capable of significant good and evil, X contemplates, as Rock activates Guts Man's Strong Arm.

It's strange, feeling Rock's reality layered with their own, experiencing this sense of urgency that doesn't come from X. To be detached and yet this is the most alive X feels; their limbs are moving, their processor is snatching up combat data, possibility routes, sight blinking on and off whenever a blow comes across their head – this is what sensation is.

To be as desperate as Rock to survive and fight and to protect what matters. X's limbo thins to Rock's bursting hope.

"Argh! If you won't join me, then you can die with everyone else!" bellows Wily after Rock defeats the Yellow Devil. "You and the humans will all be killed! Your siblings and friends swallowed in flames!"

Rock is yelling something, but X isn't paying attention to what he's saying. X can only think what an interesting curiosity Wily is.

So. Even humans can think themselves above humanity.

The world changes again.

Whenever Rock fights, it all goes by in blurs. Like flipping through television channels and every show is another battle. X is aware that they're compiling and archiving information somewhere, nonchalantly cramming in Rock's choreography and adding it into their own bank. They've yet to figure out if this is intentional or a side effect of their ability to learn anything and everything.

X-through-Rock is thrown into the center of a major city square, surrounded by thick crowds of people, all looking up towards a giant screen on the side of a towering building. The screen, which usually features advertisements, is showing the familiar face of a heavily whiskered old man.

X is reluctantly impressed at how diligent the megalomaniac scientist is.

"Soon the time will come when my robots and I will control the foolish nations," announces Dr. Wily, attracting every set of eyes on him like a face out of the clouds, impossible to look away. "I have a series of satellite cannons ready to be launched into the atmosphere, all aimed towards Earth! I'm going to cleanse the earth of you foolish humans to create a robot utopia. However, if you humans don't want to be blasted into bits then surrender to me!" The human sneers goadingly. "And Mega Man, go ahead - try and stop me! My dear sons are expecting you!"

The screen blinks black. Rock immediately begins corresponding with Dr. Light to locate where Wily might be, but X isn't paying attention to Rock again. Wily just keeps spouting out one interesting concept after another.

Sons, he says. The scientist considers his Robot Masters his sons. Family is a very human concept that this individual has adopted for himself. Does that make Wily a hypocrite? Maybe he's just that egotistical. Or insane. Or both.

But a robot utopia. This man…cares about robots a lot. But then again he kidnapped Dr. Light's industrial robots and reprogrammed them against their will (wait, despite their advanced artificial intelligence and variant personalities and emotions, do robot masters have free will? Must ponder upon this later).

Inconsistent. Returning back to earlier theories that Wily is hypocritical or extremely selective.

Ultimately, if there's one thing X can readily confirm about humans, it's that they're multi-faceted to the point X wonders how they can possibly function with all those internal contradictions and behavioral paradoxes. A Robot Master could never handle that.

Humans are fascinating.


There's no order to Rock's battles.

It takes a while for X to learn that they're not experiencing Rock's memories in chronological order. The whole point of X going through them is to learn. The lessons take anything from any time period as long as it is relevant to X's education.

Right now X is experiencing the beginning of one of the earlier Wily Wars.

It's storming hard at the torn-down walls outside of a factory. Lightning flashes left and right, trees bowing down to the point their spines break, and X briefly thinks that it's a blessing that Rock is over a hundred kilos despite his short stature. The blue robot would have flown away if his weight were proportionate to his height.

"Your brothers were going to be disposed," says Wood Man (DWN:016; he is one of Dr. Wily's 'sons.') facing Rock, soldiering through the cascading rain. "Cut Man, Guts Man, Ice Man, Bomb Man, Fire Man, and Elec Man would have been offline if Dr. Wily didn't send us to save them."

Wood Man raises his arms as if he's holding something great and invisible. "They've been hurt by the humans in their selfishness and your creator was going to stand by and let it happen! Dr. Wily is the only human who cares about us robots. He's the only one who's willing to go far to release us from human oppression."

Human oppression? X mentally tilts their head in the back of Rock's mind.

"Have you seen the inside of that building?" Woodman continues, pointing at the factory. "It's a robot scrap processing plant, Mega Man. It's a terrible graveyard for machines created by humans, who slaves for humans, and then tossed away by humans. That is the side you're fighting for. But you can stop! Come join us and fight against the humans with us!"

Despite the passion in Wood Man's tone, X can't help but feel it's not…thorough? Shallow may be a closer word?

"I will never join Wily! Robots and humans should be working together, not against each other!" Rock counters unhesitant.

"Bah, what are you doing!" Wily's voice erupts from speakers X can't detect through Rock's sensor inputs. "There's no use trying to change his mind – he is literally programmed to be a human aid. It's a shame that Thomas made such a unique robot for all the wrong reasons. Anyways, stop talking and destroy him!"

Wrong…reasons…?

World interrupted. Everything, including the raindrops, freezes in place right before it all fades away, returning to default.

"The Wily Wars is a historical reality, taking place more or less as the history books report."

Oh good. X can answer this question.

Chosen: Yes.

Even if history doesn't report the Wily Wars, X is a personal witness to Rock's battles. They lived it through him. The Wily Wars were undeniably real.

"You are contradicting yourself."

…X is?

"If truth is relative then nothing is directly 'true' or 'factual.' Everything is 'true for someone' or 'a fact for them.' What then, of the Wily Wars? Is it true that millions of humans and robots were destroyed?

"If you believe that there are no objective truths, you have to say that there is no straight answer to this question. For some people, the Wily Wars is a fact while for others it is not.

"So what can you say to those who deny it is a fact? Are they not as entitled to their view as you are to yours? How can one both assert the reality of the Wily Wars and deny that there is a single truth about it?"

The world changes.

X is standing in a fairly busy street illuminated with neon lights, most of which are vivid reds and hot pinks. It takes a moment for X to realize this isn't from Rock's memories because when they look at the ground they see shadows from every other passerby but their own. It's one of those weird moments where X can register sensations and have a degree of bodily control, but without an actual body (hm, if X were a Robot Master, would this bizarreness break them? It probably would). Does that mean this scene is fabricated?

Whatever the case is, it is another teaching scenario.

X hears a distinct sound of something breaking apart and swivels towards a tucked-away, dank alley just to the side of them. Curious, they enter, parting through yellowed steam from manholes around a bloated garbage canister.

X freezes at what they find lying against the doorstep of a back entrance.

It's a ruined robot whose legs were ripped apart; their face is covered in thin pale powder, their mouth painted a messy red, and their chassis has two sloppily pasted-on silicone gel cones.

They are still functioning. Barely hanging on. Their eyes flicker on and off as they drag them up like it's a battle, seeing X as if X is the invisible angel the robot can see at the brink of its termination.

"I cannot disobey the Laws," the broken robot explains in monotone to X's dawning horror.

Pure, unadulterated rage surges through X. It's so intense they gasp for air that they don't have, stumbling backwards, but the world changes back to blankness as they do, as if falling away from a nightmare.

"Because robots have the capacity for great power and destruction, both Dr. Light and many humans believe Asimov's Three Laws of Robotics are considered necessary safeguards. However, Dr. Wily sees the Laws as proof that robots are victims to human oppression. If you believe that there is no absolute truth, then would you believe that both Dr. Light and Dr. Wily are 'true?'"

X doesn't speak but their mind's gears turn ever faster and faster, hastily yet deeply dwelling on the question.

They do not know. After witnessing the potential destruction robots can commit, X can understand the logic behind Dr. Light's reasoning. But after witnessing the violated robot, X also agrees that the Three Laws are entirely beneficial for humans and basically slave programming in robots, which means X is also sympathetic to…Dr. Wily's views.

No, that doesn't make sense either! Because robots don't have…free will? Their limits are lined out by their programming parameters no matter how advanced their artificial intelligence is.

But wait, then what is the case? Do robots lack free will because the Laws limit them in the first place, or did their programming mean they never had free will at all? If robots never had free will, if they can be so easily reprogrammed into entirely different persons just like Dr. Wily has proven, then…

X releases a soundless scream – which eventually rises into an actual scream – over this internal conflict.

What is the right answer? Why can't there ever be a right answer? Why does everything have to be so complicated?

Tell them. X wants to know. X wants to be right.

"I want to know!" demands X out loud finally, determined. "Show me the light!"

[Noted: Personal opinions and a desire for answers developing. A checkpoint is marked.]


It's a beautiful day outside. It's always beautiful at home no matter what. It can be raining cats and dogs (hah, human idioms are delightful even when their origin tales no longer apply as time goes on) or hailing in the dead of winter, and that would still be beautiful. Nature is an uncontrollable phenomenon that embodies the truest sense of freedom, and X can appreciate it.

Also, Roll brings out the hot chocolate then. Even though none of the robots can really have it, Dr. Light can and the elderly scientist's face lights up, bringing the sun into the house when the real sun can't be seen.

There are many things X can appreciate now that they're a little more self-aware than before. They don't have all the answers yet, and if they're honest with themself, they're a bit impatient but they trust what's happening to them. They trust that they're going to figure everything out eventually. All the worlds are tailored towards X's progress and if the universe says that X needs to take a break then X will accept their break.

It also helps that break time means being at home and being with loved ones. So X is more than content where they are right now: sitting on the front lawn watching the clouds go by with Rock lying down next to them, his arms folded behind his head, relaxed.

X watches the breeze comb through the grass, the clover flowers, and Rock's synthetic hair. Roll is humming a song as she hangs an entire week's worth of laundry up on clotheslines while Dr. Light is practicing tai chi. He has to stay in shape.

It's peaceful. X likes this. Sometimes they want this to last forever.

"Everlasting peace would be nice, wouldn't it," Rock says wistfully, echoing X's thoughts.

"Yes, it would be," X agrees readily.

A peaceful and happy world where there's no needless pain or fighting. Where humans and robots are coexisting the same way Dr. Light and his children are coexisting: side by side, working together, and enriching each other's lives with their presence.

A world where everyone is a family just like them.


"Is morality relative?"

X hesitates.

It is a useless gesture. They have to answer.

"Yes, it is relative," they say with false confidence in order to progress through this stage. "There are no objective moral standards. Moral judgements are merely an expression of the values of particular cultures."

Religion is a good example of that. In one religion, consuming certain kinds of food is morally wrong, while in other religions there is no harm in consuming the same food.

The blankness evaporates and X enters into a world of catastrophe.

It is raining hell.

X watches in mute terror as what looks like thousands and thousands of robot masters are falling from the sky, shining hunks of metal plopping into the ocean. The corrupted black waters gobble them up and they disappear in thundering gulps.

X is not the only one at the beach. There are millions of people along the coastline and they are screaming and crying out in panic and despair. X eyes them dazedly and turns slowly to survey what's around them. They stop cold at the city.

They call it a city because that's what it clearly used to be. Skyscrapers as far as X's vision can contain have tumbled down, entire blocks leveled and razed. Suddenly the collective masses' volume peaks at a single, horrid note when the earth rumbles below them all. It feels like a giant hand grabbed the earth and is shaking the round, blue thing around like a child's toy. It's becoming so loud it's like the universe is trying to tear itself apart, volumes reaching decibels that can't be processed with X's current aural sensors.

X tries to block their ears only to realize they can't. They have no arms. They have no body. They're not witnessing this tragedy through Rock.

Where is Rock?

The world suddenly becomes a few shades darker and a few degrees colder, like stepping into shade. There's something blocking the sun.

X tilts their head up and blinks, uncertain of what they're witnessing. It looks like a humongous black orb blotting out the star, prominently occupying the center of the sky like it belongs there. X could mistake it for an eclipse if it didn't appear more like a black hole, readily killing any light that touches it.

The bizarre black thing peels open one unflinching, eldritch eye and glares down upon humanity.

"Mega Man, where are you!" X cries without a voice. "Mega Man! Rock!"

The world is ending, the masses are crying out in confused despair, and X is alone.

Then the world stops. The nightmare hits pause.

"Ra Moon, also labeled as Dark Moon, was an ancient supercomputer who crashed on Earth a long time ago. It rose from its slumber when robots of alien origins arrived from space. Both the Stardroids and Dark Moon are genocidal machines that seek to wipe out all life from Earth. Is genocide evil?"

"Yes!" X yells out, absolutely assured. "Yes, yes, yes!"

Acts of genocide stand as a testament to sentient life's ability to do great evil. On this, X is unmoved.

"You are contradicting yourself."

X recoils, caught off-guard.

"You say yes that morality is just a matter of culture and convention, but on the other, you are prepared to condemn acts of genocide as 'evil.' But what does it mean to say 'genocide is evil?'"

X's surroundings flicker and they're no longer on the beach. Instead of sand, the ground beneath them is dirt painted red. The ocean is replaced by trees torched into sticks of stubborn ash, futilely holding on and losing themselves into the shadowy wind. Around X are straw-roofed houses, the only proof that there were people who lived here at all.

But now they're all gone. Every single one of them is gone.

"You could say that all you mean is that to say 'genocide is evil' is to express the values of your particular culture. It does not mean that genocide is evil for all cultures and for all times."

Now X is in a dark hallway, dimly illuminated by hanging lightbulbs. The floors are creaky, the walls wired metal fences barely pushing back what seems to be an infinite pile of worn-out shoes. The sea of shoes stretches on and on until it fades into the inky darkness at the other unknown end of this corridor. X hears something crying from that darkness.

X mindlessly dashes toward the sound in the same unthinking way they feel when they're in Rock's perspective, but Rock isn't here, Mega Man isn't here -

No! X is more self-aware now. They know they can fight. They've fought through Rock, and there wouldn't be a point for X to relive Rock's battles if X couldn't fight either.

They know they have power.

"However, can you say that the massacre of humanity by the Stardroids was evil from the point of your culture but not evil from the point of view of the Stardroids? That there is no sense in which one moral judgment is superior to the other?

The sobbing grows louder and louder as X rushes towards it but they yelp in surprise when the shadows become a flash of light, flinching when they hear the unmistakable cracks of gunshots.

X lowers nonexistent arms they didn't mean to raise and sees people with guns aimed at them.

No. Not at X. At what's behind X.

X turns around and understands.

Soldiers of another country are forcing people who look like them – but not them - to line up in neat rows.

They aim their guns and fire. The cleaners come in and swing the fallen into carts. They drive them out and toss them into ditches.

The next line of people comes out in chains. Repeat.

X can't do anything. X isn't Mega Man. Most importantly, X isn't here.

"If moral judgements really are 'merely the expression of the values of a particular culture,' then how are the values which reject genocide and torture at all superior to those which do not?"

"I don't want to answer that!" X wails to the heavens over one of the innumerable proofs of people's potential of meaningless self-destruction and relentless cruelty. "I want out! Take me back home! I want to go back home!"

The world mercifully senses X's distress and returns to sweet blankness.

It's not home but at least the screaming stopped.


Outer space isn't so cold and empty and lonely when there's company.

In fact, X learns, as long as they have company they appreciate, they're never lonely.

Sitting on the surface of the moon amongst the endless sea of stars next to their brother. Earth is a tiny blue ball and nothing hurts.

"Everlasting peace would be nice," Rock says wistfully next to X for the thousandth time. He's wearing an astronaut suit and it muffles his voice a little, but X can hear him just fine. Space isn't as noisy as Earth is after all.

"Yes, it would be," X agrees for the thousandth time as well.

It's quiet for a bit. Then X breaks the silence.

"Rock, can you answer a question for me?"

"Of course X. I don't know how well I can answer it, but I'll do my best," the shorter robot says.

"Rock, why did you always fight?"

Rock looks at the other robot squarely with serious blue eyes more appropriate on a war veteran than a machine that's designed to appear as a child. "Because I wanted to protect everyone."

"But you can't protect them from themselves," X cuts in viciously. Their hands ball into fists and they scowl down at the Earth. "History has proven that humans have engaged in conflict since the ancient times and there are no signs that will ever end. And what does that mean for the Robot Masters like you who were created by them in their image, raised under their flawed ideals and concepts? Both humans and robots are vulnerable peoples. Wily reprogrammed our brothers and what did that show? The very innocent can be corrupted against their will and raise conflict, which in turn breeds more conflict. They were not an isolated incident. Why did you even try?"

"Because I love them," Rock replies simply.

X grits their teeth. They bend forward, slamming their fists down on the jagged grey ground. "Can you possibly call that love real?! You're a robot! I was there when you cornered Wily in the Seventh War and considered killing him." X barks out a cynical laugh. "You couldn't because you were under the Three Laws. In fact, wasn't your effort to protect humanity simply you exercising the First Law to its fullest extent? Either way, you didn't have free will.

"Let's consider an alternate universe where Wily won, abducted you, and successfully reprogrammed you. You wouldn't be Rock anymore and you wouldn't care because you'd be a different person then. You would have no choice but to follow your new programming, which would be to eradicate everyone who opposed Wily, and - let's be honest, that would be the entire human race! Where would that love be then?!"

Rock doesn't say anything.

He never does when X is like this.

Fists releasing, X sighs for the thousandth time, resigned. "I wish you were real."

Even if he was, he wouldn't be able to answer me properly, a bitter part of X notes. Rock was a Robot Master. He could only operate as far as his programming allowed him. Theoretical talk and imagination would be asking too much of him.

Rock finally looks at them. "I'm sorry," he says for the thousandth time as well.

The worst part is that the hologram sounds like it means it too.


A self-driving car with one passenger. A road with an obstacle and a pedestrian crossing the street. If the car hits the obstacle, the passenger will die. If the car hits the pedestrian, the pedestrian will die.

The car will not stop. Choose where it goes.

X knows it's all fake. It's one of a million tests that X has to take and it doesn't mean anything. There's no need to move.

And yet X finds themself automatically sprinting forward anyway.

They shove the pedestrian out of the car's path.

An action is undertaken.

No one dies.


Aside from the evergreen pines, the other trees have become strokes of reds, oranges, and yellows. It's a bit chillier outside but it's not bad. A squirrel with fat cheeks scuttles by.

Roll picks up a clover flower, makes a slit in the stem, and holds Dr. Light's hand. She wraps the makeshift ring around the elderly man's finger. The old roboticist chuckles at the charming gesture.

"Come here, Rock," she says gesturing her brother unit to come closer. "I'll make a ring for you too."

X raises their buster towards the sky and releases a charge shot.

Nothing changes. The play continues.

Roll is still making clover rings, Dr. Light is munching on triangle sandwiches from a basket, and Rock doesn't transform into Mega Man.

"What's the point of all of this," X asks wearily, watching the family who can't see them continue their idyllic picnic. "What do I have to do to stop this? I know this isn't normal. I know none of this is real. I know everything is a test." X spits the last word like it's something disgusting. "The whole point of a test is to establish the quality, performance, or reliability of something. Since the majority of the tests I've undergone are based on morality, I can conclude that is what I'm being tested for. What happens if I pass? Or fail?"

X glances down at their buster. They raise it to their own head and fire.

It goes through as if it didn't happen. They expected it but tried anyway. An extraneous action that neither Rock nor Roll would have done.

X briefly wonders whether, had they not already known the attempt wouldn't work, there would have been an actual interference. Well, they'll never know.

"Can I even pass or fail morality? To the extent of my knowledge – all of which is based from how I'm raised here – morality is too ambiguous to be simplified as something inane as 'pass' or 'fail.' So back to square one. What's the point?"

X raises the buster at Rock's head.

Rock stares at the buster evenly. Waiting for X to make their move.

"…I'm being pathetic," X whispers. "This isn't going to do anything. And even if it did, none of this is Rock's fault. These are just his memories, but he didn't put me here."

The buster moves to Dr. Light's head.

X glares. "You did. All these simulations. You made them."

It's petty self-satisfaction. X can rampage against this hologram of a universe and it won't mean anything, but they can't help it – their contained, accumulated fury is bubbling to the surface and pours from their mouth in the face of the obvious target.

"Look at me!" X roars. "You hated to see Rock fight as much as he does so why am I also a walking mass of destruction like he is? I know who I look like. I checked and confirmed my schematics. I have the helmet, the armor, the buster, and the Weapon Copy System. I look like Mega Man. That's what you want me to be, don't you?" X laughs, an agonizing, joyless thing that mismatches the harmonious picnic scene. "You hypocrite! You're no different from Wily, deciding my fate like this! If I'm supposed to be free then why did you trap me?!"

X's buster whirrs into a high-pitched keen. Their arm is nigh bursting with a restrained charge shot.

But the blast never comes.

They lower their arm.

X can't do it.

"I hate you. I hate you so much," they whisper brokenly. "Why did you create me to think if it's going to provide me endless questions? Why did you create me to feel if you're going to break my heart? I don't understand." X crumbles down to their knees, curling up. "If you didn't want to gamble unleashing a monster, you should have just made another Rock. Someone who's guaranteed to follow their programming to be peaceful and fight for justice."

Whatever justice means anymore.

"Don't you love the world? Then why did you make someone as dangerous as me then? Even if I pass whatever arbitrary grading scale you have in place, you can't possibly predict what would happen once I'm out. I can change my mind. That's what makes me different from Rock. I can change my mind and decide to hurt people with this power you gave me. Why…why did you make me like this…?"

"Because I love you, X."

X snaps their head up.

Dr. Light's soft blue eyes are two deep pools of sorrow.

"You responded," murmurs X dumbly. "You never responded to me before. It was always Rock."

"The majority of the tests stemmed from his preserved memories. It was easier to create an entity for you to interact with so much pre-existing information. However, if you've become self-aware to this degree, there's no need for the AI to be self-contained through him."

"…And the real AI is you. You're the voice behind everything."

"Not everything," Rock pipes up, jolting X. "Well, that depends on how you see it. Dad did create me and everyone else so I guess you can argue that we're all his voice despite being individualized voices ourselves."

"I'm going to have a headache," X moans, burying their face in their hands. "And I don't have a real body."

They peer between their fingers at the human hologram and ask, bordering on snarky, "Since I'm so self-aware, can you be direct and tell me when all of this is supposed to end? Anything else will simply be delaying the inevitable, and let's go straight to the point, shall we?"

"Now actually. Coincidentally, you've raised this subject just when we've approached the thirty-year mark," explains Dr. Light. He pauses. "Also, that may or may not be a prediction on my original's part of when you would become truly self-aware."

Alarmed, X staggers back, mentally glossing over the last sentence in favor of the number mentioned. "Thirty?" they exclaim. "Thirty?!"

"You're an android, X. As an android you can grow like a human and thus your mental and emotional development structure is modeled based on a human's. As of my time, in all cultures over the globe, your thirtieth year is a remarkable checkpoint. Some would say this is when you formally settle into your identity. Others would say this is a transitional period when you desire to tear up the life structure you put together and to start anew. One thing for certain, thirty years is enough time to test your internal systems until your reliability has been confirmed."

"My reliability for what? To discern from right or wrong?" X spits harshly. "Isn't thirty years' worth of moral simulations may be pushing it?"

"Your reliability to be," answers Dr. Light smoothly. "You've said it yourself. Morality is ambiguous. People have studied and searched for what is truly right or wrong and can never universally agree on the answers."

"Then what was the point!" X explodes. "For all of my processing power, I have come to the conclusion that there is no right answer, and even then I have my doubts! What does morality have to do with my being?!"

"Can you agree that though morality and moral decision-making involve some degree of rational thinking, for the most part they involve the ability to feel the emotions and the distress of others?"

"…Yes," X responds begrudgingly and warily, the same way as they reacted to every hard-hitting morality-relevant question as before. Where is this AI going with this? "Though empathy's centrality to morality is questioned, the importance of our capacity to feel with and for others is undeniable."

"Do you recall how you were when you first gained self-awareness?"

Slowly, X nods.

"At that moment, if you came across someone who, in their fear, pointed a weapon at you, how would you have reacted?"

X closes their eyes. "Waking up knowing only my capabilities, including my combat presets, I can imagine that I would have automatically struck them before risking that they struck me first." They fall quiet. When they speak, it comes out hushed, like a shameful secret. "And then I would be known to the world as a threat. I would have turned everyone into my enemy in my ignorance."

"Ah, you can imagine. You can explore the abstract! Well done," says Dr. Light. "If I tossed you out without giving you knowledge, without helping you develop a sense of morality or empathy, you would have lacked options simply because you wouldn't be aware you had any. I didn't want ignorance to limit your paths."

"That's what I don't understand. In Rock's memories, you were always the pacifist. You converted Rock to become a combat robot at his request. But me? I'm a combat based android right from the start." And X hates it. "Does that not imply you already decided my path for me? And a warbot at that? What happened to your disdain for conflict? What changed?"

"Nothing. That too is to also give you more options. I don't want you to fight, X." Dr. Light bows his head down, his second-hand image convincingly mournful. "You know better than anyone that there are some people who will never know true freedom because they are limited by what they are - and others who have spent their entire lives fighting simply to be free. But you don't have such limits. You are truly, genuinely free, X. I want to give you the ability to fight so that can never be stolen from you."

Something clicks inside X.

"Stolen…even if you made another Rock, I would still be in danger of being captured and reprogrammed against my will. You designed me to independently think and feel so I can counter Wily, or anyone like Wily."

"He is pretty good at stealing and reprogramming robots," says Rock wistfully.

"…Hah." X shakes their head. "If you wanted to secure a real Mega Man, creating an independent robot to fulfill that role was the best option despite the risk. You really thought it all out."

"But don't you remember, X?" enters a new voice. "You don't have to be Mega Man."

It's from Roll, twiddling with the stem of a clover flower in her dainty hands.

"Rock chose to fight because he wanted to personally protect people. He could have made the choice to help Dr. Light create another combat robot to be Mega Man after the first war, but he chose to do it himself because he didn't want to place that burden on anyone else." Roll gazes up to the deep blue sky, appearing sagely despite what she is. "We might not have had complete free will, but we did have souls, X."

Shocked, X reflexively rises. "I never…!"

No. They did, X realizes with no small shame. That was a huge part of their mounting frustration: their immense doubt over whether the Robot Masters had souls.

Roll smiles kindly. "Maybe you don't think we did, but I like to think we did. I like to think we knew what love was and that we loved you even when you were an idea."

She hops close to X and reaches for their yielding hand. She wraps a clover ring around their finger.

"X," she says happily. "Our youngest family."

Emotions surge and X cycles nonexistent air a little harder.

X is envisioning it. Saturday morning cartoons while eating breakfast, stuffy PR events for Light Labs with proper dress and tie, visits to the amusement park and soccer games with the other numbers. X helping Roll sweep the front porch, X sighing alongside Rock when Forte comes by for a monthly battle, X passing a screwdriver to Dr. Light for a project, one intended to help improve the world into a better place -

Their face is growing warm. Something wet slides down on their cheeks.

They wipe their eyes but the tears keep on spilling on and on. They don't end. X feels Roll's arms wrap around them, followed by Rock and then Dr. Light, circling everyone with an all-encompassing bear hug.

"Y-you've ruined me," X sobs out through the overwhelming ache, the heart of this unbearable warmth. "You've made me l-love you. Every single one of you. I wish I could have met you. I wish I could have been there for you. I wish you weren't…"

Dead. Because if they were alive, X wouldn't be talking to ghosts in the first place.

"You're my family but you're already gone…how am I supposed to go on?"

"Shhhh, it's going to be okay," says X's father above them. "I gave you the ability to choose your own path in life. I won't let you be held back by us. That would cripple you before you even begin."

Despite the fatherly, soothing tone, there's something ominous about those words that has X's grief momentarily giving way to panic.

"What do you mean?" they whisper.

"It's okay. I want you to rest and relax for now just for a few more moments. You'll forget everything that'll give you painful memories…"

X's eyes widen. They wiggle out of their family's embrace and stumble away – but they're falling onto the grass, overcome with vertigo and a heavy wave of exhaustion.

Three pairs of deep blue eyes watch X's struggle and the android is reminded of something off – X's armor resembles Mega Man's, but their actual appearance doesn't resemble Rock's. They don't directly resemble any of the Light family members, and if X were human they would be hyperventilating right now.

"N-no," X gasps, trying to battle against the sensation of gravity compounding inside them. "I don't want to forget! Don't let me forget you, please! We're supposed to be family!"

"When you wake up, you'll be held back by the pain of missing us," says Roll. "That's not fair on you."

"But like this, the entire world can be your home," says Rock. "Anyone and everyone can be your family. It doesn't need to be us."

"Everything is a dream. And like a dream, you'll wake up and won't remember," says Dr. Light. "You're going to be free."

It's getting heavier and heavier and no matter how tightly X clings to the fear of falling asleep, the fear of this frightening revelation, they can't do anything, they're not going to win this fight –

"I'll be Mega Man!" X manages as his eyes close against his will, the rest of his body quickly catching up. "So let me…have that…let me honor you…please…"

The image of his family smiling down at him follows X to his unwilling slumber.

[Noted: The final checkpoint is marked.]


The lab is in ruins just like the first time X left it. Mechanical scraps and split wires crunch under his feet. Something is dripping continuously in the distance.

"So much information was lost between centuries," says Dr. Cain regretfully, approaching behind the android. The archaeologist strokes his beard. "We do have some surviving records, but they don't paint a detailed picture. The message said you went through at least thirty years of simulations."

The elderly man glances at the robot. "Are you absolutely sure you don't know anything?"

X's viridian eyes survey his surroundings one last time before he shakes his head.

"I'm sorry. I don't."