The light at the center of the wheel slowly expanded until it filled the entire hub. The optic for the Axiom's autopilot was of the deepest magenta, and the flicker of sparks and the sound of hidden gears turning and clicking meticulously gave it a threatening look, as if it were always thinking, always calculating. McCrea recalled a time when his "first mate" had intimidated him; now, as he looked into the autopilot's "eye," he felt nothing but resolve and determination.

Like his initial fear, his blind faith and trust had faded away over time. The latter had been, quite frankly, shattered sometime this last month, when AUTO had successfully orchestrated a mutiny and tried to stop humanity from ever initiating Operation: Re-colonize.

There was no more trust between them. McCrea wasn't sure if there was ever a time AUTO had trusted him, or if there had ever been any precedent for the machine to trust. After all, McCrea realized, he had never given him a real reason to trust him – he, the captain, who had been wholly dependent on the autopilot system his entire career.

This was the real captain of the Axiom.

Could AUTO even be capable of something as complex and illogical as trust? Trust was a human concept – sure, it could be broken down into mathematical equations and rationalizations, but the fundamental reasoning for trust – the motivation behind it – was wholly human. It was something based on faith. On friendship. Did the autopilot consider the human captain his "friend?"

As these questions raced through McCrea's mind, AUTO came back online; the moment he comprehended his surroundings, he launched backwards, away from him. He raised himself so that his red optic was now level with McCrea's face.

Their last encounter had been right there, on that very spot – they had battled one another for control of the ship. AUTO had a directive to follow – A113, which specifically instructed him to keep the humans of the Axiom in space and never return to Earth. An executive order, from the CEO of BNL himself. A 700-year-old directive, but his current and highest priority nonetheless. The current captain of the Axiom, however, had been determined that they return to Earth, now that a positive sample of ongoing photosynthesis had been retrieved by the annual reconnaissance mission. The captain had been following a slightly older directive than AUTO's own – but Operation: Re-colonize had been overridden. A113 was the highest prerogative on that ship; it had to be followed at all costs.

But as the autopilot, he was only second-in-command aboard the Axiom. So, when left with no other choice, he had mutinied against McCrea, because his directive was above the captain. It was logic, pure and simple.

However – and this was the only thing that kept AUTO from resuming total control of the ship (this, and a few other, minuscule technicalities) – before the captain had deactivated him during their brawl on the bridge, he had said five words that AUTO had never heard any of the other captains say to him before; five words that changed everything and nothing at the same time. Five words that, against all logic, now perplexed him – a first in his seven hundred years of activation.

"You are relieved of duty."

He was, by right, no longer the autopilot of the Axiom, though still hardwired into every part of the ship's systems (well, almost). The situation was one that caused AUTO a great deal of thought – and he was processing at an average speed of 18.5 terabytes per minute.

"Hello, AUTO," said McCrea.

"Captain," acknowledged AUTO.

"Look," began McCrea. "Over these last few weeks I've given what happened some thought, and I –"

"Sir," interrupted AUTO. "I am no longer connected to the Axiom's flight and navigational systems."

"Yeah, about that," said McCrea, grinning weakly. "I had one of our maintenance 'bots disconnect you from those areas of the ship after we landed. I wanted to make sure I could talk to you first, if I reactivated you."

"Landed …"

"We're back on Earth, AUTO," said McCrea. "We've been here for two weeks now, and everyone's doing fine. Life is sustainable here."

"A113 –"

"A113 is irrelevant now, AUTO. We're back, and we're here to stay. The fact that we returned should negate that directive."

AUTO remained silent for a minute or so. McCrea watched the sparks in the back of his optic flare and flicker as the autopilot processed what he had just said.

"I must follow my directive –" he tried, whirring about in an almost agitated manner, but remaining stationary.

"You have no directive, AUTO," said McCrea firmly. "That's actually why I reactivated you. I wanted to offer you a new one."

AUTO said nothing. McCrea imagined his words had just hit him with a force the machine could never have anticipated.

"If it helps, I can order you to accept your new mission," McCrea said with a shrug. "I am the captain of the Axiom. You will have to obey, whether you like it or not. But I'd prefer it if you … gave the matter some thought. Put logic and orders aside here for a moment and make a decision."

"I do not understand, captain."

"Everyone here is evolving, AUTO!" McCrea exclaimed. He gestured out the window, where the sandstorm still raged. "You can't see them right now, but down there is a colony that is thriving and growing. Humans are learning to walk again, and robots – the robots are learning to become human."

"I do not understand, captain –"

"Maybe human is the wrong word," said McCrea hastily. "I mean, look at us – we're lazy and sloppy. You and the robots down there are the complete opposite of that – you're efficient and progressive. But the robots down below are developing personalities, AUTO – they're becoming human; they are learning to live and love and go beyond their own individual directives for personal ones. It's still a work in progress, but the best example of this would have to be WALL-E and EVE."

"The Waste-Allocation Load Lifter Earth-Class survived?" AUTO asked.

"Yes," said McCrea. "And it's even better than that, but you'll have to see for yourself when the storm is over …"

"I do not understand where you are going with this, captain."

"Well then, here it is," said McCrea, now looking expectant. AUTO suddenly registered that the captain was standing before him – standing, and not unsteadily either. Had it really been only two weeks since he had been deactivated?

"I want to offer you a choice, AUTO. You're free to refuse and go back to your old directive – I may have to shut you off and leave you off next time, but we can get to that later. My point is I won't force you to accept my offer."

"Choice does not compute," said AUTO. How could it? Choice was a mathematical equation with no predictable, rational outcome – too many variables based on time, situation and the mindset of the individual in question. In his seven hundred years he had grown much more efficient at "predicting" the various captains' reactions to given stimuli, but this … this was different. Now he suddenly found himself unable to predict his own response, as if he did not know himself. As if he suddenly had a personality …

The thought disgusted him. Robots weren't supposed to have personalities. They were programmed to simulate basic human emotions, for the humans' sake – they themselves had no need for them. Any robot that had a "personality" was something fit for the Repair Ward. Was he, AUTO, becoming defective?

"Whether it computes or not, AUTO, here it is, and it's yours," said McCrea. "I'm offering you a place among us, a chance to evolve with us. You'll be free to do what you want, alongside your directive. You won't have to worry; I'll give you a new directive – one that should take at least a couple of weeks, if not months, to fulfill; after which you'll get a new one, and a new one after that … You can continue to do what you're supposed to do, if that's what you want … but you'll also be free to do what you were meant to do. Does that make any sense?"

"Negative."

McCrea chuckled. "I thought so … Well, I'll let you think about it. When you've made up your mind, let me know, and we can begin."