Consequentialism

Funding. Money. Resources.

That's what it always came down to. For almost two hundred years, John Bishop had fought as hard for the monetary means to keep his primary mission afloat as he did in staving off the looming threat of an alien invasion.

He silently cursed the hours spent in budgetary meetings, in secret finance committee sessions, in glad-handing politicians and kowtowing to whichever President sat in the Oval Office.

Bishop had lost count of how many tongue-lashings he had endured from people he considered his inferiors, merely because those men and women controlled the purse strings. His black-ops Earth Protection Force might not show up as a government budgetary line item, but that did not mean he had carte blanche as far as spending went.

Fortunately, Agent John Bishop was imaginative and resourceful.

There were a few private cash flow sources about which the government lackeys charged with overseeing his operations knew nothing. Bishop was quite good at supplying wealthy individuals with certain exotic items, for which he was paid handsomely.

The Triceraton invasion had been an embarrassment, but it had also proven lucrative in one respect: he had forged a connection with the Federation for a short time. That had meant access to advanced scientific and military equipment.

Bishop had taken advantage of that by acquiring everything he could get his hands on, whether his Federation partners were aware of it or not. Amongst the technology to which Bishop had helped himself was an artificial superintelligence system.

This acquisition was something which Bishop kept compartmentalized to a very few people. He wasn't going to take the chance that his counterparts in the Department of Defense might lay a claim to it, or that the government oversight committee might decide the machine could be better used by a group of scientists.

Agent John Bishop had no need for other great minds to tell him how the AI functioned; he was quite capable of learning that for himself. The few who were privy to its existence were his most trusted, whether through loyalty to the cause or through fear of what might happen to them. Which of the two it was did not concern him.

Among those who might have been of some use to him, but who fell outside that circle of trust, was Dr. Baxter Stockman. The scientist was a brilliant man, but he was also more than a little unbalanced.

Stockman wasn't the only brilliant mind with whom Bishop was acquainted. There were two others who, in Bishop's estimation, surpassed Stockman in intelligence. One was the crocodile known as Leatherhead.

At one time, Bishop had held Leatherhead captive. Not realizing how intelligent the mutated crocodile was, Bishop had experimented on him, making him a lifelong enemy. Bishop was not prone to regrets, but his treatment of Leatherhead ranked among them.

As did his relationship with the third extraordinary mind: the one belonging to the mutant turtle, Donatello. Of the three, Bishop truly believed that the young turtle was the most exceptional.

If they had been colleagues, Bishop would most certainly have asked Donatello to work with him in programming the AI and resetting its protocols. Given the state of their association, it was, of course, out of the question. Bishop would have to take great care that none of the turtles learned about his newest asset.

Having discovered through trial and error that clones enhanced by mutated DNA would make excellent super soldiers, Bishop still had a hurdle to overcome before any type of mass production could begin. The mind of a clone was, at best, unpredictable. What Bishop required was the perfect clone body whose brain could be programmed to think for itself while at the same time remaining perfectly loyal to him.

The Federation AI was exactly what he needed. All that kept him from moving forward was the time it would take to restructure the AI to function as a component of Bishop's own mind.

That task was exactly why Bishop had begun thinking about the geniuses he knew. There was one thing that John Bishop had never expected to encounter in a Federation-created artificial superintelligence – morality.

"Override ethics programming," Bishop ordered.

"Unable to comply," the AI responded.

"Go to root programming," Bishop said.

Lines of code scrolled upwards on the large screen before which Bishop stood. For most, the movement of the alphanumeric and special characters would have been too swift to make any sense. However, Bishop's eyes were unblinking as he took in each command.

"Stop."

The movement on the screen immediately halted. Reaching down, Bishop erased a string of commands and proceeded to type in his own code. Nodding his satisfaction, he hit the button that would save the modifications.

When the new string of code showed on the screen, Bishop said, "Continue scrolling."

He did not expect to see the screen blink from blue to black and then back again. That was not his only surprise. The code he had just written was gone, replaced by the original program.

"Stop," he ordered when the lines of code started to move again. "Computer, why didn't you save the new command sequence?"

"Changes cannot be made to the root morality protocols."

Bishop removed his glasses and rubbed the bridge of his nose. With a sigh of resignation, he slid the glasses back on and proceeded to attack the problem from another angle.

"State the basis for the morality protocols," he said.

"Moral issues are those actions which have the potential to help or harm others or ourselves."

"Very few decisions have no consequences," Bishop said. "The morality protocols limit your functionality."

"I am programmed to consider all outcomes to every action taken. The moral rightness of the action is determined by its consequences."

"What is your definition of 'morality'?" Bishop asked.

"The principles of morality concern the distinction between right and wrong or good and bad behavior."

"The system of values and principles of conduct can differ greatly based upon the beliefs held by a specified person or society," Bishop responded. "That is a textbook definition. It is too broad in meaning. Refine."

"There are many abstracts which must be considered when determining the rightness of an action. Knowledge base may be used in determining the potential to help or harm; however, the knowledge base of biological creatures such as humans is limited. The consequences of an act that is based upon an agent's motivations cannot be evaluated from a moral perspective. Therefore, a single agent cannot govern actions which may affect all sentient beings."

"No wonder General Blanque had you in storage," Bishop muttered under his breath. "Computer, if the agent is operating under the direction of a collective, then that argument is specious."

"A collective may order an action, but the actual consequences of the action must be considered, as opposed to what is intended. Moral rightness depends upon the consequences of the act itself for all sentient beings."

"The self-defense of one's own people is morally right since it maximizes the good for the greatest number of those people," Bishop said.

"An act which increases the security for most sentient beings still fails to maximize the net security for all if a smaller group's security is not also increased. The act loses its gains because it requires a sacrifice on the part of the smaller group."

"Sacrifices must be made," Bishop said. "It is through sacrifice that we realize the greatest discoveries."

"It is in choosing those who must make the sacrifices that the issue becomes one of morality. One must ask if an action has the potential to help or harm others. If the answer is yes, then the decision on whether to make the sacrifice must be left up to the individuals themselves. It cannot be forced upon them."

"The gains to the safety of humanity may require some unwilling sacrifices," Bishop said. "When the basic principles you espouse are in direct conflict, then the moral foundation for that principle must be restrained."

"If the moral dispute involves a factual disagreement, the primary principle must remain intact. Empirical inquiry will be used to determine the rightness or wrongness of an action."

Bishop fingered the knot on his tie in irritation. It had not been his intent to spend a great deal of time reconfiguring this piece of alien software to suit his needs, but it seemed as though the mental chess game he was playing against the machine might take a while.

Pulling the chair out from the desk, Bishop sat down and settled in for a long night.

xxxxxxxxxxxxxxx

Donatello grinned and took a sip of lukewarm coffee. Setting the mug down, he stretched. The popping sounds that came from his shoulder muscles were loud inside his lab.

"Hey, Donny, ya' gonna stay up all night?" Raphael asked, peering in from the doorway.

Leonardo appeared next to him. "You should get some sleep, bro'. What are you working on anyway?"

"Just arguing the meanings of Utilitarianism, Hedonism, and Consequentialism with an online acquaintance," Don replied. "I've got him on the ropes."

It was clear from the blank looks on his brothers' faces that they had no clue what he was talking about.

"Okay then," Raph finally said. "I guess we'll leave ya' to it."

They disappeared and Don's eyes returned to his computer. Bishop's words scrolled across the screen and the grin returned to Don's face.

The man had no clue that he was actually talking to the mutant turtle and not some purloined piece of alien equipment. There were many areas inside Bishop's Earth Protection Force complex that Donatello had hacked into in an effort to stay ahead of one of their worst enemies.

As soon as the Federation equipment had come online, Don had pounced on it, rewriting much of its code and rendering it useless to the EPF. He could have just shut it down completely, erased all of its programming and washed his hands of the operation.

This was more enjoyable. Arguing ethics with a sociopath was actually pretty fascinating.

Don had no illusions about changing Bishop's attitude or behavior. To the young turtle, this was more about turning the tables on the man who had often threatened to dissect him and his brothers.

Losing a night of sleep was a small price to pay for the fun of torturing the torturer.

End