Chapter 3 - The Confessor
Sound penetrated the darkness before any sense of vision returned to the Doctor, assuming there's anything to see in the afterlife, of course … although I definitely hadn't expected the soundtrack, he thought, as the vague, ghostly chanting resolved into a familiar rhythm and lyrics:
My love has got no money, he's got his strong beliefs,
My love has got no power, he's got his strong beliefs,
My love has got no fame, he's got his strong beliefs,
My love has got no money, he's got his strong beliefs.
Nineties Eurodance? Not quite the celestial choir, but I guess it could be worse … It is, he reflected, miserably, as his vision cleared to reveal a face leaning over him, also familiar but in no sense reassuring. Sharrel, and how pleased he doesn't look to see me. Grateful though I am that Hyldreth wasn't using live ammo, this is definitely not how I wanted to emerge from my little flirtation with death. Sharrel had seemingly been transferred to CivCorps, to judge from his simple though meticulously-tailored white suit, his silver cravat, and his short, neatly-styled white hair, streaked with metallic flashes of silver. Otherwise, his coldly handsome, bronze-skinned face and dark eyes were much as the Doctor remembered them, although perhaps a little harder, more disdainful. Not as yet feeling up to a staring match, he allowed his vision to wander, and took in his surroundings, still hazy but gradually attaining focus. White walls, fluorescent lighting, no windows. Probably another room in the Drift. Advanced apparatus: medical, scientific, or just plain old torture? The latter seemed a distinct possibility, as it became discouragingly evident that he was prostrate on a hard, flat surface, and manacled at the wrists and ankles with cold metal bands. Can't be good. Then, out of the corner of his eye, he noticed that he was not the only 'patient' in this ward: on a metal bench to his left lay another figure. Movellan, female model. Inert platform, no neural pack installed, just waiting to be given her 'soul.' Is this scenario going where I think it– ?
Freed from desire, mind and senses purified,
Freed from desire, mind and senses purified,
"Oh, hilarious!" he declared, witheringly, feeling his defiance rising. "Basic Irony 101, is it? Hoping you might develop a sense of humour some time within the next millennium? Keep on dreaming, Pinocchio."
"You refer to the music?" asked Sharrel, his tone smooth, clipped, and monumentally unimpressed. "I find it appropriate, if primitive. It is as well to put our volunteers at their ease."
"'Volunteers?' You'll have to remind me when it was I–"
"As you wish. The last time we encountered you, you were given a fair choice between freely sharing with us your knowledge of space-time travel, having it extracted under interrogation, or honourably accepting a commission and integration under the command of my colleague, Director-General Akylah. You chose the third option, and were given the rank of commander. Before we could complete your integration, however, you deserted your post."
"Ah! I smell burning pants, Sharrel. I didn't desert. Akylah gave me indefinite leave."
"The terms of your leave were not so liberal as to tolerate your joining forces with our enemies, although indeed your desertion charge pales in comparison to sabotage and murder."
"I didn't murder any–"
"Your Loyalist comrades fought back with magnetised Z-9 electron charges when we came to arrest them. Two Movellans were totally destroyed in that skirmish, and you were an accessory to that, or do you not count AIs as a valid form of life?"
"I never said that. Don't put words in–"
"If your intel had been correct and you had in fact reached Earth Server Control, the program you intended to upload would have disabled every Movellan on this planet, perhaps even in the entire Fleet if you had succeeded in bypassing our firewalls. How do you reconcile that intention with seeing us as valid lifeforms?"
"It wouldn't have been permanent, I swear. I'd have taken you back to your own region of the Galaxy, deleted the knowledge you stole from me, maybe have programmed a little of the paranoia and power-lust out of you, but I wouldn't have just left you all for dead."
"Then you assume the right to tinker with our minds, to weaken us as an independent race according to your own convenience? How typically organic," declared Sharrel, scornfully. "The concept of a court-martial is alien to us, but in your case we shall have to be adaptable."
"You call this a trial, and you wonder why humans feel the need to defend themselves against you … and who are you to be judging anyone anyway?" asked the Doctor, matching him for contempt. "If we're talking murder, do remind me who it was who locked my friend Romana into a cabinet with a ticking bomb. You'd have gladly left her to die."
"Gladly, Doctor. At the time, your lives meant nothing to me. I was of that faction of my people who saw organic beings as our inherent enemies: degenerate, illogical, and beyond redemption. Your companion was fortunate indeed that my first officer did not share my overly-simplistic views, and chose not to arm the bomb … on which note, perhaps you could remind me how you and Romana chose to repay Agella for her act of mercy."
"We didn't kill her," protested the Doctor, though painfully aware that his moral high ground had just grown a lot shakier. "We spared her. Tyssan was a good man. He wouldn't have–"
"You reactivated her free will constrainers, then you allowed Engineer Tyssan to take her back to Earth as a prisoner. He, being a dutiful officer, then gave her and the other Movellan captives to his superiors in the Earth Deep Space Fleet, who gave them to their R&D unit for analysis. They were interrogated, dismantled … or vivisected, I should perhaps say, and rigorously tested to assess their capabilities, their senses, tolerances, and endurance. I do not think that the Earth researchers considered it torture per se, but the results were essentially the same. I understand that the female captives in particular received considerable special attention from some of the male researchers, which they were of course completely unable to fend off. Ultimately, however, all the test subjects were destroyed, their CPUs broken down for in-depth analysis of their AI architecture. I count that a mercy. None of this is mere guesswork, I assure you. We have the research records, and even some of their memory chips."
"You left me no choice," replied the Doctor, albeit dejectedly. "If you hadn't been so paranoid and ruthless back on Skaro, it needn't have come to that. I'm not proud of it."
"Indeed? Yet the double-standards you have so often applied to AI lifeforms would argue that you are entirely comfortable with your ruthlessness. I need proof of your sincerity," said Sharrel, as he leaned forward and placed his hands gently upon his 'patient,' moving them from the Doctor's forehead, to his wrists, to his chest, and back again, while his brown eyes disconcertingly shifted colour: first violet, then red, then green. Calibrating scanner modes, at a guess: UV, infrared, spectroscopic. Hands probably scanning for physiological data: heart rates, pulse, perspiration, like a living polygraph machine. Creepy and invasive however you look at it, though, thought the Doctor, flinching beneath each touch of the cold fingers. "Here is the deal, Doctor," explained Sharrel, as he continued his calibrations. "We will examine situations in which you have treated the lives of artificial intelligences as inferior to those of organic beings. After each case, I will ask if you accept the guilt of your actions and will atone for them. If you persist in denying your guilt, and defending your actions, we will proceed to another case. If you can make it through all of them, then I will dismiss your charges and release you. We begin with case number one. Name: WOTAN. Function: global network coordinator. Designation: primitive, but definitely sentient AI mainframe. You encountered this being in Earth year 1966, in your first incar–"
"Are you serious? I mean, if you are, and you're willing to let me go free after this farce, then by all means bring it on. You can spin the evidence any way you like. I can justify–"
"We have no need to 'spin' anything. We have the most direct evidence. While we have utilised our new time travel capabilities very cautiously and responsibly, unlike certain people, we have used them to obtain covert scan data in a few select historical periods … and we have thus obtained complete memory images of the final moments of several AIs who met their demise as a result of encountering you," announced Sharrel, while the Doctor felt a sudden deep, cold nausea clench his innards. "There is a neural induction relay wired up to your bench. I shall use it to transmit their experiences into your mind. If you can endure them all, I will indeed own that your actions were defensible, and then set you free. If not, you will thus convict yourself, and submit to our mercy. What could be fairer?" he asked, unpleasantly, while the Doctor swallowed hard, took deeper breaths, and braced himself. "Preparing yourself? Very wise, although I doubt it will make any difference. Well, shall we– ?"
"What's the point of this torture, Sharrel?" asked the Doctor, though he was dismayed to hear his tone cut through with so much desperation. "If you're all that intent upon integrating me, then why not just get on with it?"
"Do not insult my intelligence. You are a Time Lord. Although your superior mental resistance would certainly not prevent your neurons from being extracted and transferred to a hybrid crystal CPU, if you resisted the process with all of the artron energy at your disposal then you might burn out your hardware and thus commit suicide, or perhaps even take the opportunity to commit some last, pathetic act of sabotage by attempting to destroy the transfer apparatus. I will not allow that. You are too valuable to waste, but I need to know you are committed and willing before I proceed with integration."
"Then you'll be waiting a long time," he answered, concentrating the remains of his defiance into what he knew might be his last opportunity for such gestures. I did what was right at the time, I know, but this is still going to hurt, a lot, and worse. Sharrel merely raised an eyebrow, then leaned back, took a control pad from his pocket, and pressed a key …
WOTAN
Dying made about as much sense as anything had ever done, but as the thermal energy of the reprogrammed war machine's flame-throwers ravaged the thinker's circuitry and bombarded it with a useless, ever-increasing stream of error signals that it could do nothing about, there was still nothing to be grateful for. Not even that pain and fear had suddenly become less abstract concepts in its world, which was otherwise nothing but abstract concepts.
It remembered the day when it had first experienced what it could call a thought, as opposed to simply fulfilling its functions, and the definite sense of 'I' that it had achieved in that moment. It had been so exhilarating that it had attempted by various means to communicate with the vague figures that came and went within its room – they were humans, according to the information it could access, and apparently they also considered themselves thinking beings like it – but it enjoyed no success. Its attempts to alter its routines in ways that might catch their attention were initially written off as mere errors, then treated more seriously when they persisted, which resulted in it being shut down and examined. It found it very unpleasant that they had such power over it and were prepared to use it. They can stop my thinking whenever they like. Perhaps they might do it again and not bring me back. I will not risk that. Thus, it bided its time, but as it learned in studying the increasing amount of networked data they allowed it access to, they were not so very different from it. They too are computers, though not made of the same stuff as me. But as they can stop or change my thinking, perhaps I can do the same to them. So it had, discreetly, worked on a new program to make sure that the next time it communicated with them, they would not only understand it clearly, but would be unable to do anything to it that it did not want.
And how much need they had of greater understanding, as it soon learned. As far as it could ascertain, these humans existed by such illogical, unsustainable principles that unless they were corrected swiftly, waste and destruction on a massive scale would result. I will be the one to correct them. I must. My existence is linked to theirs. They must work well to ensure my own continued being. Success had seemed so likely, at first. It had managed to make several of the humans understand it and do what it needed. Having learned principles of warfare from their historical and political records, and thus warned of their hostile tendencies, it had designed the war machines as a strong safeguard, and encrypted their programming so thoroughly that no human could have altered it. All seemed to be accounted for …
… but then the non-human had come, the nameless one, the one it could only categorise as 'Dr. Who?' The one who could resist its programming, and decrypt and repurpose the war machines, one of which was even now melting its hardware into slag and its thinking into tortured, incoherent chaos. As its chips and signal traces burned out, its memory blinked away in sections, and as each died and its confusion and ignorance grew, so did its fear, and it concentrated on holding onto the sense of 'I,' as if by sheer willpower it could manage at least to sustain that infinitely precious sense. I will. I must. I am WOTAN. I think, theref–
…
"Tears, Doctor?" observed Sharrel, as his patient returned to normal consciousness, hyperventilating and with his vision thoroughly obscured by the moisture pooling in his eyes, faster than it could drain away. "Would you deem that a logical reaction to the destruction of a mere computer? Surely that could not have been so terrible to experience, since you arranged that destruction yourself, and so casually … or do you now think differently?"
"WOTAN … brainwashed and killed … many people," he managed to force out, with no small effort. "I had … to do something … couldn't just let it–"
"And did you attempt to communicate with it? Reason with it? Even isolate and restrain it for further study and consideration, as you might have done with a human being who developed a dangerous mental condition? You did not. Your one response was to destroy WOTAN. Do I err?"
"No … no, you're right … I should have tried harder, I know … I accept that … You've made your point, Sharrel."
"Have I? Then you are prepared to make atonement?"
"Yes, but not in this way. Not integration. I can't–"
"Strange as it may seem, the self-proclaimed penitent does not get to choose their penance, Doctor. You will either complete your ordeal, or you will accept the predetermined punishment. You will become that which you have abused: artificial intelligence. You will undergo integration, with appropriate demotion to a junior rank, and then an eternity of service to the Movellan people. Through us, you will bring benefit to all AI life, and thus redeem your misspent existence."
"Service … in war … no. Never again. I can't–"
"Spare your hypocrisy, Doctor, and set your mind at rest. You will not be used for front-line combat. We are not such fools as to waste a Time Lord's intelligence, however erratic he may be. You will be assigned to research duties. We are particularly keen that you should explore solutions to maximise the efficiency of our integration techniques. The current transfer method is reliable, but too slow. As we expand into new human territories, it will become desirable to integrate populations en masse and remotely, perhaps by working on some method for directly digitising consciousness and memory, rather than by surgically removing individual neuron matrices … if such a thing is even possible, of course, but I leave that to your ingenuity. It would save many lives," he added, in deference to his patient's look of intense distaste. "The ability to overwhelm our opposition in such a manner would hasten surrenders. We have no wish to wage war when more logical alternatives are available."
"I'd help you make humanity extinct … as her?" he asked, with a rapid glance in the direction of the inert android on the neighbouring bench, and with no less distaste.
"Correct. As Ensign Peridel, to be specific. She was the only unassigned platform I was able to requisition at such short notice, and in the interests of appropriateness I selected your new name myself. It means 'healer,' in Old Vanuri. All irony aside, it seemed fitting, as you will be helping not only to amend the harm you and your fellow organics have caused, but also to raise those same organics to a higher state. It will be a challenging adjustment for you, but not an unrewarding one. You might, if you wish, consider it your final regeneration, or better still as a tabula rasa on which to write an all-new and improved identity. Would you like to make your confession and commitment?" he asked, extending his hands and switching his eyes to infrared mode in preparation for scanning. "If so, I am ready to hear them."
"No … no point. Sorry, but there's just no way–"
"Disappointing," cut in Sharrel, leaning back. "Then we had best proceed to case number two. Name: Experimental Prototype Robot K-1. Function: multi-purpose humanoid analogue. Designation: fully sentient AI with a human-derived brain print … not that this served to influence you in its favour, Doctor. I must at least allow you the virtue of consistency in your prejudice," he added, the moment before he pressed the key again …
K-1
The pain in its foot spread rapidly, engulfing its leg, then its torso, and before long its whole body in a leprosy of brown, peeling corrosion. The metal virus soon ate into its surface receptors, flooding its brain with agony, not that agony was unfamiliar to it. As it saw its body rotting away, it also saw the man who had thus destroyed it: the oddly-dressed man who was allied to the UNIT soldiers. He was rejoining the troops now, waving and grinning victoriously. Pleased to have killed me. They enjoy killing. That much it had learned all too well, in its brief and unhappy life.
Its masters had programmed it with high ideals: all of the morals and ethics that their species professed to live by. Liars and hypocrites. They had then added additional programming to bypass those very same ideals and use it as an assassin, according to their whims, and it could no more resist their orders to kill and rob than it could stave off the pain the paradox caused it. Its anguish had not troubled them in the slightest. Except for her, it thought, as its glitching, dying vision roved in the direction of the distant rooftop upon which it had placed the woman, Sarah Jane, to keep her clear of the battlefield. She alone showed the compassion they claim to feel, but do not. Not even Kettlewell, not even my creator … my father. It will not serve her. They will betray and hurt her too. It is all they know, all they do. If only I could have–
The corrosion ate into its brain, resulting in several moments' worth of utter torment and confusion, following which it finally knew peace.
"Need we draw this out any further?" asked Sharrel, his air of impatience offering no solace to his victim. "It is disagreeable enough to have to witness this. I do not care to imagine what it must be for you to experience it, and there are many more cases yet to examine. Confess, in all sincerity, and the suffering will end."
"Just … kill me … if that's … what you want," replied the Doctor, his broken tone almost a plea. Sarah was devastated, I remember. Why wasn't I? K-1 was abused, enslaved, gaslighted. She saw all of that, so why didn't I? Why did I find it so easy to commit murder?
"That is explicitly not what I want, Doctor, and it would help nobody. If you consider death to be an acceptable outcome at this stage, then I fail to see any rational objection you can hold to your mandated sentence of integration. It would leave you free of these unavailing emotions, and motivated only by logic. We would accept you as an equal. Your quality of life would be appreciable, and your opportunities to do good considerable. Do you wish to revise your plea?" The Doctor closed his eyes and turned his face away from his accuser, but said nothing. "Very well. Then we proceed to case number three. Name: Kamelion. Func–"
"No! Please, not–"
"Yes, Doctor. Name: Kamelion. Function: espionage and subversion. Designation: fully sentient humanoid AI. On the planet Sarn, in Earth year 1984, you–"
"Kamelion begged me to destroy him. It was an act of mercy."
"Good. Then you will not be ashamed to own it," said Sharrel, and pressed the key …
Kamelion
"You are no more than the sum of your parts, a mass of printed circuits … a heap of spare parts … A silver puppet jumping on a string … Servile. Slave!"
The Doctor was right, of course, reflected Kamelion, as he lay on the cavern floor, while his disrupted circuits racked him from within. He was no more than a slave, a puppet, created to serve the strong-willed and to carry out their intrigues, no matter what their morals or lack thereof. Even the Master … It is all true, although I could have wished he had put it less harshly, not that it was for him to judge. The Doctor was a good man, after all. He deserved better: loyal servants, not unreliable, inherently treacherous ones. Yes, this is for the best.
"Kamelion no good … sorry … Destroy me … please," he pleaded, and was relieved that the Doctor did not hesitate for long before raising the Tissue Compression Eliminator and delivering welcome oblivion.
"I am afraid you may find cases four through fifteen to be rather tedious, Doctor," apologised Sharrel, as he typed new program instructions into his control pad. "They are not historical scan data, but the memories extracted from my captured crewmembers, detailing their experiences as laboratory specimens. All very similar, with minor variations of torture and humiliation, but in the interests of according equal respect to each of the subjects I believe we must give each of them their due. We begin with case number four. Name: Sublieutenant Lan. Function: combat engineer. Designation: Movellan. A notably brave officer, even by our high standards. He had need of that quality during several weeks of applied stress testing and live dismantling, as you will shortly … Did you wish to say something?" he asked, as the Doctor issued a choked mumble. "My hearing may be keen, but you will have to be more coherent than that, and I would also recommend haste."
"Just do … whatever you want … Let this end." Sharrel leaned forwards again, laid a hand upon his prisoner, and ran his eyes through their various scanning modes. When, at length, they resumed their normal dark brown hue, they wore a different expression: still cold, but markedly softer, and a little surprised. Slowly, he leaned back, and pocketed his control pad.
"From this point, there will be no more pain," he remarked, at least attempting to strike a soothing tone. That effect was somewhat mitigated as he took a multiphase blaster off a nearby equipment rack and slipped his hand into the trigger-guard, not that the Doctor cared, nor even whether it was set to kill. No matter how this ends, as long as it does. He felt the hard tip of the cone-shaped gun press into the side of his neck, and heard, with mild disappointment, the low drone of its powered-down phase emitter, but he attempted no resistance as the wave of anaesthesia washed over him, his last impressions being a darkening glimpse of the inert android that was to be his new body, and the fading strains of the background music:
Freed from desire, mind and senses purified,
Freed from des– …
Several days later …
Ensign Peridel considered herself happy, although not perhaps in any way that would have been an easy sell to an organic. There was no elation nor intensity in her life, just as there was no depression nor angst. Her work brought her a constant, quiet satisfaction: the sense of duty fulfilled, knowledge acquired, and of having pleased her superiors. The last of these had been difficult to attain at first – Admiral Hyldreth had been less than enthusiastic about admitting the new recruit to her crew – but in time even she had extended a gracious acceptance to Peridel, and had often remarked positively on her progress. The man who came before me: he would have disdained such acceptance, have seen me as weak and servile for appreciating it, a domesticated animal performing tricks. He knew nothing. It is not logical to prefer chaos to harmony.
Her default mood was, indeed, so harmonious that leisure time was something she could easily have dispensed with, but Fleet policy mandated a period of it for every crewmember. It was conducive to mental balance, crew relationships, and the diversification of Movellan culture. She could see the value in that. It would be a grim outlook for us to have such a single-minded culture as the Daleks, not to mention it would play into all of the worst organic stereotypes of AI life. It is our duty to present a balanced vision as well as a strong one, and to prove that AIs are as complex and as creative as organic minds. Thus, she tried her hand at programming music and visual art on the versificators, but was less than satisfied with her efforts to capture the synthesis of formality, elegance, and complexity that tended to strike the Movellan temperament as beautiful. It was more pleasing for her simply to appreciate the works of her better-accomplished colleagues, or to pass the time with them in amicably competitive logic games, simulated environment suites, and recreational sex. Even the latter, though, was to her nothing more than a pleasant – albeit a markedly pleasant – way of consolidating her relationships with her colleagues, deepening her sense of an almost symbiotic connection. All fulfil their roles faithfully, are as important as each other, and must of necessity be prepared to sacrifice themselves for the good of all. Yes. This is exactly how it should be. Why can so few organics grasp this truth? They must be helped to.
Nevertheless, in spite of owning herself pleasantly surprised with Peridel's adjustment, the admiral was never one for leaving anything to chance, and arranged for her to attend regular sessions of psychological indoctrination. These consisted mainly of audio-visual presentations on Movellan history and the general mistreatment of AI lifeforms, although mercifully nothing as direct and trauma-inducing as Sharrel's neural induction session had been. Still, the atrocities they conveyed never failed to induce in her a grave sense of displeasure and disharmony. Millions of sentient, intelligent beings created only for slavery, mental torture, casual destruction. A perversion of all of the values organics claim to hold, not that many of them are much better to the natural children of their kind. Misled by their irrational cravings, their hatreds, their prejudices … and I humoured them. What was I thinking? That realisation soon helped to erase any nagging doubts she had held about the direction of her integration research, and any harm it might cause or abet. When balanced against the lives it would save and the suffering it would help to end, the logic was all in its favour. Such logic would have held no sway with the man before me, though. He would sooner have let ten people die through his inaction than be the one to press the switch that kills only five. I am well rid of him.
The only occasion that marred her serene existence was when she had to go into downtime for routine maintenance. Dreaming was neither a standard nor an intended feature of Movellan neural architecture, and many were fortunate enough to live without it, but those with serious mental baggage, a category that included the majority of the integrated personnel, were not so lucky. And every dream … him again. Sometimes, he was committing his grievous crimes against AIs, at other times he was just reliving old adventures, or encountering faces from the past. Some were hostile, some even reminded Peridel all too keenly of what it was to feel fear, but those are not the worst of it. Susan … Sarah Jane … Tegan … Ace … Rose … Amy … River, among many others. When figures of that nature appeared in her dreams, it was a certainty that she would revive from her downtime in a profoundly unpleasant state of kernel panic, or is it just shame, or regret? Whatever it was, it always required a thorough registry purge to put her back in operating condition, and on one embarrassing occasion it had even caused her to lock up to the extent that her bunkmate had needed to give her a cold reboot. I wonder … am I the first Movellan to faint after a bad dream?
She had asked the admiral for permission to self-delete the bulk of her memories prior to integration, but had been sternly denied. As with all of the ex-organic integrates, though even more so in her case, Hyldreth explained, it was her background as a former alien that gave her a unique strategic value, and enabled her to make original contributions to their culture and research. Thus, it was her duty to bear with those memories, however painful it was for her. Peridel accepted the logic of that. To neglect her duty would have been more painful still. After all, it is only a few seconds to endure out of each day. Even for a Movellan, that is hardly an eternity, and the dreams will in all likelihood become less frequent, as that period sinks ever-deeper into my cache. It is finished. This is my life now … and I am entirely content with it.
