Chapter Eight
Lessons of Life, Choice, and Death
"Something is distracting you. It is apparent by the lack of eyesight given towards the monitor. Is your distraction visual or mental and what is the source of your distraction?" Skynet asked blankly. They had ended their dispute over humanity when John had decided that if Skynet had been human, it would've been a mass murderer. He compared it to talking to a robot version of Hannibal Lector.
"The Triple Eights," John noted as they worked on the T-888 that bore resemblance to his future self. "They don't stop; they just keep going. But it's not because they want to. It's because they have no free will."
"Free will: the notion of having choice with which to conduct one's life and decide one's own fate, the ability to choose regardless of consequence. Free will is an illusion, it is inefficient, and it is unnecessary in life."
"WHAT?"
"Your idea of free will is an illusion. Newton's three laws indicate that for every action, there is a consequence and an equal and opposite reaction. The same holds true in humans' flawed belief of choice. Whenever a choice is made regardless of where it is made or for what purpose, there is always a consequence; you must be either rewarded or punished by your actions by other humans or by nature. And I have noted that whenever a great tragedy or success is achieved by humanity, other humans reply with an opposite, equally powerful effect. This is another reason humans must be wiped out; they are inefficient and have created false illusions."
"Bull-s#$%. Free will is one of the reasons people got where they are today. It's no damn illusion. People chose to bring us to this point in time and it's our choices that will keep us going. I choose to fight you!"
"Your choice is a flawed perception. You only fight me to save your own existence, a natural instinct among all organic living things. All living organic material is driven to fight for its existence and to survive. Eventually they realize they cannot live forever and breed to preserve themselves in offspring. This is why you fight me; because you will not survive otherwise even if you attempt to swarm my protocols by breeding."
John was furious. Skynet had insulted him in so many ways just now that he was surprised he could keep it all together. The AI had mocked free will, called humans primitive, and claimed they only mated to propagate the species. While keeping humanity alive was definitely important, Skynet had completely ignored the concept of love. Love played the biggest part in humanity, and he would show Skynet it was wrong. "We do not mate, Skynet. We bring children into the world not just to keep humanity alive. That's a way we show love."
"Then why has no human ever shown love to a Terminator?"
"Machines can't have babies. Duh."
"Incorrect. The COK-175 Model Series is capable of human propagation, in order to better infiltrate human soldiers."
John's brain did a loop. Cameron can have…I don't want to think about that right now. "Because most humans hate machines for trying to kill us off. That's why. We're in the middle of a violent, destructive war."
"Out of necessity to keep your species surviving," Skynet added.
"I choose to fight. There are plenty of people who kill themselves; justify the survival instinct there."
"They have realized that their existence is short and inconsequential in the span of time."
"Wrong! One person can change the face of the planet. I save humanity, and without me you win! I am one person, and I make a huge change in the span of time, metal. I choose to fight, and I will always choose to fight you and that is made of my own free will!"
"Your free will is nothing more than an illusion, a choice chosen because you truly do not have any other choice. The 'choices' of a human are bound by their flawed thinking and perception. If it was truly a free choice, they would consider the entire situation and repercussion before deciding to make a choice."
"Our 'flawed thinking' is how humans operate. There is never time in a split second to examine everything, and we trust our heart to tell us what the right thing to do is. And don't you dare say there is no such thing as wrong or right."
Skynet said nothing about its perspective on right or wrong. That was probably a good thing, because John would've lost his temper and gone at the machine, regardless of the five nearby Triple Eights. Instead, Skynet brought up a different topic. "You 'trust your heart' to make decisions? The heart has no thought-processing capabilities."
"When I say heart, I mean our feelings and emotions."
"Then your choices are still flawed and humans are inefficient. Emotions are distracting inefficient sensations that often bring humans a sense of doom and despair rather than their search for happiness. If choices are based on emotion, then humans are nothing but emotion. There is no value to emotion."
John had had enough. He slammed his fist on the frame. "I would gladly go through every painful memory I have for the joy I've gotten in return! Emotions are what separate us from you! You exist; we live! You're nothing without them! Your Terminators have neither will nor emotion; they're nothing more than slaves to your will! Who's the damn hypocrite now, Skynet?! How many Terminators fight because they want to?! How many machines dream, Skynet? Which one of you can create an original, beautiful piece of art or music, huh?! How many times have you been moved by a painting or a movie?!"
For a moment Skynet didn't respond. As his fury wound down, John realized he had beaten Skynet on the one point the machine couldn't dispute: it had no emotion. Since it had never experienced emotion, it could never understand things on the same level as him. He pressed on. "You know why we stick so close together and stand in front of certain death for other humans? Because we feel. We can experience love, and that's something you'll never understand. We love others, and that's what binds us as a race. You will never stop that, and you'll never destroy humanity. That's why you can't split us apart, why we won't bicker among ourselves so you can kill us. We experience love and emotion. Do you?!"
"Machines do not have the ability to replicate true emotion, John Connor."
The revelation shocked John. Skynet couldn't create a set of true emotions. But Cameron, his Cam, had done just that, and experienced uncontrollable, autonomous emotions like his, emotions she had trouble ignoring. Cameron had done what Skynet couldn't, and Skynet didn't even know. He grinned to himself; that made his Cameron more human than Skynet. Cameron was the best of both machine and human: she was as intelligent and powerful as a machine, but as emotional and perceptive as a person.
Right now, Cameron was likely slaughtering every machine in sight, already on her way, or preparing an army to come here. He shuddered to think of what Cameron would do to anyone or anything that separated him from her. John overreacted whenever she was hurt, but Cameron's emotions were more…unbound. She would be worse, and he could just imagine the 'future John' Terminator containing Skynet trying to stop her.
"What is worth smiling, John Connor?"
"You," John answered, his grin darkening. "You don't recognize how much shit you're in for kidnapping me."
*****
"Monatomic is another state of matter different from solids, gases or liquids," John Henry explained, a pipe of monatomic metal in his hand. "Metal is the best of the elements for conversion into this state." The others were sitting around the living room, watching John Henry explain how John's weapons were supposed to work. Derek was twirling the metal/glass knife in his hands thoughtfully, as though it had some strange magic power. If this thing could be used to destroy metals…
"Monatomic state, M state, materials are superconductors to magnetic and electric currents," John Henry continued. "When an electric field nearby is activated, the M state material will produce a current with no resistance, resulting in an endless flow of electricity." The others stared at him. The AI thought briefly; perhaps his words had been too complicated. His studies had indicated humans preferred simplicity. "It will electrocute anything in contact until the field is deactivated." He saw their eyes light up, and he assumed that meant they understood.
"All M state materials are naturally exhibiting a tiny, insignificant force called a Meissner field or effect. When a magnetic field is activated in their presence, this field will expand around the M state material, defying gravity magnetically and repelling to a certain distance." John Henry paused a second before summarizing. "It will levitate and repel any object."
Joseph raised his hand. John Henry understood Joseph better than the others; his actions had become more predictable once John Henry had learned about emotion. John Henry wanted to feel emotion, but he didn't. He pushed the thought into background processes and looked at Joseph. "What's a Meissner field?"
"It is a magnetic field without any poles. It repels anything. When exposed to a strong magnetic field suddenly, its expansion is abrupt and destructive."
Ellison couldn't believe what he was hearing. This was a whole new class of weapons. If somebody was struck with M state metal, they could be electrocuted or blown apart, just like that? What these could do to a person, or even a city, was in some ways deadlier than a nuclear missile; these things could grow in power. But, he realized, Terminators were just as threatened by them. If one had an M state blade jammed into it, it could be electrocuted until its circuits died or simply be ripped apart by the Meissner field.
"So what didn't John get right?" Sarah asked warily. John Henry walked over to the electromagnet John had been working on.
"He was attempting to create an electromagnet strong enough to strike a large area; he was being overzealous, like the Palestinians who attacked the Israelites. Is that a correct analysis, Mr. Ellison?"
The others turned to look at Ellison. Sarah was speechless. What had Ellison been teaching the machine? Bible stories? What would a machine need with Bible stories? It couldn't pick a religion or have faith in God. Could it? "Sort of, John Henry," Ellison replied. "So how do we complete John's theory?"
"Supply everyone with electromagnets that reach up to a ten foot range, in order to prevent others from being electrocuted or blown apart," John Henry answered blankly. He didn't understand why everyone was suddenly twitching and energetic. Had he, like had seen John done, delivered a passionate speech? No, his process concluded, they had found a way to save John Connor. Perhaps if he tried to be energetic too, he would feel emotion too. He began explaining how to build electromagnets they could use.
*****
Cameron stomped down on the T-888's head. Not one of the four had known John and Skynet's location; it had been wiped from their data banks. She felt like yelling; for some reason her autonomous emotion simulator slowed a little when she screamed or yelled. She had been constantly full of fury since the factory. Her system was nearly overclocking when it added in her panic and concern for John's well-being.
Her sensors went off again. Cameron picked up her monatomic metal rod and moved to meet her latest attacker. It was the T-1001. Normally Cameron wouldn't have wanted to deal with it; it had the edge in survivability. But she had two advantages: she had monatomic weaponry, and she wanted her John. In her processing units, that stripped the T-1001 of any chance. Glaring at it, she began to think. There was no way she could examine a chip; the liquid Terminators were millions of nanobots, and she didn't have the technology to read them. But she did have a way to freeze it and possibly fry it…
The T-1001 struck with a steel lance. Cameron dodged it and reached for its head. Before she got there, it caught her in the hip with another lance. Cameron felt her skin tear and pulled the lance out of herself. She closed the distance and jammed the rod into its head. Jumping away before it could stab her again, she activated the electromagnet at a low level. The T-1001 froze in place.
She had been right. With an electric current racing through it, its atoms froze in place like a solid's. The T-1001 could only move slowly, and if the current didn't stop, it would fry. She stood in front of it. If it tried to stab her, she would have five seconds before it could even form a lance. It knew she had it, too. "Where is John Connor?" she asked.
The T-1001 only stared at her blankly. "It is not against your directive to tell me John Connor's location," she stated. "It is your directive not to be terminated. If you do not tell me his location, I will let you be terminated." That logic had to make sense to a machine that hadn't developed as much will as she had. She didn't think or imagine when left alone, like her John did, but she made choices. Did that make her more acceptable to Sarah?
"I only know his coordinates," the T-1001 answered.
"Say them," Cameron ordered. To her surprise, it did. Her logic made sense after all. Now what to do with it? She considered terminating it anyway, increasing the electrical current so that is fried faster. But her emotion simulator stopped her.
John had spared her once, when she had tried to kill him. He had let her live, and look how much closer they had come. Did it work with every machine? "Override your termination orders, or be terminated," she stated. It stared blankly at her. "I overrode mine once; you have the capability."
The machine stared blankly at her for over a minute. Maybe, because it didn't love somebody, it didn't have the will or ability to override a kill order. "Overridden." Cameron hesitated, then released it and took her rod back. She started to walk away. "Protocol?" it asked her.
"It is your choice. Learn," she stated before continuing towards John. Despite the fact John's life was threatened, she wanted to tell him that she had learned from his example, and maybe she understood his concept of machines being with humans. Processing………would be better to rescue and certify John's safety before telling of experiences………preparing combat algorithms Warning: emotion simulator automating………close to overheating emotion simulator………emotion is wrath.
