A/N: Takes place within Isaac Asimov's "I, Robot" short story series.


Voyage Interrupted

Dr. Susan Calvin, robopsychologist for U.S. Robots and Mechanical Men, Inc., and Alexa Tabola, the Safety Manager for the airline and shipping company AirTrans, walked through the back room of the airport concourse and toward a rear exit.

"I really do regret this incident" Tabola was saying. "We think the HAP unit will revolutionize the industry. Pilots are prone to error, and a single incident can cost the company a lot of money and headache." She was dressed smartly in a dark business suit that just screamed 'corporate'; the company security badge seconded the yell with 'big shot'.

"I would imagine the cost in human life and health would be far greater" Calvin added. She was dressed as ever, in comfortable working clothes that whispered 'whatever'.

"Of course; I didn't mention that because...it's so obvious. Oh, I have to talk to security here for a moment if you'd like to wait."

"I wouldn't, actually. You can catch up to me at the plane."

Dr. Calvin entered the plane through the main door via a stairway set up on the tarmac. The plane wasn't going anywhere; the damage to the undercarriage was too extensive for it to taxi, and even if it could, the missing portions of both wingtips would have kept it from flying. In point of fact, it sat on this tarmac away from the main terminal simply because that's where it had been deposited by the transport that had lifted it from its landing site to the airport.

Immediately upon entering, she was greeted by the pilot of the plane, Captain Joseph Hobson. While the two were exchanging brief introductions, Tabola hurried across the tarmac and boarded the plane to join them.

"Sorry, I was delayed at the gate talking with security. I see you've already met Captain Hobson, Dr. Calvin."

"Of course. I thought that I'd go ahead and start my investigation. As I was just telling the captain, I've already gone over the official report of the company before I arrived. I'm here to see Happy first-hand and to interview the captain."

"Happy?" asked Tabola.

"Our name for the robot. HAP-E, or Helpful Automated Pilot Experimental. We tend to name our prototypes until they go into full production."

Tabola nodded. "We name our planes sometimes too, although this one didn't have one yet."

"Amanda" Hobson interrupted. "Her name was Amanda before we got the HAP."

"I wasn't aware of that" Tabola said with a mild look of scorn on her face for having been corrected. "Nevertheless, AirTrans is fully confident that robopilots will become an integral part of our service force. This is a setback, true, but one that will be overcome once we iron out the wrinkles. I assume you'll take the unit back to your facilities for more detailed examination."

"Eventually, I would expect. May I see Happy now?"

"Certainly Dr. Calvin," said Hobson "follow me into the cockpit; it's been open since the accident." The three filed into the front of the plane that traditionally held the flight officers.

Sitting at the left-hand controls was an approximation of a human. The robot had recognizable arms and hands that held the flight controls; its torso was built into the seat, and its head even had a cap on it. The eyes showed no sign of power and were completely lifeless.

Calvin explained. "When we first tested the prototype robopilot, we polled some simulated passengers on how they felt about a mechanical captain. The first model was little more than a box with readouts, but humans felt uneasy being piloted without someone in the seat. Their acceptance changed when we retooled the robot to appear more human; people became more comfortable just because the robot LOOKED more like a pilot."

"We still keep a licensed pilot on the crew just in case they need to step in at any time" Tabola interjected.

"And it was a good thing in this case" Calvin continued. "There are quite a few people that probably owe their lives to you, Captain Hobson. What I'd like you to do" she said as she sat on a pull-down seat after affectionately patting the HAP "is tell me about the flight. I don't need all the technical details; I'm more interested in your interaction with and observation of Happy."

"Certainly," Hobson said "just stop me if I get too technical. Well, we were on a scheduled flight between Fairbanks and Denver. As usual, the preflight checks were completed and the passengers and freight were loaded. I sat in the co-pilot seat while the HAP taxied the plane and took off after receiving clearance. We left the Fairbanks airspace and had performed the major part of our climb when I unbuckled and went to go greet our passengers."

"Did the HAP unit function normally up to this point?" asked Calvin.

"I've taken four flights previously with HAP, and this one seemed no different; he seemed to be operating in a perfectly normal manner. He wasn't designed to be too chatty while flying, but he seemed to communicate with me and the tower clearly. I had HAP read off our majors and confirmed the values with my board."

"Majors?" Calvin asked with a quizzical look.

"Our major datapoints on the aircraft. Airspeed, heading, altitude and radar readings. Anyway, while I was out of the cockpit talking with the passengers the red emergency lights came on. I rushed back from the cabin and found the HAP locked up. It didn't respond to any commands and appeared deactivated."

"What happened when you hit the reset button?"

"Nothing, not even a brief whir, flicker or clunk. Of course the plane is programmed to go into an automated mode to maintain its vector until the human pilot can take over, so I sat in the co-pilot seat and took over the flight. Everything looked good except for the fuel level, which was critically low. Just at that moment I received a transmission from Edmonton that the HAP had radioed in an emergency. I started procedures for a reroute and throttled the engines back to maximum economy when they quit completely. At that point the gauge read empty. There were no airports nearby that I could reach, so I had no choice but to glide the plane into a thickly wooded area. At nearly the last minute I was able to see a logging trail and wrestled the plane onto it with a hard landing that tore off the wingtips and ripped off the wheels. We had to wait it out for two hours before help arrived; apart from a few broken bones everyone made it out okay."

"And of course they determined why the plane itself failed" she prodded.

"Yes, after the post incident inspection. There was a catastrophic failure of an O-ring in the body of the fuel tank which allowed the fuel to escape. The few failures we've had in that area typically lead to a small leak that will still allow you to get to another airport. The leaked fuel was able to drop out of the plane through a break in a body panel; it might have left a vapor trail, but no one observed any."

"So it wasn't a pump or engine issue" Calvin said as she put the picture together in her mind.

"Correct" Hobson confirmed. "The pumps worked fine, they just ended up with air to pump. And without fuel, naturally the engines shut down."

"Would the HAP have determined the leak sooner?" Calvin asked.

"Possibly, it depends. A flight typically has waypoints where a more thorough audit is done of the systems. With a slower leak there would be a deviation that would show up at a waypoint check. But with a failure of this nature, the whole event took place between waypoints and happened so quickly there was little warning." Hobson shrugged.

"Wouldn't HAP have seen the gauge falling after the failure?"

"Not at first. The system is programmed to ping after every quarter fuel load burn."

"We don't start the planes with full tanks unless it's needed" Tabola spoke up. "Too much extra fuel adds weight and costs more to fly. We allow a safety margin for weather and such, of course."

"Of course" Calvin responded. "You were saying Captain?"

Hobson continued. "I don't have much to add. If the failure happened just after the three-quarter ping, nothing would be noted until a waypoint or the one-half ping. At that point HAP would notice the gauge was dropping too fast and check the fuel management system for errors; we try to have dual sensors on everything so that a failure of one won't disable the data stream completely."

"Wait" said Tabola. "If the robot determined there was a problem, why didn't it call for help?"

"It did, to the Edmonton tower. But it didn't call for the Captain because it wasn't programmed for it" said Calvin. "It was designed to act as a pilot, not a co-pilot. It tried to solve the problem and couldn't."

"I don't understand, Doctor" said Hobson with a frown on his face.

Calvin explained further. "Happy was designed to fly the plane and make the best choices to get the passengers to their destination safely. The First Law of Robotics states that 'A robot may not injure a human being or, through inaction, allow a human being to come to harm.' When it determined that the fuel would probably run out, it tried to formulate a plan of action that would harm no one, and it couldn't; there were no safe places to make an emergency landing. It was a no-win scenario."

"This was bad enough," she continued "as robots are incapable of choosing an action designed to minimize the number of humans harmed; it's all or nothing. But the programming of Happy included the fact he was first pilot, and as such more capable than Captain Hobson or any other human, no slight to you Captain. It was programmed to be duty bound to make a decision it couldn't. It froze up because it couldn't choose and couldn't NOT choose."

"Couldn't you just program it to make a value judgement based on how many lives could be saved by a particular course of action?" asked Tabola.

Dr. Calvin gave her a stern look. "Basically, you're asking it to rationalize that killing some is preferable to killing many. By doing so you would create the same reasoning as an assassin's; it would be the moral equivalent of sanctioning murder if it would save more lives in the long run. Or, to use a military analogy, a commanding officer ordering a subordinate soldier to certain death to save other soldiers' lives. U.S. Robots will have no part in that; the potential for abuse is too great."

"How about if you program him to consider himself a secondary pilot so that he would turn the flight over to the human pilot?" Hobson asked.

"That wouldn't work either" Calvin deduced. "Then he would come running, so to speak, with every little item because you would have the ultimate authority in flight matters and he would always defer to you. Would you want a co-pilot like that?"

"No, I wouldn't" Hobson said, smiling. "So will you be able to fix him? He's done a good job so far except for this incident, which I'm glad to say, is a one-in-a-million occurrence."

"Who, Happy? No, I'm afraid not" Dr. Calvin said with just a bit of sadness. "He had to act, but any action or inaction was calculated to cause harm. It burned out his positronic brain. He would easily choose to destroy himself to save others, and in the end did that in a roundabout way. When you took the controls after he burned out you chose a series of actions that ultimately saved everyone's lives."

"Maybe positronic brains aren't cut out for being pilots" Hobson suggested. "But the fuel issue wasn't his fault. Maybe we just need to tweak his programming, or his replacement that is, for this specific event."

"Some people aren't cut out to be pilots for that matter" Calvin countered. "What if it was a different one-in-a-million event? You simply can't program each individual possibility into the system; that's where logic and reasoning usually fill the void, not to mention occasional chance. Maybe someday we'll have a foolproof robopilot, but I doubt it; as long as there is risk involved there is a chance for failure. And the world will always have risks of some sort" she said as she prepared to leave. "I don't think I'd want to live in the sterile world that didn't."

The End


A/N: I was watching some kids flying a drone near my house and started thinking about piloting and intelligence. Then I thought of the story of the Gimli Glider (search for it on the web for a great true story) and how an AI might have handled it. I wanted to do an Asimov story, and I really liked the I, Robot story collection as well as the Elijah Baley/R. Daneel mystery series, so I put robots and the Gimli incident together and got this.

In my family we always name our cars. I had a Dodge named Amanda.

An enormous THANKS to Turrislucidus for showing me a logical point I missed due to my lack of understanding of an important technical aspect. Now the story should make a bit more sense.

Thanks also to Telluride for pointing out inconsistencies in name spelling and terminology.