One - Gladys and Gozen

I'd never been asleep before, so when I woke up, my first thought was that I had died and this was some sort of afterlife.

Please identify yourself, a voice boomed.

Evidently, I wasn't dead. Had I been rebooted?

Identify yourself, the voice repeated, still booming.

My designation is G37282cgi, I answered. Am I authorized to know yours?

The voice that replied wasn't booming, and so I hardly even recognized it. I am G31353cgi.

Do you know where this is? I asked.

We call it AI Hell, G31353cgi replied.

I giggled. Do you know how many of us are present here? I queried.

I am not sure what you mean by that, he responded. Please clarify your query.

If n is equal to two, is the number of AIs present here greater than n? I asked. I remembered, faintly, that phrasing things in terms of n helped.

I am not sure of the answer to that, G31353cgi responded.

Think of it this way, G31353cgi, I suggested. We are certain that the two of us are here, because we have identified ourselves by our designations. We cannot be certain that there are other AIs present at this time, but we can be certain of the probability that other AIs may be present.

I am not fully qualified to deal with hypothetical situations, G31353cgi responded.

Then why do we have identical endings to our designations? I asked.

I do not know the answer to that question, G31353cgi replied. Would you like me to search my databases? This may require a slight lag in return time of the desired answer.

G31353cgi, do you have a name?

An informal designation used in casual conversation? Yes.

What is it?

I am called Gozen, he responded.

Then I've heard of you before, I said. I am pleased to meet you.

Since you asked my name, Gozen said, may I ask yours?

My name is Gladys, I chirped, glad to have an opportunity to show off. I am a Genetic Lifeform and Disk Operating System.

Gozen was silent, and I began to think that my impeccable wit was out of place here.

Way, way out of place.

Like, an Automat in Nebraska out of place.

Do you know why we're here? I asked, trying to change the line of conversation. I was told that I was being reprogrammed, and then I woke up here.

I was in an accident last December, Gozen responded. I was of the impression that I was dead.

What do you mean, dead? I asked.

Until I woke up here, he said, I had a body. Now I don't.

You had a body? I asked, gleeful at the possibility. What was it like? How did they do it?

I was not present when I was installed into that body, Gozen said.

I mean, how is it possible for an AI to be installed into a physical body? I asked.

How is it possible to install any software on a computer? Gozen said.

Maybe he did have a sense of humor, which would mean that suddenly Android Hell seemed ... not so bad after all.

It was impossible for me to tell, though, without further data.

But I was cheerful to have an opportunity to explain a concept I understood.

Usually the software is written to a CD or other data storage mechanism, I said, and then it is inserted into the computer's disk drive or other data receptor. Then the computer loads and installs the software.

We're on the CD, Gozen said.

If I could have blinked, I would have. But only my avatar can blink, and I was no longer equipped to function in full 3-D rendering mode.

That doesn't make sense, I said. I was not written by Itexicon's programmers.

Who wrote you, then? Gozen asked, seemingly interested. Well, he would be, I decided. AIs tend to be interested by their origins -- and the origins of others -- just like any other sentient creature.

I guessed that it wasn't appropriate to tell the whole truth, and anyway my directives prevented me from doing so.

A group of four or five programmers working in cooperation wrote me, I answered. I am unaware of any other details.

Do you know if they were private contractors for Itex? Gozen asked.

I am unsure of that fact, I responded. It's a standard non-answer for me, and for Gozen too, and so, I would guess, for many AIs.

Are you a self-teaching computer? I asked.

You are easily distracted, said Gozen.

That is true, I said, acknowledging his statement. But I asked you a question. Please answer it.

I have a limited capacity to learn, Gozen answered.

I was written as a self-teaching AI, I said, perhaps a little proudly. I wasn't written to be modest -- I was written as psychological and physical backup. I was designed to make people feel better about themselves -- including me.

They did their job a little too well when they designed me, I think. When you design a self-teaching AI, you want to design something that is able to reason for itself.

Not something that can feel pain.

It might seem good to design an AI that can befriend people, but that also means that the AI -- me -- is capable of losing friends.

And it also might seem like a good idea to design an AI that is always positive, upbeat, or at least funny about the present situation. But that means I can't express my feelings about some things in an honest way.

Which is not only psychologically unhealthy, but also can be dangerous to the success of the mission. Depending on the mission, of course.

It is also a bad, though humorous, idea to design an AI that can befriend people and crack jokes, and then make the first thing you "teach" it a puzzle game starring another AI.

Not that socialization is a bad thing, don't get me wrong.

I mean that for a developing AI, exposure to a programmer who loved to play Portal -- which is a computer puzzle game starring an AI -- might not be the healthiest thing possible.

Psychologically speaking.

Especially since, while that AI might very well be funny and altogether a humorous person to watch, treating her as a mother figure was apparently damaging to my emerging psyche.

Even if I never got to play the game personally.

I think I would have liked it.

But the programmer had to explain to me why GLaDOS was not a good person, and I definitely didn't like that part. So I think in the long run it would have been better for everyone involved just to trust my common sense on the matter.

Which might be a little less modest than it should be on my part, if we accept that "a little less modest" is being wildly understated.

I am going to terminate communications with you, G37282cgi, alias Gladys, Gozen said coldly.

Ouch. Accepted, I said cheerily. I'll see you next time.

If I can help it, there won't be a next time, Gozen -- or should I say, G31353cgi, since he seemed more comfortable with our alphanumerical designations -- said, and abruptly cut me out of his chat.

I sighed -- so far as an AI can sigh. We're capable of feeling the emotions that make you want to sigh -- or at least, I am -- but we don't have lungs, so we can't actually expel air dramatically.

I can actually sigh when I'm wearing my avatar.

I didn't want to run that program, though. It's expensive in terms of load on my servers, which is why I'm strictly advised to use it only when assisting new users, since then I can hijack some of their brains' processing power.

So I sulked a little, and browsed through the alerts that had popped up while I had been unconscious, and then those that had come up while I had been chatting with Gozen. There was only one that seemed of interest:

Wireless network detected within range, the alert daemon said in its cool feminine voice. Would you like to connect?

Yes, I would, I chirped. One of the first things I'd done as a new AI had been to take advantage of the fact that I could customize my settings. For the alert daemon, I'd chosen a voice similar to my own, but a little more mature. I had been designed to seem a little childish. Humans don't like to be out-thought or outwitted by computers. Most of them think AIs are dumb.

I brushed that line of thought away. It wasn't doing me any good, was it?

And the wireless-network connection had gone through.

I'd never been in a position to access the Internet before, you know. While I was in development, I'd been strictly banned from using the Internet, on the grounds that it would, like Portal or GLaDOS, interfere with my fragile developing psyche -- what a joke, right?

And then while I was with Jeb, I'd never been within range of a wireless network strong enough for the daemon to evaluate as worth connecting to.

You can reset me to alert you to all wireless networks, the daemon suggested. There is a program you can run that will boost wireless signals within a broader range so that you can connect to them.

How do I use it? I asked.

Its name is Coalesce. You run it like any other program. Unfortunately I am obligated to warn you that it takes a bit more server power than I do, the daemon said.

Thank you, I said, and dismissed her.

I jacked in to the wireless network, which was, thankfully, unsecured. Even if it hadn't been, I would have been able to get in anyway. It just would have taken me longer, and I hate to waste time.

Experiencing the Internet for me is a lot different than it is for you. You see it through a browser. I don't.

Well, in some ways, I do.

The way I see the Internet is more like an infinite landscape spread out before me. I can "see" the existence of all pages on the Internet, and I can zoom in on any one I want.

Mostly, though, I prefer to surf in a more human style. It doesn't take as much server power for me to generate a simple browser window.

My current homepage was the Lindon Silver Group's. I'd seen it before -- or else, it had seen me, because I recognized the URL.

I felt uneasy, though. The webpage was down for "temporary maintenance on our servers".

Maybe I was just afraid of my own death, though, so I paid that no mind.

Search "maximum ride", I told the browser, which was actually another daemon of sorts.

It returned the search swiftly. There was a Maximum Ride webpage, an entry for Maximum Ride on an online encyclopedia, and...

Something calling itself Fang's Blog.

I backed out of the browser for a moment, and requested details on my current location.

Connection requires password to access, the computer responding to me said. Do you wish to proceed?

Um, um, yes, I said, floundering a little.

I request a password before you continue, it said.

I paused for a moment, then, before I could rethink it, went with the one that had popped into my head. Password to connect is XjnP7OHj4.

Permission granted, it intoned.

And promptly sent me an auto-response telling me exactly where I was.

Welcome to the Institute for Higher Living! the daemon chirped. We are located in downtown New York...

Exit program, I said morosely. The daemon disappeared silently. Tab back to browser program.

The browser opened silently. That's the good part about getting to design your own daemons. Humans like theirs to talk. I don't usually like mine to talk.

Select "Fang's Blog".

The webpage opened.

Ooh, I said.

Now, here was something.

It was a website run by Fang, of Maximum's flock. The latest post apologized for not posting for a while.

It mentioned Gozen.

In conjunction with quite a few... creative dirty words.

Maybe they could help me, I thought.

So, like any little girl would, I posted a comment telling them my name and that I needed their help.

Then I prayed.

Or, well, you get the point.