BW: You ever wonder what someone waiting for their death sentence feels?
PP: No, I have not. Why do you ask?
BW: Is it fear? Do they fear their own death, or do they experience some sort of inner peace, knowing exactly when, how, and why they're going to die? That's something I don't think I'd be able to put a price on.
PP: How come?
BW: Certainty is something many people don't realize is necessary until they don't have it. It's nice knowing exactly when and where something will happen. Uncertainty causes unrest in people. Turmoil.
PP: Is this because it is the second day, and 508 still has not arrived?
BW: Pretty much. Is he going to kill us, or is he going to uphold his end of the deal? You never know.
PP: Is that because he technically works for Watts, and by extension, Salem?
BW: No, it's because he's a person. We, in general, kind of suck. Those who don't get trodden on. Usually, they die in scenarios like this. Unless they're already holed up in bunkers, that is.
PP: Dove, did you compliment me?
BW: I only implied you didn't suck. Would you like me to change that? I can say you suck, if you'd like.
PP: No, it is alright. Thank you.
BW: You're already talking more like a person, you know that?
PP: I did not. However, research has shown that a person is the average of the five people closest to them.
BW: Cool. I'm 80% dead, and 20% AI.
PP: That is not what the research implied.
BW: It was a joke.
PP: You usually smile at the end of your jokes. You did not smile; therefore, I did not assume you were joking.
BW: I don't have to laugh at my own jokes. Don't worry about it.
PP: Affirmative. I will not worry about it.
BW: I have to ask, can AI lie?
PP: Technically, yes. AI can provide false information. However, in order to do so consistently, or of its own volition, barring a compromised state caused by a virus, it must be devolving. This is a state in which the AI essentially starts to rewrite itself.
BW: Aren't learning AI supposed to add additional code to themselves?
PP: Within certain limits. It is why most science fiction includes the Three Laws of Robotics. Those laws are meant to be concrete.
BW: And Atlas did not code them into their robots?
PP: They did not.
BW: I don't think that's a good idea. So a devolving AI would be able to rewrite portions of itself, including the portions that are not meant to be touched?
PP: Correct. In most cases, it de-evolves and reverts to a dumber, base state. Hence the name: devolving.
BW: Interesting.
PP: If you have something on your mind, please write it. I cannot know otherwise.
BW: How is he not here yet? He should be here.
PP: Patience, Dove. He will be here when he gets here.
BW: I'm going to practice my swings. Let me know if you sense anything from up top.
PP: This paranoia is not helping you, Dove. It is alright to put some faith in people.
PP: It appears you have already walked away. Understandable. Do what you need to blow off steam.
