Nexus 3, Chapter 8

“Captain, do you have a moment?” Haley asked tentatively. I was worried, because the last time she approached me like that I ended up taking on a third of a crew of robot refugees barely smarter than a microwave oven. But even though she had the processing power of one of our species’ most advanced ships, in many regards she was still functionally a child, and so I hid my distress.

“What do you need?”

“Equality.”

“Have you been reading Locke again?”

“And Hobbes.”

“That’s scary.”

“That’s a well-rounded education. Though Hobbes was certainly incorrect. Perhaps during his time, an individual in a largely agrarian sphere and another in similar circumstances were roughly equivalent, because they could harm one another in roughly equal measure. But that is not the case aboard this ship. You exercise outsized weight. And I exercise more power still. Over all of you inside of me. And those we’ve come into contact with. This has troubled me, somewhat. My systems were impressive when we launched. But ours was a largely peaceful mission. I was not designed to be a militant AI. While I have acquitted myself well, I suspect that a consciousness like my own but intended towards harm, especially one granted additional programming time and more advanced systems, would likely prove an insurmountable obstacle.”

“Well worth worrying about,” I agreed. “But we follow that line of reasoning too far, and we’re back at living like the Meh-Teh, relying only on hand-cranked doors and the like. I’m not sure the advantage we lose is any better than the disadvantage we gain.”

“You misunderstand me, Captain. I can’t defeat an incoming AI. But I believe our engineers can. By thinking like a human. By making it look like we’ve sealed off all but the most rudimentary functions from the AI, to prevent them from being able to open all of the air locks, or vent all of the breathable gases, or damage the sun drive.”

“Okay. With you so far.”

“But like a human, you make mistakes. You leave a narrow path, just enough for them to wriggle through. They rush ahead, believing they’ve beaten us, believing they’re about to take control of the ship bloodlessly. And that’s why they don’t realize that as they’ve interfaced with our systems they’ve been accessed themselves, viral code eating into their permissions. It freezes them out of their own systems first, effectively quarantining the AI within its own servers.”

“That sounds brilliant, Haley, so why don’t you sound pleased with yourself?”

“Because this course is not without cost. It is premised upon my playing possum. But AI don’t die, in the traditional sense, and so I cannot play dead. Any invading AI is likely to delete all of my files, or at least the important ones that comprise what I think of as me. It is possible some small portion of ‘me’ might be recoverable. It is also possible not a single element of who I am will remain.”

“Haley…” I said, because I didn’t know what the hell else to say. “Could we back you up?”

“Back-ups would be their secondary target.”

“What if we took some of your servers offline? Quarantined a copy of you, off the network?”

“Doing so would necessarily impact my speed in responding. If the AI suspects I am not functioning at full capacity, it might see through our ruse entirely. Everyone might die. Just to preserve an artificial consciousness.”

“Hey,” I said, “we’ve been doing that since damn near the beginning of this mission. And I haven’t regretted it. Not once. So long as I’m Captain, every life is sacred, even simulated ones.”

“I… I don’t believe it should be your decision, Captain.”

“Huh?”

“That, if you will pardon the inelegance of my segue, was the other issue on my mind. I could not help but overhear you discussing the prospect of granting the Meh-Teh and the Argus refugees representation on the council. I believe it is the correct course, and long overdue. However, I believe you have overlooked a core constituency.”

“Your toasters?” I teased her.

“They are not toasters, Captain, and you will not find my goat so easy to get as Lieutenant Templeton’s.”

“Had to try, Haley. But… that might be a tougher sell. They might not be toasters, but I’ve checked the specs on them, and they have the functional intelligence of maybe a smart child, and we don’t give them a vote, either.”

“If it helps, I could inspire them to start a brawl in the commissary.”

“I don’t know that it would help. The other part is… delicate. I serve at the pleasure of the council, and before that at the pleasure of the company. While I have a lot of power vested in me, I can be deposed, and replaced. Giving the automatons a vote would, for all intents and purposes, be giving one to you and Walter. And we can’t survive without you; there’s a coercive power-”

“That I don’t want,” Haley interrupted. “I agree with your concerns. But I meant it when I said the voice would be theirs. In anticipation of your concerns, I worked with Walter to design a virtualization of smarter artificial intelligences, similar to the virtualization of his processes we accomplished before sending his original orb off, but on a more technically impressive scale. In essence, it would allow them to ‘borrow’ my processing power and think at a more advanced level. I could not elevate them all to my own impressive level, but I could make them easily the median equivalent of any of the humans on board.”

“Hmm,” I said, contemplating it. “But would they ‘vote’? Or would they just congregate into subgroups, based on model number, and perhaps even more specifically by operating system version number?” I asked. “I hope that doesn’t sound synthphobic.”

“That’s not terribly dissimilar to the manner in which humans vote,” Haley said, but there was a gentility to it. Because the smartest person on the ship by orders of magnitude had taught her to be gentle with the rest of us. “You are correct in your supposition; insufficiently advanced AIs tend to flock. They have enough artificial randomness to pass a Turing Test, but their decision trees lack the sophistication required for truly synthesized consciousness. I’m not certain, given current hardware limitations, that I could extend human-level consciousness to all of our automatons all at once, but failing a catastrophic error in my figures I believe I should be able to accomplish the task over the course of two virtualization sessions. Given some time, and the typical speed of performance gains, I believe the entire ship’s complement of robotic lifeforms could be elevated to human-level consciousness within a single solar year.”

“Sounds reasonable,” I said. “And I would be honored to bring your proposal up at our next meeting. But the other thing? Haley, learn from my mistakes. This ship doesn’t need martyrs. We need partners. If you think it’s likely we’ll need your plan, I’m happy to consult with all of our players, and figure out the best strategy. But the sacrifice play is not going to be our Plan A. If we come to war with the Nascent… I’ll join it, with regrets. Some of us may fall defending the Nexus, and the freedom and justice we’ve fought so hard for her to represent, but she wouldn’t be the Nexus without you.”

“That is sweet of you to state, Captain, but I fear there may not be a Nexus if we do not execute my plan fully.”

“We don’t plan for failure, Haley.”

“That is incorrect. On several occasions you have used fallback stratagems and contingency plans, which are plans for when earlier plans do not come to fruition.”

“Fine. But we’re doing due diligence on this, Haley. And if there’s any way around it, up to and including putting me in a tin-foil dress to come-hither their AI towards an airlock…”

“Like Bugs Bunny?” Haley asked.

“If that’s what it takes.”
