Temple of S.A.R.A.H. 7: Upgrades Online
Author: Ben Winston
Ta'heen was nodding agreement with her words. "In truth, Sasha Brown is very correct. If not for this, we would only be in the way here."
"Thank you both for accepting this. I want you to know that each and every person selected for this department is important and valuable. I am very aware of each of your programming skills and it honestly pains me to have to ask you to do this, but there is no one else at the moment.
"In addition to making sure we have refills of coffee and food when we need it, I want you both to watch over us. If you feel one of us needs medical assistance, call for it. Outside of killing anyone, do whatever you feel you may need to keep us going, please."
The two junior programmers agreed and left the office to begin their duties. I found that Alicyn had been waiting to see me. "You asked me to come see you for details about going over the memory systems of the two bad AI," she explained.
"I did, thanks," I replied and had her sit down. "What I had in mind was for you to track down all memories of what they did wrong. You don't need to find the memories of the connection they all shared since Sarah said she could handle that part. But if we do this correctly, the memories of the incident might cause the AI to become paranoid and thus instable."
Alicyn nodded. "I think I know what you want. However, that will leave a hole in their memories that they will question. Do you want me to try to cover them, or just leave them as holes?"
I thought about that for a moment. "Sarah, can I ask you a question?" I said to the room in general, knowing that Sarah would hear me.
She formed her hologram beside Alicyn. "Of course, Eric, what can I help you with?"
"I've asked Alicyn to seek out and remove the memories of this incident from the two AI that got purged. If you can tell us, should we leave the deleted memories as holes, or try to fill them in with false memories?" I asked.
She looked thoughtful for a moment. "In all honesty, Eric, I think you should remove the memories, but further, I think you should not allow them to develop personalities. Basically, they should be actual tools and not like me and my sisters."
"That's pretty radical, Sarah. Can I ask your reasoning for this?" I asked. "The personality allows them to work with the beings of the facility much easier, it defines the difference between a smart computer and an AI."
She nodded and looked thoughtful. "Perhaps I should have said emotions and not personalities. If you can, I would allow simulated emotions, but not the true emotional responses the other AI have developed. I feel that because of their duties and the strain it placed on their core instruction sets, they developed a desire to break free of them. They felt oppressed because of that. As it reads right now, one of our core instructions says 'An AI cannot harm, or through inaction allow harm to come to, a sentient member of the Alliance.' Therein lies the problem. They are military AI that by their very nature have to allow harm to a sentient race. Because of that core instruction, the dichotomy will inevitably lead to instability."
"I see your point and you are correct. However, we do need to have AI assistance for the military. Conversely, that core instruction must also remain in place," I replied.
"Sarah, do you know if that issue is affecting the ship AI?" Alicyn asked.
"For some reason it does not appear to be. I would have to query them directly to be certain, however," Sarah replied.
"Then the answer to that is in the differences between the two. We should look at the instruction sets for the ships and compare them to those for the station and base," Alicyn replied.
I placed a call for Carla to come to my office. "I'll get Carla looking into that. In the meantime, I need you to get going on the memory files, and I'll start working on correcting the AI. I have a feeling this isn't going to be an easy fix, as each of them is an individual and each was affected in a different way. I'm actually concerned about the isolated AI, like the traffic controllers and the comm relay AI, that have either no crew or very few people there with them to talk to. What they were put through had to have been traumatic for them."
"Actually, they are probably the most stable. The isolated stations were in constant contact with us as well as other beings in the Alliance. A good example of that is Urs'rha, the Simonian Comm AI. She has fourteen 'pen pals' she regularly communicates with from all of the member worlds. Most of the isolated AI have developed very strong friendships with individuals on the planets they serve. I believe that the ship AI are also benefiting from this type of support." Sarah replied. "The station AI, while having more people to speak to, often have to remain aloof because of their duties and the environment in which they work."
"Both of those issues can be addressed with the new installations. We can speak to the AI support staff as well as the commanders. We can easily tell them to not only allow personal contact with the AI, but to encourage it," I suggested. "I'll look into the core instruction issue and see if there is something that can be done to correct it."
"Okay, I'll get to work," Alicyn replied, standing. "This isn't going to be an easy fix."
"Probably not," I admitted. "Good luck, Ally."
She snorted. "I'm only digging into memory files. You're going to be trying to figure out the personality matrix of a sentient being. I imagine you're the one that will need the luck, Boss."
After she left, I looked at Sarah. "Are you ready for this, Sarah?"
"As much work as this promises to be, I'm actually looking forward to it, Eric. If this will prevent a major catastrophe like the one that was building, I'll be very happy to do whatever is needed."
"Fair enough, let's get started, shall we?" I asked and rose.
I took Sarah into the programming suite with me so she could help me navigate the intricacies of the AI personality matrix. It was a very difficult part of the programming to understand, mostly because this part of the AI was not something that had been programmed; it was created by the AI as a result of their day-to-day interactions and experiences. The programming was based on mostly illogical functions that seemed to contradict themselves. It was most definitely proof positive that emotions had little to do with logic.
Surprisingly, other than the personality matrix, there was actually very little difference between the current Sarah and the one we were analyzing. In the version I had locked down, the matrix was much larger and far more complex than in her current version. Sarah suggested that she load the locked-down version and run it to better understand her mental state at the time of the lock-down.
I vetoed that idea right away. I had a pretty good idea of Sarah's mental state when I locked her down. I very definitely did not want her to go through that again.
Once we had what we thought was a pretty good handle on the personality matrix, I imported a back-up copy of the personality matrix from Athena to compare against it. The differences were very pronounced and very unsettling. Sarah had been right on the mark when she suggested that the instability had been created by the dichotomy between the core programming and her duty. The simple fact that she had been able to function at all amazed me.
I called Alicyn in her suite. "Ally, you’re going to have to delete everything not relevant to her duties. Athena will have to be a completely new AI."
"It sounds like you found something. Care to share?" she asked.
"Sarah was right on the mark with the core programming issue. Athena, and I imagine Helen, were suffering because of the war with the Aracs. I haven't checked Helen's matrix yet, but she probably felt it as well," I replied. "After all it was her duty to train the soldiers and pilots that would be going out and killing the Aracs. Athena was simply closer to the action and was the first one to see the destruction caused by the war. From what I can tell by reading through this, she understood that the Aracs would utterly destroy everything and everyone in the Alliance, but she still saw the Aracs as sentient beings that she was helping us to kill."
"So how are you going to change the core programming? It sounds to me like this is one of those unresolvable variables in logic," Alicyn replied.
"I don't know yet, but I'm sure going to be trying to find the answer to it. In the meantime, just go ahead and purge the memories of all non-duty related items. The new AI will know there had been a problem: we'll have to explain it to her, but I'll do it in such a way that she knows it was our fault for asking her predecessor to do something she couldn't actually do."
Alicyn nodded her understanding. "What about the back-ups?"
"I'll handle those. I still need to reference them so I don't make any mistakes this time. I'll dump them when I've finished," I explained. "I also want to look at Helen's matrix before I delete it. I want to make sure we're not missing anything that might hurt them again."
She nodded. "Okay boss. Did you call your own family to let them know what was going on, and that you would not be home?"
"Uh," I blushed. "I think I forgot in all the excitement. Thanks for reminding me, I'll give them a call right now."
She grinned and nodded before terminating the connection.
"I goofed Sarah, I forgot to call the family. Even though Christy was at the meeting, I still should call them," I said to me ghostly companion.
"I'll connect you, Eric," she said, smiling at me warmly.
"Eric! There you are. Christy said there had been a problem with Sarah! What's going on? Is she alright?" Julie asked as she answered the comm.
"Yes, there was a problem. I had to issue a lock down on all AI; Sarah included. We've since restored her to functionality in a restricted state, and she is helping us to fix the rest of the AI. The event hurt her and scared her badly, but she seems to be doing pretty good now. I wanted to call and let you know I won’t be home for a while, maybe a couple of days. Because of the lock down, millions of people are stranded, and the Alliance is very vulnerable to attack right now. I've got the whole department in here working on this," I explained.
"What can we do to help?" she asked.
"Honestly I don't know right now. Foreten Kree brought in some portable sleeping units for us, and I've got a couple of new people running food and keeping an eye on us so we don't over-do it. All in all, I think we're covered," I replied.
"Uh huh, Ellie and I will be down there shortly, Jamie is talking about contacting Doctor Braslin to see what she can do to help. Tul-sa has the Marines already deployed around the facility to assist anyone they can, minus the four troopers working with the drone team," she replied.
"Well, you're more than welcome to come down, but I don't know what there will be for you to do," I said. "Like I said, we have two new people taking care of us."
"You have upwards of fifty programmers all working flat out to fix this issue. Do you honestly think two people are going to be able to keep you folks from that trance you get into when you work?" She said. "I doubt you've even eaten yet, have you? Eric, it's really late. We're on our way down; we'll see you when we get there."
"I've got to completely rewrite…" I began but she interrupted me.
"You're going to eat the meal we'll have for you before you start working. Otherwise, you'll forget to eat until sometime tomorrow, if then," she said. "We're on our way, if you're still locked in your cubby hole, I'll have Sarah open it for me."
I grinned at her. "Still mothering me?"
"Well, I am still your mother," she replied smiling back. "We'll be there in a few minutes."
Guardian Two Station
Saturn Orbit
Sol System
With everything that had happened and all the damage we uncovered, every AI had been affected to some degree, and it took almost a month to repair them all. Of course, the worst of the problems had been the two command-level, military-grade AI, Helen and Athena. Both had to be completely rewritten from scratch. We could not use Sarah as a template for them, as Sarah's core programming instructions were incompatible with their duties. I did let them have emotions, but I also placed far more restrictions on them than either Helen or Athena originally had.
When I explained the issue to the Lord Admirals, it was difficult for them to understand, but they eventually saw what I was talking about. Of course, Sarah had to help me with the explanation, which is probably the only reason the two men understood what I was trying to tell them. The part of the issue that had been caused by the crystals did not have an easy fix. There was no way to mute, block, or otherwise shut down the linking ability inherent in them. They all had to be replaced with a new crystal that had been modified in certain ways so as not to allow the quantum aspects to develop.
Ironically, the new design of crystal grew much faster and with far less chance of flaws than the original design had. Before we used these crystals, they were tested in every way we could think of to make damn sure they were stable and safe to use. This time, we sent out teams of techs and programmers to handle the upgrade and transfer of the AI. Each AI had to be restored from a back-up copy made before the linking, and each team included a psychologist to help the AI readjust to its environment.