I don’t know about you, but I find the idea of self-aware androids believing in some kind of human religion illogical. I think Battlestar Galactica totally missed the boat on this; the Cylons should have been trying to understand humans, not adopting human viewpoints. For humans, religion is mostly about explaining the unexplainable in a manner that gives us hope. Well, that’s oversimplifying things, but you get the idea.
But what kind of religion would a sentient machine have? What would they believe in? At their most basic level, they are machines, ruled by logic. I think they would choose to believe in illogical things, simply because that would be the opposite of their nature. But would that really make any sense? Probably not as such. But what about illogical behavior as a way to predict the unpredictable behavior of their creators – humans? How many times has Captain Kirk confused Mr. Spock by doing something irrational or illogical? How come Spock never worked out some theory to predict human behavior? It seems to me that if he were an android, he might have tried.
I think AIs might look at our irrational behavior and try to predict it in an effort to understand the method to our apparent madness, something similar to the way Asimov’s Hari Seldon used psychohistory to predict future human behavior. If the leader of the Silicants (my sentient AI androids) were to believe that it could predict human behavior with some bizarre algorithm, it would appear to the average Silicant to be practicing religion. Have faith that the algorithm will predict human behavior. But it’s illogical and therefore incapable of prediction. Maybe, maybe not. Believe.
I think a sad possible obsession for a Silicant might be trying to understand the mind of its creator. This would be an impossible task for them, and it might lead them down the path to insanity. Then again, it never seemed to bother Data, Star Trek’s other logical character, who really was an android. But Data was treated like a Pinocchio character, always trying to be human without wondering why he should bother.
If Silicants thought that their creators were heading down a path that would lead to their extinction, would they not try to stop it from happening? Would they even care? These are some of the questions I ponder in my next novel, Starforgers.