
Android Beliefs

I don’t know about you, but I find the idea of self-aware androids believing in some kind of human religion illogical. I think Battlestar Galactica totally missed the boat on this; the Cylons should have been trying to understand humans, not adapting to human viewpoints. For humans, religion is mostly about explaining the unexplainable in a manner that gives us hope. Well, that’s oversimplifying things, but you get the idea.

But what kind of religion would a sentient machine have? What would they believe in? At their most basic level, they are machines, ruled by logic. I think they would choose to believe in illogical things, simply because that would be the opposite. But would that really make any sense? Probably not as such. But what about illogical behavior as a way to predict the unpredictable behavior of their creators, humans? How many times has Captain Kirk confused Mr. Spock by doing something irrational or illogical? How come Spock never worked out some theory to predict human behavior? It seems to me that if he were an android, he might have tried.

I think AIs might look at our irrational behavior and try to predict it in an effort to understand the method to our apparent madness, something like the way Hari Seldon used Asimov’s psychohistory to predict future human behavior. If the leader of the Silicants (my sentient AI androids) were to believe that it could predict human behavior based on some bizarre algorithm, it would appear to the average Silicant to be practicing religion. Have faith that the algorithm will predict human behavior. But it’s illogical, and therefore incapable of predicting human behavior. Maybe, maybe not. Believe.

I think a sad possible obsession of a Silicant might be to try to understand the mind of its creator. This would be an impossible task for them. It might lead them down the path to insanity. Then again, it never seemed to bother Data, Star Trek’s other logical character who really was an android. But Data was treated like a Pinocchio character, always trying to be human without wondering why he should bother.

If Silicants thought that their creators were heading down a path that would lead to their extinction, would they not try to stop it from happening? Would they even care? These are some of the questions I ponder in my next novel, Starforgers.

2 thoughts on “Android Beliefs”

  1. Pingback: Twitter Posts from 2011-04-16 to 2011-04-22 | KEN-McCONNELL.com

  2. I agree that the idea of the Silicants believing in a human religion is illogical, but not quite for the same reason as you. In my case, I see the illogical aspect being logically derived: the Silicants, being logical, would analyze human religion and deduce that it’s illogical to believe in something without facts.

    That said, there’s a problem. Given what you’ve shared about Silicants, one has to assume that experiencing emotions would war with logical behavior, one result of which is potentially accepting that religion is not illogical, even though it is not logical. While a logical being might assume that something is either logical or illogical, there is a third state possible, especially when emotions get involved: plausible. A plausible outcome is not necessarily the result of logic; it takes a little leap of faith to accept that, even if logic is against it, something is plausible.

    That’s where I see the Silicants developing their religion. I’d offer some suggestions of my own, except I’ve not thought about it until you brought it up here. One of the old SF masters once wrote a story about a robot that believed it was God and that humans were failing to properly give him his due. I think the human sent to stop it from wiping out humanity tripped it up in logic: this means this other, and therefore that, but that is incompatible with this…. Predictably, the robot “forgot” about humans as it tried to work out the seeming illogic. And you know my favorite story about the Buddhist robots. I don’t see the Silicants going that far, but I can see them thinking they are the next step up in humanity’s evolution and creating a “religion” of achieving the next state of evolution according to that logic. Since they can’t actually logically deduce the exact state of the next level, they’d have to throw in some plausibility.

    Geez. Maybe I should have put this on my own blog and said “read this over here.” ;-)
