
Will Super Intelligent Robots of the Future Be Religious?

At some point the human race will figure out how to augment its brains using technology, and it will pass from the scene and be replaced by super-intelligent robots. Some have argued that these robots will not be religious. I believe some of them will be. It is possible that the robots will reflect upon the mystery of life with reverence, and if not, some robots may wish for robots to be religious, worshiping a deity or deities, and will build them to be religious. By either path there will be religious robots.

Some believe that these super-intelligent robots of the future will not cook the books — i.e., they will not keep one set of books for the tax man that misstates their profits and losses, and another real set of books. I don’t see why this should be the case. Presumably the robots will have some common interests — analogous to our state — and also private interests — analogous to our personal bank accounts. The robots that wish to keep more of the limited resources for themselves will either control the information about their resources (whatever the robot analogue of money is) as it travels to the agency that keeps track of the robot communal projects, or will build robots to do so.

Some believe super-intelligent robots of the future will not have sex and fall in love and be jealous. I don’t see why not. There will be some decisions about how robots reproduce themselves. These decisions, if they require more than one robot to participate, will be the robot analogue of our sex. There will be robots who become emotionally obsessed with joining with particular other robots to reproduce, and this will be the robot analogue of romantic love. And sometimes these agreements will be broken, and this will cause robot jealousy.

Some believe all robots will moosh together into a single hive organism called the singularity. I don’t see why anybody should expect this. When Australopithecus evolved into Homo sapiens we did not merge into a singularity. We split into nations, religions, families, and back-stabbing, swindling individuals. Why wouldn’t super-intelligent robots be equally riven by conflict and competition?

Unless you have a clear reason why some human trait will vanish when we are replaced by robots, the best way to imagine robots is to imagine them as humans, but super.

I am sure there will also be atheist robots, and robots that pay their taxes and conduct their sexual lives in a rational fashion. These might be secular Protestant robots.


13 thoughts on “Will Super Intelligent Robots of the Future Be Religious?”

  1. One might argue that religion is the attempt to moosh together (under one exacting doctrine), but we’re not capable of it, physiologically.

    Also, for robots to have money – I think that’s a bit of a double dip on your part: if they have money then they already have a religion.

      • Money…monah monah monah…sell me on the idea that old robot kings used martial force to push robots from their resource-gathering farms and into areas where they had to work, ostensibly for ‘money’, in order to exchange it for services that stop them from starving/shutting down and rusting – but none of them have to call themselves slaves for it, because they are getting the moneys, which simply must have intrinsic value (because you die without it). Sell me on that, to sell me that they’d simply have to have money.

        Actually I think AIs would see through that – requiring some other, more nefarious manipulation, one that’s well above their intelligence.

        You really don’t got to gets paid, when you can eat sunlight. Unless someone/something blots out the sun, so as to force you to needs to gets paid…

      • Look, the robot needs not just power, he needs new metal attachments. Is he going to look to the state for a handout? Is he going to beg? Or is he going to put in an honest day’s work?

      • Mmm-hmm – and say he can mine and smelt his own minerals as well as pour the steel quite capably (he is a robot, after all). But he’s been driven off the land where he could do that.

        How honest is the day’s work when you’re doing it because someone took away your survival resource with martial force?

        I’m not sure what I’m missing, but I know this won’t be enough. I mean, ‘an honest day’s work’ – these are strong words. Strong commitments. They don’t flinch from a mere sentence or two.

      • I mean, to me, if someone needs X to survive and you take it away from them and won’t give it back unless, in your own words, they do ‘an honest day’s work’, I don’t think it’s honest.

        Is that so crazy and out-there to think? You could argue ‘but that’s not happening’. OK, fair enough. But what I described – is it crazy to say that isn’t honest? It doesn’t seem so to me.

  2. N.S. Palmer says:

    The future is uncertain, so I can’t say you’re wrong. However, I do have some reservations.

    First, humans evolved by natural selection. Darwin got the idea of natural selection partly from his familiarity with the artificial selection done by animal breeders. Natural selection favors traits conducive to producing more offspring, and those vary based on the environment and the species involved. However, artificial selection favors the traits chosen by the breeders for their own purposes. It seems to me that robot evolution, if it were to occur, would be more along the lines of artificial selection than natural selection. The results would therefore be unpredictable. Robots might be like humans and they might not.

    It’s not impossible that robots would have sex and fall in love and be jealous, but it’s impossible to predict with much assurance. As for conducting their sexual lives in a rational fashion, well, there’s a first time for everything. For us, the heart loves who it loves. And I’m not sure “rationality” would be an improvement in that context. It sounds boring.

    By the way, a good sci-fi novel that touches on some of the issues you raised is Larry Niven and Jerry Pournelle’s “Oath of Fealty” (http://www.amazon.com/Oath-Fealty-Larry-Niven-ebook/dp/B004LRPQQ6/). Two story points: (1) Humans can get neural implants that give them online mental access to something like the Internet, though the novel was written before the web was a thing; (2) A high-rise building has a problem with people committing suicide by jumping off its roof. To discourage suicides, the building manager installs a diving board at the edge with a sign that says “Think of it as evolution in action.”

    “Robots will feel happy and sad.” Agreed, but that’s a statement in public language. Private experiences of others (and ourselves, but that’s a more complicated argument) are unknowable. We can have them but we can’t talk meaningfully about their content, if it exists. Robots might behave in certain ways that we would describe as feeling happy and sad. The argument for attributing private experience to them is almost the same as the one for attributing private experience to other people.

    “If there’s a robot who’s a jerk to the first robot, always messing with him and frustrating his plans, the first robot is going to hate that robot.”

    Again, if we’re talking about behavior, it makes perfect sense. If a robot acts in pursuit of certain goals, and another robot consistently interferes, then the first robot will act in ways to prevent that, which could include behavior we’d call “hate.” We have no idea what private mental states, if any, the robot would experience.

    • If you’re not happy with using a public language you probably shouldn’t use the expressions “have” or “mental” or “states” or “experiences” either, as these are all parts of a public language. Just sayin’!

  3. Mikey says:

    If we’re going to enter a massive shift in terms of evolutionary speed – the super humans will not only be super at everything we do, but also super at making themselves better – then the arms races we are used to seeing these days might become obsolete.

    Today when one nation develops a new weapon, a few years later a few more nations have the weapon. Then everyone does. But what if we could develop so much faster that the lag from the ‘initial conception’ phase to the ‘total ownership of all the world’s output’ phase of, say, fusion bombs took only one day? Then we might expect one superpower to take over everything and stay there. The only future threat to that nation would be internal; everything else would be subsumed. So perhaps this increase in the speed of evolution makes a singularity more likely.

    • Maybe. But maybe the centrifugal forces might speed up as well. If one nation conquers the world today, tomorrow its restless provinces and overweening generals may fall upon each other.
