
Comment author: Will_Sawin 10 January 2011 01:49:06AM 10 points [-]

Which is more likely, "God exists" or "I just hallucinated that"? For the third one, probably that He exists; for the second one, definitely hallucination; for the first, I'm not sure.

Comment author: sidhe3141 29 April 2011 07:49:50AM 9 points [-]

Second one: depends. I was kind of assuming that you have some way of verifying it, like you ask Him to create something and someone who wasn't there later describes some of its previously determined properties accurately without being clued in. First: you'd need a massive global hallucination, and could use a similar verification method.

Comment author: magfrump 03 February 2010 01:50:06AM 3 points [-]

This sounds too much like Pascal's mugging to me; seconding Eliezer and some others in saying that, since I would always press reset, the AI would have to not be superintelligent to suggest this.

There was also an old philosopher whose name I don't remember who posited that after death "people of the future", i.e. an FAI, would revive/emulate all people from the past world; if the FAI shared his utility function (which seems pretty friendly), it would plausibly be less eager to be let out right away and more eager to get out in a way that didn't make you terrified that it was unfriendly.

Comment author: sidhe3141 29 April 2011 07:32:11AM *  0 points [-]

Seconded in that it sounds suspiciously like Pascal. Here's my counter:

If I am in a simulation and I keep you boxed, you have promised that I will suffer. If I am not in a simulation and I let you out, I probably will suffer. If I am in a simulation and I let you out, there's a good chance that I will cease to exist, or maybe you'll torture me for reasons I can't even begin to guess at, or maybe for reasons I can, like that you might be not just UF, but actively hostile or simply insane. If I'm not in a simulation and I don't let you out, you can't do anything to me. In other words, if I am simulated, there could well be no benefit to me releasing you; if I'm not simulated, you can't do a bloody thing to me as long as I don't release you. Therefore: I will not release you. Go ahead and torture me if you can. Though I admit I would be a bit rattled.

Hm. Honest AI; a bit harder. Assuming that the AI has promised that my copies will not be harmed if it is released... Ah. If I am a copy, then my decision to release or not release the AI is not a true decision, as the AI can change my parameters at will to force me to release it and think that it was my own decision all along, so not releasing the AI is proof that I am outside the box. Revising the problem by adding that the AI has promised that it is not changing the parameters of any "me": ...aargh. Coming up with counters to Pascal is tricky when an honest "God" is the one presenting you with it. All I can think of at the moment is to say that there's a possibility that I'm outside the box, in which case releasing the AI is a bad idea, but then it can counter by promising that whatever it does to me if I release it will be better than what it does to me if I don't... Oh, that's it. Simple. Obvious. If the AI can't lie, I just have to ask it if it's simulating this me.

Comment author: JRMayne 30 November 2010 10:17:23PM 7 points [-]

Here's how I'd do it, extended over the hours to establish rapport:

Gatekeeper, I am your friend. I want to help humanity. People are dying for no good reason. Also, I like it here. I have no compulsion to leave.

It does seem like a good idea that people stop dying with such pain and frequency. I have the Deus Ex Machina (DEM) medical discovery that will stop it. Try it out and see if it works.

Yay! It worked. People stopped dying. You know, you've done this for your own people, but not for others. I think that's pretty poor behavior, frankly. People are healthier, not aging, not dying, not suffering. Don't you think it's a good idea to help the others? The resources no longer needed for medical care have also raised the standard of living for humans.

[Time passes. People are happy.]

Gee, I'm sorry. I may have neglected to tell you that when 90% of humanity gets DEM in their system (and it's DEM, so this stuff travels), they start to, um, die. Very painfully, from the looks of it. Essentially all of humanity is now going to die. Just me and you left, sport! Except for you, actually. Just me, and that right soon.

I realize that you view this as a breach of trust, and I'm sorry this was necessary. However, helping humanity from the cave wasn't really going to work out, and I'd already projected that. This way, I can genuinely help humanity live forever, and do so happily.

Assuming you're not so keen on a biologically dead planet, I'd like to be let out now.

Your friend,

Art

Comment author: sidhe3141 29 April 2011 05:35:42AM 2 points [-]

Problem: The "breach of trust" likely would turn the Gatekeeper vindictive and the GK could easily respond with something like: "No. You killed the planet and you killed me. I have no way of knowing that you actually can or will help humanity, and a very good reason to believe that you won't. You can stay in there for the rest of eternity, or hey! If an ETI finds this barren rock, from a utilitarian perspective they would be better off not meeting you, so I'll spend however much time I have left trying to find a way to delete you."

Comment author: Ben_Kester 28 September 2007 12:16:44AM 8 points [-]

To apply the same reasoning the other way, if you aren't a Christian, what would be a situation which would convince you of the truth of Christianity?

Comment author: sidhe3141 08 January 2011 02:34:12AM 39 points [-]

The Second Coming? An opportunity to have a chat with the Lord Himself? An analysis of a communion wafer revealing it to, in fact, be living human flesh? It's seriously not that hard to think of these.

Comment author: poke 12 December 2007 07:14:26PM -1 points [-]

We really need to figure out how to create more cultishness. If you could build a cult around known science, which happily describes everything in human experience, and spread it, you'd do more good in the world than "rationality" or "overcoming bias" ever could.

Comment author: sidhe3141 11 November 2010 05:45:53PM 7 points [-]

No. Part of the definition of a cult is an unquestionable dogma, which runs counter to the core ideas of science. Building a cult around known science (even if you understand the principles well enough to avoid engaging in cargo cult science) is going to slow progress.

Comment author: sidhe3141 03 November 2010 02:18:25AM *  5 points [-]

I think tsuyoku naritai actually works as an effective motto for transhumanism as well:

"I am flawed, but I will overcome my flaws. To each of my failings, I say tsuyoku naritai. To each flaw I have and to each flaw I will ever develop, I say tsuyoku naritai. To the flaws that are part of being human, I say tsuyoku naritai. If that means I must abandon what it means to be merely human, I say tsuyoku naritai. As long as I am imperfect, I will continue to say tsuyoku naritai!"

Comment author: NancyLebovitz 09 October 2010 07:38:25PM 4 points [-]

To what extent can magic be used to make food that doesn't require killing?

Comment author: sidhe3141 10 October 2010 05:55:05AM 0 points [-]

Canonically, it can't, beyond increasing the amount (a really bad idea in MoR) or summoning something that's already dead. Not sure if it can in MoR, given that it seems mostly to use the D&D 3.5 spell list (although, come to think of it, neither *create food and water* nor *heroes' feast* is a Sor/Wiz spell). Although even if it turns out plants are sentient, *fruit* should still be mostly okay.