lessdazed comments on [SEQ RERUN] Pascal's Mugging: Tiny Probabilities of Vast Utilities - Less Wrong

Post author: MinibearRex 01 October 2011 02:59AM


Comment author: DanielLC 01 October 2011 04:17:17AM 0 points [-]

Has Eliezer come up with a solution to this?

Comment author: lessdazed 01 October 2011 05:24:16AM *  0 points [-]

Why isn't something like this the answer?

The statement "Do X or I will use magic powers to cause maximum badness according to your desires" is so unlikely to be true that I don't see how one can justify confidence that the being uttering it is more likely to do as it says than to do the opposite: if you give the being the five dollars it asked for, it creates and painfully kills 3^^^^3 people; if you do not, nothing happens (even though it asked for the five dollars as payment for *not* creating and torturing people).

How can you say that a magic being that either cares about your money or is obviously testing you would likely do as it said it would?

Comment author: DanielLC 01 October 2011 07:45:37PM 0 points [-]

I was hoping for Eliezer's answer. If you have an answer, I'd advise posting it separately.

As for your answer, suppose it's more likely that he'll torture 3^^^^3 people if you give him the money. Now you can't give him the money. Now he's just Pascal mugging you into not giving him money. It's the same principle.

Also, the same principle can be applied in infinitely many ways. I'm sure there's some variant in which the correct choice turns out to be one you wouldn't otherwise have made.

Comment author: lessdazed 01 October 2011 08:48:12PM *  -1 points [-]

> It's the same principle.

It's not at all the same. This is not a problem invoking Omega; if you want that, see the Lifespan Dilemma.

If we knew Omega had a 3^^^^3-sided die and would kill the people if it landed on one, then I'd shut up and calculate.

Pascal's Wager involves much more uncertainty than that: uncertainty about the character speaking. Once a being claims it has magic and wants you to do something, then to the extent one believes the magic part, one loses one's basis for judging the being as truthful, non-whimsical, etc.

Comment author: DanielLC 01 October 2011 09:13:52PM 2 points [-]

Are you arguing that he's more likely to torture them if you give him the money, that the probabilities are the same to within one part in 3^^^^3, or that since it's not a die, probability works fundamentally differently?

My response was assuming the first. The second is ridiculous, and I don't think anyone would consider it were it not for the bias toward giving round numbers for probabilities. If it's the third, I'd suggest reading Probability Is in the Mind. You don't know which side the die will land on; this is no different from not knowing what kind of a person the character is.

Comment author: lessdazed 01 October 2011 09:58:11PM *  0 points [-]

> suppose it's more likely that he'll torture 3^^^^3 people if you give him the money

That's a different problem from Pascal's Wager. Taking it back to the original, it would be like saying "Convert to Christianity pro forma for a chance at heaven rather than no chance of heaven, ignoring all other magical options." The problem with this isn't the quantities of utility involved; it's the assumption that a god who cares about such conversions to Christianity is the only candidate for a divine, rather than a God of Islam who would burn Christian converts hotter than atheists, or a PC Christian god who would have a heaven for all who were honest with themselves and didn't go through pro forma conversions. The answer to the wager is that it is a dumb assumption to grant one story about magic more probability than all other forms of magic combined.

It's fine to consider Pascal's Wager*, where Pascal's Wager* is posed under the assumption that our interlocutor is trustworthy, but that's a different problem, well articulated as the Lifespan Dilemma, which is legitimately posed as a separate problem.

As probability is in the mind, when I ask "what would a magical being of infinite power be doing if it asked me for something while disguised as a probably non-magical being?", my best guess is that it is a test with small consequences, and I can't distinguish between the chances of "it's serious" and "it's a sadistic being who will do the opposite of what it said."

Comment author: DanielLC 01 October 2011 11:04:31PM 3 points [-]

> The problem with this isn't the quantities of utility involved, it's the assumption that a god who cares about such conversions to Christianity is the only option for a divine, rather than a God of Islam who would burn Christian converts hotter than atheists, or a PC Christian god who would have a heaven for all who were honest with themselves and didn't go through pro forma conversions.

Each of these possibilities has some probability associated with it. Taking them all into account, what is the expected utility of being a Christian? One may ignore those to make the question simpler, but unless all the possibilities cancel out nicely, you're still going to end up with something.

> The answer to the wager is that the random assumption that all forms of magic but one have less probability than that one story about magic is a dumb assumption.

Perhaps no one possibility outweighs all the rest, but if you add them all together, they'd point in one general direction. The total is so close to zero that, if you tried to calculate it, you'd barely be able to do better than chance. You'd still be able to do better, though.
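[Editor's note: DanielLC's summation argument can be sketched as a toy calculation. All probabilities and utilities below are invented purely for illustration; the point is only that many near-cancelling hypotheses can still leave a small nonzero expected-utility residue.]

```python
# Toy model: hypotheses about what a deity rewards nearly cancel,
# but any asymmetry leaves a nonzero expected utility.
# All numbers are invented for illustration.

hypotheses = [
    # (probability, utility of converting under that hypothesis)
    (1e-9,  1e6),   # a god rewards pro forma conversion
    (1e-9, -1e6),   # a god punishes pro forma conversion
    (2e-9,  1e6),   # slight asymmetry: one more pro-conversion story
]

expected_utility = sum(p * u for p, u in hypotheses)
print(expected_utility)  # small but nonzero: 0.002
```

The first two terms cancel exactly; the third supplies the residue, which is "so close to zero" yet still points in one direction.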

Comment author: lessdazed 02 October 2011 12:43:13AM *  0 points [-]

> You'd still be able to do better, though.

I think there is a significant chance you are right, but that it is less than .5. I hope others can add to this discussion. I am reminded of this: if you tell me I am seeing an actual banana that I am holding, rather than an image my brain made of a collection of atoms, then... I don't even know anymore.

Comment author: Eugine_Nier 02 October 2011 03:11:47AM 1 point [-]

If one attempts to do calculations taking all permutations of Pascal's mugging into account, one gets ∞ − ∞ as the result of all one's expected utility calculations.

Comment author: lessdazed 02 October 2011 03:13:55AM 0 points [-]

What are the consequences of that?

Comment author: Eugine_Nier 02 October 2011 03:22:50AM 1 point [-]

We have no idea how to do expected utility calculations in these kinds of situations. Furthermore, even if the AI figured out some way, e.g., using some form of renormalization, we have no reason to believe the result would at all resemble our preferences.