HonoreDB comments on A Thought on Pascal's Mugging - Less Wrong

Post author: komponisto 10 December 2010 06:08AM


Comment author: HonoreDB 12 December 2010 07:04:39AM 0 points

The problem, as stated, seems to me like it can be solved by precommitting not to negotiate with terrorists--this seems like a textbook case.

So switch it to Pascal's Philanthropist, who says "I offer you a choice: either you may take this $5 bill in my hand, or I will use my magic powers outside the universe to grant you 3^^^^3 units of utility."

But I'm actually not intuitively bothered by the thought of refusing the $5 in that case. It's an eccentric thing to do, but it may be rational. Can anybody give me a formulation of the problem where taking the magic powers claim seriously is obviously crazy?
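(The "3^^^^3" above is Knuth's up-arrow notation, where each extra arrow iterates the operation below it. A minimal recursive sketch of the hyperoperation, demonstrated on small arguments since 3↑↑↑↑3 itself is far too large to compute:)

```python
def knuth(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b in Knuth's up-arrow notation (n >= 1)."""
    if n == 1:
        return a ** b          # one arrow is ordinary exponentiation
    if b == 0:
        return 1               # standard convention: a ↑^n 0 = 1
    # a ↑^n b = a ↑^(n-1) (a ↑^n (b - 1))
    return knuth(a, n - 1, knuth(a, n, b - 1))

print(knuth(2, 2, 3))  # 2↑↑3 = 2^(2^2) = 16
print(knuth(3, 2, 2))  # 3↑↑2 = 3^3 = 27
print(knuth(2, 3, 2))  # 2↑↑↑2 = 2↑↑2 = 4
```

Already 3↑↑3 = 3^27 ≈ 7.6 trillion; 3^^^^3 (four arrows) dwarfs anything physically representable.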

Comment author: Polymeron 04 May 2011 11:24:09AM 0 points

The two situations are not necessarily equivalent.

See my most recent response in the Pascal's Mugging thread: taking the Mugger's intentions and motives into account is relevant to the probability calculation.

Having said that, the two situations probably ARE equivalent: in both cases, an increasingly high number indicates a higher probability that you are being manipulated.
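(One way to make this precise: if the probability that the claim is honest falls faster than the claimed stakes grow, the expected gain from paying shrinks as the number rises. A toy sketch, assuming a hypothetical 1/n² penalty schedule purely for illustration:)

```python
from fractions import Fraction

def expected_gain(n: int) -> Fraction:
    """Expected utility of complying with a claim of n units,
    under an ASSUMED honesty probability of 1/n**2."""
    p_honest = Fraction(1, n ** 2)
    return p_honest * n  # = 1/n: the bigger the claim, the less it's worth

for n in (10, 10**6, 10**100):
    print(n, expected_gain(n))
```

Under such a schedule the mugger cannot win by naming a bigger number; whether real-world manipulation probabilities scale this way is exactly what's under dispute.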

Comment author: wedrifid 13 December 2010 07:26:40AM 0 points

The problem, as stated, seems to me like it can be solved by precommitting not to negotiate with terrorists--this seems like a textbook case.

That can work when the mugger is a terrorist. Unfortunately, most muggers aren't. They're businessmen. Since the 'threat' issue isn't intended to be the salient feature of the question, we can perhaps specify that the mugger would be paid $3 to run the simulation and is just talking to you in the hope of getting a better offer. You do negotiate under those circumstances.

For my part I don't like the specification of the problem as found on the wiki at all:

Now suppose someone comes to me and says, "Give me five dollars, or I'll use my magic powers from outside the Matrix to run a Turing machine that simulates and kills 3^^^^3 people."

Quite aside from the 'threat' issue, I just don't care what some schmuck simulates on a Turing machine outside the matrix. That is a distraction.

Comment author: HonoreDB 13 December 2010 06:33:14AM 0 points

No responses and a downvote. Clearly I'm missing something obvious.

Comment author: komponisto 13 December 2010 06:52:50AM 0 points

I wasn't the downvoter (nor the upvoter), and wouldn't have downvoted; but I would suggest considering the abstract version of the problem:

Given that, in general, a Turing machine can increase in utility vastly faster than it increases in complexity, how should an Occam-abiding mind avoid being dominated by tiny probabilities of vast utilities?
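(The tension can be seen numerically: an Occam prior weights an n-bit hypothesis by roughly 2^-n, but the utility a short program can name grows far faster, so the prior-weighted utility still diverges. A sketch using iterated exponentiation 2↑↑n as a stand-in for "fast-growing utility"; the pairing of n bits with tower height n is an illustrative assumption:)

```python
def tower(n: int) -> int:
    """Iterated exponential 2↑↑n: 2^(2^(...^2)), n copies of 2."""
    return 1 if n == 0 else 2 ** tower(n - 1)

# Occam prior ~ 2**-n per n-bit hypothesis, utility ~ tower(n).
# The product tower(n) / 2**n still explodes as n grows:
for n in range(1, 5):
    print(n, tower(n) // 2 ** n)
```

The products run 1, 1, 2, 4096, and at n = 5 the prior-weighted utility already exceeds 2^65000: the 2^-n discount is no match for the growth rate, which is the crux of the question above.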