I still don't see how the two situations are different--for example, if I was talking to someone selling cryonics, wouldn't that be qualitatively the same as Pascal's Mugging?
Nah, the cryonics agent isn't trying to mug you! (Er, hopefully.) He's just giving you two options and letting you calculate.
In the case of Pascal's Mugging, both choices lead to negative expected utility as defined by the problem. Hence you look for a third option, and in this case, you find one: ignore all blackmailers; tell them to go ahead and torture all those people, you don't care. Unless they find joy in torturing people (then you're screwed) they have no incentive to actually use up the resources to go through with it. So they leave you alone, 'cuz you won't budge.
Cryonics is a lot simpler in its nature, but a lot harder to calculate. You have two options, and the options are given to you by reality, not an agent you can outwit. (Throwing in a cryonics agent doesn't change anything.) When you have to choose between the binary options of cryonics versus no cryonics, it's just a matter of seeing which decision is better (or worse). It could be that both are bad, like in the Pascal's mugger scenario, but in that case you're just screwed: reality likes to make you suffer, and you have to take the best possible world. Telling reality that it can go ahead and give you tons of disutility doesn't take away its incentive to give you tons of disutility. There's no way out of the problem.
That opens a whole new can of worms that it's far too late at night for me to address, but I'm thinking of writing a post on this soon, perhaps tomorrow.
Cool! Be careful not to generalize too much, though: there might be bad general trends, but no one likes to be yelled at for things they didn't do. Try to frame it as humbly as possible, maybe. Sounding unsure of your position when arguing against LW norms gets you disproportionately large amounts of karma. Game the system!
In the case of Pascal's Mugging, both choices lead to negative expected utility as defined by the problem. Hence you look for a third option, and in this case, you find one: ignore all blackmailers; tell them to go ahead and torture all those people, you don't care.
That works for the LW version of the problem (and I understand why it does), but not for Bostrom's original formulation. In that version the mugger claims to have magic powers and will give Pascal quadrillions of utility if he hands over his wallet. This means that the mugger avoids the rule...
Please read the post before voting on the comments, as this is a game where voting works differently.
Warning: the comments section of this post will look odd. The most reasonable comments will have lots of negative karma. Do not be alarmed, it's all part of the plan. In order to participate in this game you should disable any viewing threshold for negatively voted comments.
Here's an irrationalist game meant to quickly collect a pool of controversial ideas for people to debate and assess. It kinda relies on people being honest and not being nitpickers, but it might be fun.
Write a comment reply to this post describing a belief you think has a reasonable chance of being true relative to the beliefs of other Less Wrong folk. Jot down a proposition and a rough probability estimate or qualitative description, like 'fairly confident'.
Example (not my true belief): "The U.S. government was directly responsible for financing the September 11th terrorist attacks. Very confident. (~95%)."
If you post a belief, you have to vote on the beliefs of all other comments. Voting works like this: if you basically agree with the comment, vote the comment down. If you basically disagree with the comment, vote the comment up. What 'basically' means here is intuitive; instead of using a precise mathy scoring system, just make a guess. In my view, if their stated probability is 99.9% and your degree of belief is 90%, that merits an upvote: it's a pretty big difference of opinion. If they're at 99.9% and you're at 99.5%, it could go either way. If you're genuinely unsure whether or not you basically agree with them, you can pass on voting (but try not to). Vote up if you think they are either overconfident or underconfident in their belief: any disagreement is valid disagreement.
That's the spirit of the game, but some more qualifications and rules follow.
If the proposition in a comment isn't incredibly precise, use your best interpretation. If you really have to pick nits for whatever reason, say so in a comment reply.
The more upvotes you get, the more irrational Less Wrong perceives your belief to be. Which means that if you have a large amount of Less Wrong karma and can still get lots of upvotes on your crazy beliefs, then you will get lots of smart people to take your weird ideas a little more seriously.
Some poor soul is going to come along and post "I believe in God". Don't pick nits and say "Well, in a Tegmark multiverse there is definitely a universe exactly like ours where some sort of god rules over us..." and downvote it. That's cheating. You'd better upvote the guy. For just this post, get over your desire to upvote rationality. For this game, we reward perceived irrationality.
Try to be precise in your propositions. Saying "I believe in God. 99% sure." isn't informative because we don't quite know which God you're talking about. A deist god? The Christian God? Jewish?
Y'all know this already, but just a reminder: preferences ain't beliefs. Downvote preferences disguised as beliefs. Beliefs that include the word "should" are almost always imprecise: avoid them.
Additional rules: