I haven't yet entered this particular discussion, but it is of interest to me, so I hope you won't mind persisting a bit longer, with a different interlocutor.
This is why I don't like using numbers like 3E-22 as probabilities.
May I ask just what your lower bound is on probability estimates?
I can't, really, because it's context dependent. If the question were "What is the probability that a program which selects one atom at random from all those in the universe (and is guaranteed by Omega genuinely random) picks this particular phosphorus atom here on the tip of my finger", then my probability would be much less than 3E-22.
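As a quick order-of-magnitude check (the atom count below is a commonly quoted rough estimate I'm supplying, not a figure from the comment), a uniformly random pick of a single atom really does land far below a threshold like 3E-22:

```python
# Rough estimate of atoms in the observable universe: ~10^80.
# (My assumption for illustration, not a figure from the original comment.)
ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80

# Probability that a genuinely uniform random pick selects one
# particular atom, e.g. a given phosphorus atom on a fingertip.
p_this_atom = 1.0 / ATOMS_IN_OBSERVABLE_UNIVERSE

# "Much less than" 3E-22 indeed: smaller by ~58 orders of magnitude.
assert p_this_atom < 3e-22
print(p_this_atom)  # 1e-80
```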
Likewise, "destroy the Earth" is a relatively simple occurrence - it just needs a big enough burst of energy or mass or something. If it's "What is the probability that the LHC will create a hamster ...
For background, see here.
In a comment on the original Pascal's mugging post, Nick Tarleton writes:
Coming across this again recently, it occurred to me that there might be a way to generalize Vassar's suggestion in such a way as to deal with Tarleton's more abstract formulation of the problem. I'm curious about the extent to which folks have thought about this. (Looking further through the comments on the original post, I found essentially the same idea in a comment by g, but it wasn't discussed further.)
The idea is that the Kolmogorov complexity of "3^^^^3 units of disutility" should be much higher than the Kolmogorov complexity of the number 3^^^^3. That is, the utility function should grow only according to the complexity of the scenario being evaluated, and not (say) linearly in the number of people involved. Furthermore, the domain of the utility function should consist of low-level descriptions of the state of the world, which won't refer directly to words uttered by muggers, in such a way that a mere discussion of "3^^^^3 units of disutility" by a mugger will not typically be (anywhere near) enough evidence to promote an actual "3^^^^3-disutilon" hypothesis to attention.
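The shape of this proposal can be sketched in toy form. The sketch below is mine, not from the post: it uses zlib-compressed length as a crude stand-in for Kolmogorov complexity (which is uncomputable), and encodes the constraint that a hypothesis's utility magnitude may grow at most as fast as 2^K(h), so the Solomonoff-style prior 2^-K(h) always cancels it.

```python
import zlib

def k_proxy_bits(description: str) -> int:
    # Crude stand-in for Kolmogorov complexity: zlib-compressed length
    # in bits. Real K() is uncomputable; this only shows the argument's shape.
    return 8 * len(zlib.compress(description.encode("utf-8")))

def log2_prior(description: str) -> float:
    # Solomonoff-style prior over hypotheses: log2 P(h) ~ -K(h).
    return -float(k_proxy_bits(description))

def log2_utility_bound(description: str) -> float:
    # The proposal: |U(h)| is bounded by the complexity of the scenario,
    # e.g. |U(h)| <= 2^K(h), rather than growing with numbers *named* in h.
    return float(k_proxy_bits(description))

# The mugger's threat is a short string, so it names a huge number cheaply...
threat = "pay me $5 or I inflict 3^^^^3 units of disutility"

# ...but under the complexity-bounded utility, the expected-disutility
# magnitude satisfies log2(P(h) * |U(h)|) = -K(h) + K(h) <= 0: at most
# one utilon, no matter what number the mugger utters.
log2_expected = log2_prior(threat) + log2_utility_bound(threat)
assert log2_expected <= 0
print(log2_expected)  # 0.0 for this toy bound
```

The toy makes the asymmetry visible: the *string* "3^^^^3" is cheap to describe, but a low-level world-state actually containing 3^^^^3 disutilons would have an enormous description, which is why the proposal evaluates utility over low-level world descriptions rather than over numbers quoted by muggers.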
This seems to imply that the intuition responsible for the problem is a kind of fake simplicity, ignoring the complexity of value (negative value in this case). A confusion of levels also appears implicated (talking about utility does not itself significantly affect utility; you don't suddenly make 3^^^^3-disutilon scenarios probable by talking about "3^^^^3 disutilons").
What do folks think of this? Any obvious problems?