Theoretically, that's the question he's asking about Pascal's Mugging, since accepting the mugger's argument would tell you that expected utility never converges. And since we could rephrase the problem in terms of (say) diamond creation for a diamond maximizer, it does look like an issue of probability rather than goals.
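The non-convergence claim can be illustrated with a toy model (the specific rates here are illustrative assumptions, not part of the original argument): suppose the prior probability that the mugger can deliver the n-th payoff shrinks like 2^-n, but the payoff he names grows like 3^n. Then each term of the expected-utility sum is (3/2)^n, and the partial sums grow without bound.

```python
def partial_expected_utility(N):
    """Partial sum of a toy expected-utility series where the prior
    probability of outcome n falls off as 2^-n but the claimed payoff
    grows as 3^n.  Each term is (1.5)^n, so the series diverges."""
    return sum((0.5 ** n) * (3.0 ** n) for n in range(1, N + 1))

for N in (10, 20, 30):
    print(N, partial_expected_utility(N))  # grows without bound as N increases
```

Any utility function that grows faster than the prior shrinks gives the same divergence, which is why the problem looks structural rather than a quirk of one prior.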
> Theoretically, that's the question he's asking about Pascal's Mugging, since accepting the mugger's argument would tell you that expected utility never converges.
Of course, and the paper cited in http://wiki.lesswrong.com/wiki/Pascal's_mugging makes that argument rigorous.
> And since we could rephrase the problem in terms of (say) diamond creation for a diamond maximizer, it does look like an issue of probability rather than goals.
It's a problem of expected utility, not necessarily of probability. And I'd still like to know which of the von Neumann-Morgenstern axioms it ends up violating. I suspect Continuity.
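For reference, the Continuity (Archimedean) axiom of the von Neumann-Morgenstern theorem, in its standard form:

```latex
% VNM Continuity axiom: no lottery is infinitely better or worse
% than any other.
\text{If } L_1 \succ L_2 \succ L_3, \text{ then } \exists\, p, q \in (0,1):
\quad p L_1 + (1-p) L_3 \;\succ\; L_2 \;\succ\; q L_1 + (1-q) L_3.
```

Intuitively, an unbounded utility function lets the mugger construct payoffs so large that no mixing probability is small enough to make them negligible, which is why Continuity is the natural suspect.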
Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring some probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore", though there's still a bit of arbitrariness in it.
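A minimal sketch of what "ignoring probabilities below a cutoff" could look like, assuming a hypothetical threshold `EPSILON` (the summary's point is precisely that choosing this value is somewhat arbitrary):

```python
EPSILON = 1e-10  # hypothetical cutoff; picking its value is the arbitrary part

def truncated_expected_utility(outcomes, eps=EPSILON):
    """Expected utility over (probability, utility) pairs, discarding
    branches whose probability falls below eps and renormalizing the rest."""
    kept = [(p, u) for p, u in outcomes if p >= eps]
    total = sum(p for p, _ in kept)
    return sum(p * u for p, u in kept) / total if total else 0.0

# A mugging-shaped lottery: near-certain small loss vs. an
# astronomically unlikely, astronomically large payoff.
mugging = [(1 - 1e-12, -5.0), (1e-12, 1e100)]

naive = sum(p * u for p, u in mugging)           # dominated by the tiny branch
truncated = truncated_expected_utility(mugging)  # tiny branch is ignored
```

Under the naive sum the mugger wins (the 1e100 term dominates); under the truncated sum the agent declines, at the cost of the arbitrariness the summary flags.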