Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring some probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore", though there's still a bit of arbitrariness in it.
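One way to make the "too small to care about" idea concrete is a de minimis rule: drop any outcome whose probability falls below some cutoff before taking the expectation. A minimal sketch (my own illustration, not from the summary; the cutoff `EPSILON` is exactly the arbitrary parameter the summary mentions):

```python
# Hypothetical "de minimis" expected utility: ignore any outcome
# whose probability is below an arbitrary cutoff EPSILON.
EPSILON = 1e-10  # assumed value -- the arbitrariness lives here

def truncated_expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes if p >= EPSILON)

# A Pascal's-mugging-style offer: tiny probability of astronomical utility.
mugging = [(1e-30, 1e40), (1.0 - 1e-30, 0.0)]
# An ordinary gamble for comparison.
everyday = [(0.5, 10.0), (0.5, -1.0)]

print(truncated_expected_utility(mugging))   # 0.0 -- the mugger's branch is ignored
print(truncated_expected_utility(everyday))  # 4.5
```

Note that the ordinary gamble is evaluated normally, while the mugger's branch contributes nothing regardless of how large its payoff is.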
Same thing with numbers like 10^100 or 3^^^3.
EY can imagine all the fictional scenarios he wants; this doesn't mean that we should assign non-negligible probabilities to them.
If.
If your epistemic model generates undefined expectations when you combine it with your utility function, then I'm pretty sure we can say that at least one of them is broken.
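To illustrate the "undefined expectations" point with a toy model (my own sketch, not from the comment): suppose the prior assigns outcome n probability proportional to 2^-n (a simplicity-prior-style decay), while the utility function grows faster, say 3^n. Then the expected-utility series has terms (3/2)^n, and its partial sums grow without bound instead of converging:

```python
# Toy model: prior P(n) ~ 2^-n, utility U(n) = 3^n.
# Each term of E[U] = sum_n 2^-n * 3^n equals (3/2)^n,
# so the series diverges and the expectation is undefined.

def partial_expected_utility(terms: int) -> float:
    """Sum the first `terms` terms of sum_n (3/2)^n, n >= 1."""
    return sum((3 ** n) / (2 ** n) for n in range(1, terms + 1))

# The partial sums keep growing rather than settling on a limit:
print(partial_expected_utility(10))
print(partial_expected_utility(20))
```

The divergence means the agent's ranking of actions is undefined, which is one way to cash out "at least one of them is broken".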
EDIT:
To expand: just because we can imagine something and give it a short English description, it doesn't mean that it is simple in epistemic terms. That's the reason why "God" is not a simple hypothesis.
Not negligible, zero. You literally cannot believe in a theory of physics that allows arbitrarily large amounts of computing power. If we discover that an existing theory like quantum physics allows us to create such computers, we will be forced to abandon it.