If you assume that the probability of somebody creating X lives decreases asymptotically as exp(-X), then you will not accept the deal.
I don't assume this. And I don't see any reason why I should assume this. It's quite possible that there exist powerful ways of simulating large numbers of humans. I don't think it's likely, but it's not literally impossible like you are suggesting.
Maybe it even is likely. I mean the universe seems quite large. We could theoretically colonize it and make trillions of humans. By your logic, that is incredibly improbable. For no other reason than that it involves a large number. Not that there is any physical law that suggests we can't colonize the universe.
> I don't think it's likely, but it's not literally impossible like you are suggesting.
I'm not saying it's literally impossible; I'm saying that its probability should decrease with the number of humans, and decrease faster than that number grows, so that the expected value of the offer stays bounded.
> Maybe it even is likely. I mean the universe seems quite large. We could theoretically colonize it and make trillions of humans. By your logic, that is incredibly improbable. For no other reason than that it involves a large number.
Not really. I said "asymptotically". I was considering the tails of the distribution...
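To make the disputed math concrete, here is a minimal sketch in Python (the priors are illustrative, unnormalized stand-ins, not anyone's actual numbers): under a prior that decays like exp(-X), the series of expected payoffs X*P(X) converges, so astronomically large offers contribute essentially nothing; under a heavier-tailed prior like P(X) ~ 1/X^2, the partial sums grow without bound and the mugger's huge offers dominate.

```python
import math

def partial_expected_payoff(prior, n_terms=10**6):
    """Partial sum of X * P(X) for X = 1..n_terms, with an unnormalized prior P."""
    return sum(x * prior(x) for x in range(1, n_terms + 1))

# Exponentially decaying prior: sum of X * e^(-X) converges to e/(e-1)^2 ~ 0.92,
# so the tail of enormous offers is negligible no matter how far we sum.
print(partial_expected_payoff(lambda x: math.exp(-x)))

# Heavier-tailed prior P(X) ~ 1/X^2: each term X * P(X) is 1/X, so the partial
# sums grow like log(n) without bound as n_terms increases.
print(partial_expected_payoff(lambda x: 1.0 / x**2))
```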
Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring some probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore", though there's still a bit of arbitrariness in it.
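As a hypothetical illustration of that approach (my sketch, not the post's actual formalization): the decision rule might simply drop every outcome whose probability falls below some threshold epsilon before computing expected utility. The choice of epsilon is exactly the residual arbitrariness mentioned above.

```python
def truncated_expected_utility(outcomes, epsilon):
    """Expected utility over (probability, utility) pairs, ignoring any
    outcome whose probability is below the threshold epsilon."""
    return sum(p * u for p, u in outcomes if p >= epsilon)

# Made-up numbers: two ordinary outcomes, plus a mugger's offer of utility
# 1e40 at probability 1e-30.
outcomes = [(0.5, 10.0), (0.5, -1.0), (1e-30, 1e40)]

print(truncated_expected_utility(outcomes, epsilon=1e-20))  # 4.5: mugging ignored
print(truncated_expected_utility(outcomes, epsilon=0.0))    # ~1e10: mugging dominates
```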