Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring some probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore", though there's still a bit of arbitrariness in it.
Bounded utility functions effectively give "bounded probability functions," in the sense that you (more or less) stop caring about things with very low probability.
For example, if my maximum utility is 1,000, then the most that something with a probability of one in a billion can contribute to my expected utility is 0.000001, an extremely small amount, so it is something I will care about very little. The probability of the 3^^^3 scenarios may be much higher than one in 3^^^3. But it will still be small enough that a bounded utility function won't care about situations like that, at least not to any significant extent.
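The arithmetic above can be sketched as follows (my own illustration, not from the original post; the cap value and function names are hypothetical):

```python
# With a bounded utility function, any outcome's contribution to expected
# utility is at most cap * probability, so tiny probabilities contribute
# almost nothing no matter how large the claimed payoff is.

UTILITY_CAP = 1_000  # hypothetical bound on the utility function


def expected_contribution(claimed_utility: float, probability: float) -> float:
    """Expected-utility contribution of one outcome under a bounded utility."""
    bounded_utility = min(claimed_utility, UTILITY_CAP)
    return bounded_utility * probability


# A mugger's claim of an astronomically large payoff at one-in-a-billion odds:
print(expected_contribution(10**100, 1e-9))  # roughly 1e-06, i.e. negligible
```

No matter how much the claimed utility grows, the contribution never exceeds `UTILITY_CAP * probability`, which is the sense in which low-probability claims are effectively ignored.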
That is precisely the reason that it will do the things you object to, if that situation comes up.
That is no different from pointing out that the post's proposal will reject a "mugging" even when it will actually cost 3^^^3 lives.
Both proposals have that particular downside. That is not something peculiar to mine.
Bounded utility functions mean you stop caring about things with very high utility. That you care less about certain low-probability events is just a side effect. But an event with very high claimed utility can also have high probability, and you still won't care about it beyond the bound.
If you want to just stop caring about really low probability events, why not just do that?