Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring some probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore", though there's still a bit of arbitrariness in it.
Suppose I say that I like vanilla ice cream twice as much as chocolate. Obviously that means I would be willing to trade 2 units of chocolate ice cream for 1 unit of vanilla. And over the course of my life, I would prefer to eat more vanilla ice cream than chocolate ice cream. Perhaps before I die, I will add up all the ice cream I've ever eaten, and I would prefer that total to be higher rather than lower.
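One way to write that down as a utility function, assuming the 2:1 trade rate carries over to the lifetime totals (the notation here is mine, not spelled out above):

$$U = 2 \cdot (\text{lifetime vanilla servings}) + 1 \cdot (\text{lifetime chocolate servings})$$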
Nowhere in the above description did I talk about probability. And the utility function is already completely defined. I just need to decide on a decision procedure to maximize it.
Expected utility seems like a good choice because, over the course of my life, the different bets I make on ice cream should average out, and I should end up with more ice cream than I would otherwise. But that might not be true if there are ice cream muggers: agents who promise lots of ice cream in exchange for a down payment, but usually lie.
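As a minimal sketch of that failure mode (the numbers and the `expected_utility_agent` function are made up for illustration, not taken from the post):

```python
import random

# Illustrative sketch: an expected-utility maximizer meets a stream of muggers.
# Each mugger asks for 1 serving now and promises 1000 servings later. The agent
# believes the promise with probability 0.01, but muggers actually pay up only
# 0.01% of the time. All numbers are made up.
DOWN_PAYMENT = 1      # servings the mugger asks for up front
PROMISED = 1000       # servings the mugger promises in return
P_BELIEVED = 0.01     # probability the agent assigns to being paid
P_ACTUAL = 0.0001     # how often muggers actually pay

def expected_utility_agent(servings, rounds=10_000, seed=0):
    """Accept every offer whose believed expected value exceeds its cost."""
    rng = random.Random(seed)
    for _ in range(rounds):
        if servings < DOWN_PAYMENT:
            break  # out of ice cream
        # Believed expected value: 1000 * 0.01 = 10 servings versus a cost of 1,
        # so the agent accepts every single offer.
        if PROMISED * P_BELIEVED > DOWN_PAYMENT:
            servings -= DOWN_PAYMENT
            if rng.random() < P_ACTUAL:
                servings += PROMISED
    return servings

print(expected_utility_agent(servings=1000))
# Each deal looks worth about +9 servings in expectation, but the actual average
# return is 1000 * 0.0001 = 0.1 servings per serving paid, so the lifetime total
# dwindles toward zero.
```

The exact numbers do not matter; the point is that if the agent's probability estimates for mugger-style offers are even a little too generous, naive expected-utility maximization keeps handing over ice cream.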
So trying to convince the ice cream maximizer to follow expected utility is a lost cause: they would just end up losing all their ice cream to muggers. They need a decision procedure that ignores muggers.
This is definitely not what I mean if I say I like vanilla twice as much as chocolate. I might like it twice as much even though there is no chance that I can ever eat more than one serving of ice cream. If I have the choice of a small serving of vanilla or a triple serving of chocolate, I might still choose the vanilla. That does not mean I like it three times as much.
It is not about "how much ice cream." It is about "how much wanting."