If your utility function has a bound, then you can't care about anything past that bound. Even a continuously approached bound doesn't work, because you care less and less about obtaining more utility as you get closer to it. You would treat a 50% chance of saving 2 people the same as a guarantee of saving 1 person, but you would not treat a 50% chance of saving 2,000 people the same as a guarantee of saving 1,000.
Yes, that would be the effect in general: you would be less willing to take chances when the numbers involved are higher. That's why you wouldn't get mugged.
But that still doesn't mean that "you don't care." You still prefer saving 2,000 lives to saving 1,000 whenever the chances are equal; your preference between the two outcomes does not suddenly become equal, as you originally said.
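The exchange above can be sketched numerically. A minimal illustration, assuming one particular bounded utility function, u(n) = 1 − e^(−n/N), and an arbitrary scale N = 100 (neither of which appears in the thread): for small stakes the bounded agent is nearly risk-neutral, while for large stakes it strongly prefers the sure thing, yet it still assigns strictly higher utility to the larger number of lives.

```python
import math

# Illustrative scale parameter; an assumption for this sketch, not from the thread.
N = 100

def u(n):
    """Bounded utility of saving n lives: near-linear for n << N, saturating toward 1."""
    return 1 - math.exp(-n / N)

# Small numbers: a 50% chance at 2 is worth almost the same as a sure 1.
print(0.5 * u(2), u(1))        # the two values are nearly equal

# Large numbers: a 50% chance at 2,000 is worth far less than a sure 1,000.
print(0.5 * u(2000), u(1000))  # the sure option wins decisively

# But the preference between equal-chance outcomes never collapses:
print(u(2000) > u(1000))
```

The design point is that boundedness alone only dampens risk-taking at large stakes; it never makes the agent indifferent between saving more lives and saving fewer.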
Summary: the problem with Pascal's Mugging arguments is that, intuitively, some probabilities are just too small to care about. There might be a principled reason for ignoring such probabilities, namely that they violate an implicit assumption behind expected utility theory. This suggests a possible approach for formally defining a "probability small enough to ignore," though some arbitrariness remains in where the threshold is drawn.