I'm reading Robin Hanson's Age of Em right now, and some of his analysis of mind emulations might help here. He explains that emulations can copy themselves, and that each copy will, from the moment of copying onward, have different experiences and therefore act and think differently. That is to say, even if you are aware of many copies of yourself existing in other worlds, they are effectively different people from that moment on. The fact remains that you are the one who will experience the pain and have to live with that memory.
I don't agree that the indexical uncertainty argument works in this case. If you assume there are a million copies of you, all in the same situation, then every copy's posterior must be that its own leg will be cut off, with certainty.
If you know that only one copy's leg will be cut, however, then I agree that you may hold a posterior of 1/1,000,000 on experiencing the pain. But that seems to me a different situation, one where the original question is no longer interesting. It's not interesting because for that situation to arise would require confirmation of many-worlds theories and the ability to communicate across worlds, which seems like adding far too much complexity to your original setup.
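To make the arithmetic behind the two scenarios explicit, here is a minimal sketch. The uniform self-locating prior (each copy considers itself equally likely to be any of the copies) is an assumption of the sketch, not something established in the thought experiment:

```python
def posterior_leg_cut(num_copies: int, copies_cut: int) -> float:
    """Probability a given copy assigns to its own leg being cut,
    assuming it is equally likely to be any one of the copies."""
    return copies_cut / num_copies

# Scenario 1: every copy's leg is cut -> indexical uncertainty buys nothing.
p_all = posterior_leg_cut(1_000_000, 1_000_000)  # 1.0

# Scenario 2: only one copy's leg is cut -> the posterior is tiny,
# but this is a different (and, I'd argue, less interesting) setup.
p_one = posterior_leg_cut(1_000_000, 1)  # 1e-06
```

The point of the contrast is that the comforting small number only appears in the second scenario, which requires the extra machinery described above.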