And your edit leaves you with an interesting conundrum.
It can put you in a situation where you see the people around you adopting one of two strategies: the people who adopt one strategy consistently win, and the people who adopt the other consistently lose, yet you still refuse to adopt the winning strategy because you think the winners are... wrong.
I'm not sure if you can call that a win.
"Win" by what standards? If I think it is ontologically and factually incorrect - an intellectual mistake - to identify with your copies, then those who do aren't winning, any more than individual lemmings win when they dive off a cliff. If I am happy to regard a person's attitude toward their copies as a matter of choice, then I may regard their choices as correct for them and my choices as correct for me.
Robin Hanson predicts a Malthusian galactic destiny, in which the posthuman intelligences of the far future are all poorer than human individuals ...
This is a thought exercise I came up with on IRC to address the iffiness of "freezing yourself for a thousand years" with regard to continuity of self.
Let's say we live in a post-singularity world as uploads and are pretty bored, always up for terrible entertainment (our god is FAI, but it has a scary sense of humor...). So some crazy person creates a very peculiar black cube in our shared reality. You walk into it, a fork of you is created, and the two of you duke it out via Russian roulette. The winner walks out the other side.
Before entering, should you accurately anticipate dying with 50% probability?
I argued that you should anticipate surviving with 100% probability, since the single you that walked out of the box would turn out to be correct in his prediction. Surprisingly, someone disagreed.
So I extended the scenario with another black box with two doors, but this one is just a tunnel. In this case, everybody can agree that you should anticipate a 100% probability of surviving it unscathed. But if we delete our memory of what just happened when exiting the black boxes, and delete the boxes themselves, then the resulting universes would be indistinguishable!
One easy way to demonstrate this is to chain ten boxes and put a thousand dollars at the end. The person who anticipates dying with 50% probability per box (and hence, over all ten boxes, a 1/1024 chance of surviving) stays well outside. The person who anticipates surviving just walks through and comes away $1000 richer. "But at least my anticipation was correct", in this scenario, reminds me somewhat of the cries of "but at least my reasoning was correct" on the part of two-boxers.
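The 1/1024 figure is just (1/2)^10. A minimal sketch of the arithmetic behind the two anticipation strategies (the function names and the zero-value-for-death assumption are mine, for illustration only):

```python
# Chained-boxes arithmetic: under the "50% death per box" model,
# surviving n independent boxes has probability p^n.

def survival_probability(n_boxes: int, p_survive_each: float) -> float:
    """Chance of surviving all n boxes if each is survived independently."""
    return p_survive_each ** n_boxes

p = survival_probability(10, 0.5)
print(p)  # 0.0009765625, i.e. 1/1024

# If you value death at 0 and the prize at $1000, the 50%-anticipator's
# expected payoff is under a dollar, so they stay outside:
expected_payoff = p * 1000
print(expected_payoff)  # 0.9765625

# The 100%-survival anticipator assigns the walk-through a certain $1000.
```

The disagreement in the thread is precisely over which of these two expected-value calculations uses the right probability, not over the arithmetic itself.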
What I'm wondering is: is there a general rule underlying this about the follies of allowing causally-indistinguishable-in-retrospect effects to differently affect our anticipation? Can somebody formalize this?