Luke_A_Somers comments on A resolution to the Doomsday Argument. - Less Wrong Discussion
Under MWI, you can win a lottery just by entering it; committing suicide is not necessary. Of course, almost all of you will lose.
All you're doing in quantum lotteries is deciding you really, REALLY don't care about the case where you lose, to the point that you want to not experience those branches at all, to the point that you'd kill yourself if you find yourself stuck in them.
That's the causality involved. You haven't gone out and changed the universe in any way (other than almost certainly killing yourself).
Replace "win a lottery" with "have a subjective probability of ~1 of winning a lottery".
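The distinction can be made concrete with a toy calculation (my own illustration, not from the thread; all numbers are made up): quantum suicide leaves the global measure of winning branches untouched and only changes the probability of having won conditional on surviving to observe the outcome.

```python
# Toy model of "quantum lottery" branch-pruning under MWI.
# Assumed, illustrative numbers -- not from the original discussion.

p_win = 1e-7          # measure of branches where the ticket wins
p_survive_win = 1.0   # winners live
p_survive_lose = 0.0  # losers are killed by the precommitment device

# The global measure of winning branches is unchanged by the device:
global_p_win = p_win

# Subjective probability of having won, among surviving copies:
measure_surviving = p_win * p_survive_win + (1 - p_win) * p_survive_lose
subjective_p_win = (p_win * p_survive_win) / measure_surviving
```

Here `subjective_p_win` comes out as 1.0 while `global_p_win` stays at 1e-7, which is exactly the "subjective probability of ~1 of winning" reading.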
That's wrong. If I found myself stuck in one, I would prefer to live; that's why I need a very strong precommitment, enforced by something I can't turn off.
Here's where we differ: I identify every copy of me as "me", and deny any meaningful sense in which I can talk about which one "I" am before anything has diverged (or, in fact, before I have knowledge that excludes some of me). So there's no sense in which I "might" die; some of me certainly will, and some won't, and the end state of affairs is better given some conditions (like selfishness, no pain on death, and lots of other technicalities).
I mean, you-now would prefer to kill you-then.
As for your last paragraph, the framing was from a global point of view, and probability in this case is the deterministic, Quantum-Measure-based sort.
Not really. I prefer to kill my future self only because I anticipate living on in other selves; this can't accurately be described as "you really, REALLY don't care about the case where you lose, to the point that you want to not experience those branches at all, to the point that you'd kill yourself if you find yourself stuck in them."
I do care; what I don't care about is how my measure is distributed between two sets of outcomes of the same cardinality. If there were a chance of my being stuck in one world and not living on anywhere else, I wouldn't (now) want to kill myself in that future.
OK, we sort of agree, then; but your claim that "You haven't gone out and changed the universe in any way" seems weak. If I can change my subjective probability of experiencing X, and the state of the universe that's not me doesn't factor into my utility except insofar as it affects me, why should I care whether I'm "changing the universe"?
(To clarify the "I care" claim further: I'm basically being paid in one branch to kill myself in another branch. I value that payment more than I disvalue killing myself in the second branch; that does not necessarily mean I place no value at all on the second branch, just less than on the reward in branch 1.)
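This tradeoff can be sketched as a simple expected-utility comparison (my own illustration with made-up utility numbers; nothing here is from the thread beyond the structure of the deal):

```python
# Assumed, illustrative utilities for the "paid in branch 1 to kill
# myself in branch 2" deal. Equal-measure branches for simplicity.

p_win = 0.5            # measure of branch 1 (the paying branch)
u_payment = 10.0       # utility of the payment received in branch 1
u_suicide_cost = -3.0  # disutility of self-termination in branch 2
                       # (negative, i.e. branch 2 is NOT valued at zero)
u_lose_live = 0.0      # baseline: live on normally in branch 2

# Expected utility of taking vs. refusing the deal:
eu_deal = p_win * u_payment + (1 - p_win) * u_suicide_cost
eu_no_deal = p_win * 0.0 + (1 - p_win) * u_lose_live
```

With these numbers `eu_deal` exceeds `eu_no_deal`, so taking the deal is rational even though the losing branch carries a genuine (nonzero) disvalue, which is the point being made: valuing the payment more than the cost is not the same as not valuing the second branch at all.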