Eliezer’s Anthropic Trilemma:
So here’s a simple algorithm for winning the lottery: Buy a ticket. Suspend your computer program just before the lottery drawing – which should of course be a quantum lottery, so that every ticket wins somewhere. Program your computational environment to, if you win, make a trillion copies of yourself, and wake them up for ten seconds, long enough to experience winning the lottery. Then suspend the programs, merge them again, and start the result. If you don’t win the lottery, then just wake up automatically. The odds of winning the lottery are ordinarily a billion to one. But now the branch in which you win has your “measure”, your “amount of experience”, temporarily multiplied by a trillion. So with the brief expenditure of a little extra computing power, you can subjectively win the lottery – be reasonably sure that when next you open your eyes, you will see a computer screen flashing “You won!” As for what happens ten seconds after that, you have no way of knowing how many processors you run on, so you shouldn’t feel a thing.
See the original post for assumptions, what merging minds entails, and so on. He proposes three alternative bullets to bite: accepting that this would work; denying that there is "any meaningful sense in which I can anticipate waking up as myself tomorrow, rather than Britney Spears", thereby undermining any question about what you should anticipate; and Nick Bostrom's response, paraphrased by Eliezer:
…you should anticipate winning the lottery after five seconds, but anticipate losing the lottery after fifteen seconds. To bite this bullet, you have to throw away the idea that your joint subjective probabilities are the product of your conditional subjective probabilities. If you win the lottery, the subjective probability of having still won the lottery, ten seconds later, is ~1. And if you lose the lottery, the subjective probability of having lost the lottery, ten seconds later, is ~1. But we don’t have p(“experience win after 15s”) = p(“experience win after 15s”|”experience win after 5s”)*p(“experience win after 5s”) + p(“experience win after 15s”|”experience not-win after 5s”)*p(“experience not-win after 5s”).
I think I already bit the bullet about there not being a meaningful sense in which I won't wake up as Britney Spears. However, I would like to offer a better solution, one relatively free of bullet biting.
First notice that you will have to bite Bostrom’s bullet if you even accept Eliezer’s premise that arranging to multiply your ‘amount of experience’ in one branch in the future makes you more likely to experience that branch. Call this principle ‘follow-the-crowd’ (FTC). And let’s give the name ‘blatantly obvious principle’ (BOP) to the notion that P(I win at time 2) is equal to P(I win at time 2|I win at time 1)P(I win at time 1)+P(I win at time 2|I lose at time 1)P(I lose at time 1). Bostrom’s bullet is to deny BOP.
We can set aside the bit about merging brains together for now; that isn’t causing our problem. Consider a simpler and smaller (for the sake of easy diagramming) lottery setup where after you win or lose you are woken for ten seconds as a single person, then put back to sleep and woken as four copies in the winning branch or one in the losing branch. See the diagram below. You are at Time 0 (T0). Before Time 1 (T1) the lottery is run, so at T1 the winner is W1 and the loser is L1. W1 is then copied to give the multitude of winning experiences at T2, while L2 remains single.
Now using the same reasoning as you would to win the lottery before, FTC, you should anticipate an 80% chance of winning the lottery at T2. There is four times as much of your experience winning the lottery as not then. But BOP says you still only have a fifty percent chance of being a lottery winner at T2:
P(win at T2) = P(win at T2 | win at T1)×P(win at T1) + P(win at T2 | lose at T1)×P(lose at T1) = 1 × 1/2 + 0 × 1/2 = 1/2
FTC and BOP conflict. If you accept that you should generally anticipate futures where there are more of you more strongly, it looks like you accept that P(a) does not always equal P(a|b)P(b)+P(a|-b)P(-b). How sad.
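The conflict can be checked with a few lines of arithmetic. Below is a small Python sketch (my illustration, not from the original post) that computes the FTC anticipation by weighting each T1 outcome by its copy count at T2, alongside the BOP answer from the ordinary law of total probability:

```python
# The two-branch lottery from the text: at T1 you win or lose with
# probability 1/2 each; the winner is copied into four people at T2,
# while the loser stays single.
branches = [
    {"outcome": "win",  "prob": 0.5, "copies_at_T2": 4},
    {"outcome": "lose", "prob": 0.5, "copies_at_T2": 1},
]

# FTC: weight each branch by its "amount of experience" (copy count).
total_experience = sum(b["prob"] * b["copies_at_T2"] for b in branches)
ftc_p_win = sum(b["prob"] * b["copies_at_T2"]
                for b in branches if b["outcome"] == "win") / total_experience

# BOP: the law of total probability over the T1 outcome.
bop_p_win = 1.0 * 0.5 + 0.0 * 0.5

print(ftc_p_win)  # 0.8
print(bop_p_win)  # 0.5
```

FTC's experience-weighted answer is 0.8 while BOP's is 0.5: exactly the disagreement just described.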
Looking at the diagram above, it is easy to see why these two methods of calculating anticipations disagree. There are two times in the diagram that your future branches, once in a probabilistic event and once in being copied. FTC and BOP both treat the probabilistic event the same: they divide your expectations between the outcomes according to their objective probability. At the other branching the two principles do different things. BOP treats it the same as a probabilistic event, dividing your expectation of reaching that point between the many branches you could continue on. FTC treats it as a multiplication of your experience, giving each new branch the full measure of the incoming branch. Which method is correct?
Neither. FTC and BOP are both approximations of better principles. Both of the better principles are probably true, and they do not conflict.
To see this, first we should be precise about what we mean by ‘anticipate’. There is more than one resolution to the conflict, depending on your theory of what to anticipate: where the purported thread of personal experience goes, if anywhere. (Nope, resolving the trilemma does not seem to answer this question).
Resolution 1: the single thread
The most natural assumption seems to be that your future takes one branch at every intersection. It does this based on objective probability at probabilistic events, or equiprobably at copying events. It follows BOP. This means we can keep the present version of BOP, so I shall explain how we can do without FTC.
Consider diagram 2. If your future takes one branch at every intersection, and you happen to win the lottery, there are still many T2 lottery winners who will not be your future. They are your copies, but they are not where your thread of experience goes. They and your real future self can’t distinguish who is actually in your future, but there is some truth of the matter. It is shown in green.
Now while there are only two objective possible worlds, when we consider possible paths for the green thread there are five possible worlds (one shown in diagram 2). In each one your experience follows a different path up the tree. Since your future is now distinguished from other similar experiences, we can see the weight of your experience at T2 in a world where you win is no greater than the weight in a world where you lose, though there are always more copies who are not you in the world where you win.
The four worlds where your future is in a winning branch are each only a quarter as likely as one where you lose, because there is a fifty percent chance of you reaching W1, and after that a twenty five percent chance of reaching a given W2. By the original FTC reasoning then, you are equally likely to win or lose. More copies just makes you less certain exactly where it will be.
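Under this single-thread view the five worlds can be enumerated directly. The Python sketch below (my illustration; the probabilities come from the setup above) checks that the thread's chance of winning is still one half, however the winning measure is spread across copies:

```python
# Five possible worlds for the single "green thread": with probability
# 1/2 the thread loses; with probability 1/2 it wins and then lands on
# one of four equiprobable W2 copies, so each winning world has
# probability 1/2 * 1/4 = 1/8.
worlds = [("lose", 0.5)] + [(f"win-copy-{i}", 0.5 * 0.25) for i in range(4)]

# Sanity check: the five worlds exhaust the possibilities.
assert abs(sum(p for _, p in worlds) - 1.0) < 1e-9

p_thread_wins = sum(p for name, p in worlds if name.startswith("win"))
print(p_thread_wins)  # 0.5
```

More copies spread the same one-half of winning probability across more worlds; they do not add to it.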
I am treating the invisible green thread like any other hidden characteristic. Suppose you know that you are and will continue to be the person with the red underpants, though many copies will be made of you with green underpants. However many extra copies are made, a world with more of them in future should not get more of your credence, even if you don’t know which future person actually has the red pants. If you think of yourself as having only one future, then you can’t also consider there to be a greater amount of your experience when there are a lot of copies. If you did anticipate experiences based on the probability that many people other than you were scheduled for that experience, you would greatly increase the minuscule credence you have in experiencing being Britney Spears when you wake up tomorrow.
Doesn’t this conflict with the use of FTC to avoid the Boltzmann brain problem, Eliezer’s original motivation for accepting it? No. The above reasoning means there is a difference between where you should anticipate going when you are at T0, and where you should think you are if you are at T2.
If you are at T0 you should anticipate a 50% chance of winning, but if you are at T2 you have an 80% chance of being a winner. Sound silly? That’s because you’ve forgotten that you are potentially talking about different people. If you are at T2, you are probably not the future of the person who was at T0, and you have no way to tell. You are a copy of them, but their future thread is unlikely to wend through you. If you knew that you were their future, then you would agree with their calculations.
That is, anyone who only knows they are at T2 should consider themselves likely to have won, because there are many more winners than losers. Anyone who knows they are at T2 and are your future, should give even odds to winning. At T0, you know that the future person whose measure you are interested in is at T2 and is your future, so you also give even odds to winning.
Avoiding the Boltzmann brain problem requires a principle similar to FTC which says you are presently more likely to be in a world where there are more people like you. SIA says just that, for instance, and other anthropic principles imply similar things. Avoiding the Boltzmann brain problem does not require inferring from this that your future lies in worlds where there are more such people. And such an inference is invalid.
This is exactly the same as how it is invalid to infer that you will have many children from the fact that you are more likely to be from a family with many children. Probability theory doesn’t distinguish between the relationship between you and your children and the relationship between you and your future selves.
Resolution 2: every copy is your future
You could instead consider all copies to be your futures. Your thread is duplicated when you are. In that case you should treat the two kinds of branching differently, unlike BOP, but still not in the way FTC does. It appears you should anticipate a 50% chance of becoming four people, rather than an 80% chance of becoming one of those people. There is no sense in which you will become one of the winners rather than another. Like in the last case, it is true that if you are presently one of the copies in the future, you should think yourself 80% likely to be a winner. But again ‘you’ refers to a different entity in this case to the one it referred to before the lottery. It refers to a single future copy. It can’t usefully refer to a whole set of winners, because the one considering it does not know if they are part of that set or if they are a loser. As in the last case, your anticipations at T0 should be different from your expectations for yourself if you know only that you are in the future already.
In this case, BOP gives the right answer for your anticipated chance of winning at T0. However, it says you have a 25% chance of becoming each winner at T2 given you win at T1, instead of a 100% chance of becoming all of them.
Resolution 3: weighting futures by copies
Suppose that you want to equate becoming four people in one branch with being more likely to be in that branch. More of your future weight is there, so for some notions of expectation perhaps you expect to be there. You take ‘what is the probability that I win the lottery at T1?’ to mean something like ‘what proportion of my future selves are winning at T1?’. FTC gives the correct answer to this question – you aren’t especially likely to win at T1, but you probably will be winning at T2. Or in the original problem, you should expect to win after 5 seconds and lose after 15 seconds, as Nick Bostrom suggested. If FTC is true, then we must scrap BOP. This is easier than it looks, because BOP is not what it seems.
Here is BOP again:
P(I win at T2) is equal to P(I win at T2|I win at T1)P(I win at T1)+P(I win at T2|I lose at T1)P(I lose at T1)
It looks like a simple application of
P(a) = P(a|b)P(b)+P(a|-b)P(-b)
But here is a more extended version:
P(win at 15|at 15) = P(win an 15|at 15 and came from win at 5)P(win at 5|at 5)+P(win at 15|at 15 and came from loss at 5)P(lose at 5|at 5)
This is only equal to BOP if the probability of having a win at 5 in your past when you are at 15 is equal to the probability of winning at 5 when you are at 5. To accept FTC is to deny that. FTC says you are more likely to find the win in your past than to experience it because many copies are descended from the same past. So accepting FTC doesn’t conflict with P(a) being equal to P(a|b)P(b)+P(a|-b)P(-b), it just makes BOP an inaccurate application of this true principle.
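To make the distinction concrete, here is a small Python sketch (my own illustration, using the four-copy T1/T2 lottery from earlier in the post rather than the trillion-copy version) contrasting the correct decomposition, which conditions on a later person's past, with BOP's substitution:

```python
# Under FTC, 4 of the 5 people at T2 descend from the win, so a T2
# person finds a win in their past with probability 4/5 = 0.8 --
# not the 0.5 chance of experiencing the win when at T1.
p_came_from_win_given_at_T2 = 4 / 5
p_win_at_T1 = 1 / 2

# Correct law of total probability, conditioning on a T2 person's past
# (winning is retained with probability ~1, regained with probability ~0):
p_win_at_T2 = 1.0 * p_came_from_win_given_at_T2 \
            + 0.0 * (1 - p_came_from_win_given_at_T2)

# BOP substitutes p_win_at_T1 for p_came_from_win_given_at_T2:
bop_p_win_at_T2 = 1.0 * p_win_at_T1 + 0.0 * (1 - p_win_at_T1)

print(p_win_at_T2)      # 0.8  (agrees with FTC)
print(bop_p_win_at_T2)  # 0.5  (BOP's inaccurate substitution)
```

The law of total probability itself is untouched; BOP just plugs in the wrong conditional.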
In summary:
1. If your future is (by definitional choice or underlying reality) a continuous non-splitting thread, then something like SIA should be used instead of FTC, and BOP holds. Who you anticipate being differs from who you should think you are when you get there; the latter is still governed by something like SIA, which avoids the Boltzmann brain problem.
2. If all your future copies are equally your future, you should anticipate becoming a large number of people with the same probability that you would have become one person had there been no extra copies. In that case FTC does not hold, because you expect to become many people with a small probability, rather than one of those many people with a large probability. BOP holds in a modified form in which being copied is not treated as being sent down a random path. But if you want to know what a random moment from your future will hold, a random moment from T1 is more likely to include losing than a random moment from T2. For working out what a random T2 moment will hold, BOP is a false application of a correct principle.
3. If for whatever reason you conceptualise yourself as being more likely to go into future worlds based on the number of copies of you there are in those worlds, then FTC does hold, but BOP becomes false.
I think the most important point is that the question of where you should anticipate going need not have the same answer as where a future copy of you should expect to be (if they don’t know for some reason). A future copy who doesn’t know where they are should think they are more likely to be in a world where there are many people like themselves, but you should not necessarily think you are likely to go into such a world. If you don’t think you are as likely to go into such a world, then FTC doesn’t hold. If you do, then BOP doesn’t hold.
It seems to me the original problem uses FTC while assuming there will be a single thread, thereby making BOP look inevitable. If the thread is kept, FTC should not be, which can be conceptualised as in either of resolutions 1 or 2. If FTC is kept, BOP need not be, as in resolution 3. Whether you keep FTC or BOP will give you different expectations about the future, but which expectations are warranted is a question for another time.
I wrote in 2001:
Think of it as an extension of Eliezer's "make beliefs pay rent in anticipated experiences". I think beliefs should pay rent in decision making.
Katja, I'm not sure if this is something that has persuasive power for you, but it's an idea that has brought a lot of clarity to me regarding anthropic reasoning and has led to the UDT approach to anthropics, which several other LWers also seem to find promising. I believe anthropic reasoning is a specialization of yours, but you have mostly stayed out of the UDT-anthropics discussions. May I ask why?
I used to have the same viewpoint as your 2001 quote, but I think I'm giving it up. CDT, EDT, and TDT theorists agree that a coin flip is 50-50, so probability in general doesn't seem to be too dependent on decision theory.
I still agree that when you're confused, retreating to decisions helps. It can help you decide that it's okay to walk in the garage with the invisible dragon, and that it's okay for your friends to head out on a space expedition beyond the cosmological horizon. Once you've decided this, however, ideas like "there is no dragon" a...