Stuart_Armstrong comments on The Anthropic Trilemma - Less Wrong

24 Post author: Eliezer_Yudkowsky 27 September 2009 01:47AM




Comment author: Stuart_Armstrong 27 September 2009 09:19:12AM 0 points [-]

Since I have a theory of Correlated decision making, let's use it! :-)

Let's look longer at the Nick Bostrom solution. How much contribution is there towards "feeling I will have won the lottery ten seconds from now" from "feeling I have currently won the lottery"? By the rules of this set-up, each of the happy copies contributes one trillionth towards that result.

(A quick and dirty argument to convince you of that: replace the current rules with one saying "we will take the average feeling of victory across the trillion copies". Since all the feelings are exactly correlated, this rule gives the same ultimate result, while making it clear that each copy contributes one trillionth of the final result.)
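The averaging argument can be sketched numerically (my own illustration, not from the comment; N is scaled down from a trillion for speed):

```python
# With N exactly correlated copies, the average "feeling of victory"
# equals each individual copy's feeling, yet each copy contributes
# exactly 1/N of that average.
N = 10**6  # stand-in for a trillion copies

feelings = [1.0] * N          # every copy feels it has won (exact correlation)
average = sum(feelings) / N   # the "average feeling of victory" rule

contribution_per_copy = feelings[0] / N  # each copy's share of the average

print(average)                # 1.0 -- same as any single copy's feeling
print(contribution_per_copy)  # 1e-06 -- one N-th of the final result
```

The rule change is invisible in the outcome (the average of identical values is that value), which is what makes each copy's one-N-th contribution explicit.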

Thus Nick's position is, I believe, correct.

As for dealing with the fourth horn, I've already written on how to do that: here you have partially correlated experiences contributing to future feelings of victory, which you should split into correlated and anti-correlated parts. Since the divergence is low, the anti-correlated parts have low probability, and the solution is approximately the same as before.

Comment author: Eliezer_Yudkowsky 27 September 2009 05:01:13PM 0 points [-]

So... what does it feel like to be merged into a trillion exact copies of yourself?

Answer: it feels like nothing, because you couldn't detect the event happening.

So in terms of what I expect to see happen next... if I've seen myself win the lottery, then in 10 seconds, I expect to still see evidence that I won the lottery. Even if, for some reason, I care about it less, that is still what I see... no?

Comment author: Stuart_Armstrong 27 September 2009 05:12:48PM *  1 point [-]

See my other reformulations; here there is no "feeling of victory". Instead, you have scenarios where one of the trillion copies is spared and the others are killed. Then your expectation - if you didn't know what the other trillion copies had been told or shown - is that there is a 1/trillion chance that the you in 10 seconds will still remember evidence that he has won the lottery.

You can only say with certainty that the you in 10 seconds will remember winning the lottery because you know that all the other copies also remember winning the lottery. Their contributions bump it up to unity.
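The two cases can be sketched numerically (my own illustration of the contribution accounting, with N scaled down from a trillion for speed):

```python
# Each copy contributes 1/N toward the expectation "the me in 10
# seconds remembers winning the lottery".
N = 10**6  # stand-in for a trillion copies

# Not knowing what the other copies were told or shown, only your own
# copy's memory counts toward the expectation: probability 1/N.
p_one_copy = 1 / N

# Knowing that all N copies remember winning, the N contributions of
# 1/N each sum to unity.
p_all_copies = sum(1 / N for _ in range(N))

print(p_one_copy)              # 1e-06
print(round(p_all_copies, 9))  # 1.0 (up to floating-point rounding)
```

The gap between the two printed values is exactly the "bump to unity" from the other copies' correlated memories.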