Eliezer_Yudkowsky comments on The Anthropic Trilemma - Less Wrong

Post author: Eliezer_Yudkowsky 27 September 2009 01:47AM


Comment author: Eliezer_Yudkowsky 27 September 2009 05:01:13PM 0 points

So... what does it feel like to be merged into a trillion exact copies of yourself?

Answer: it feels like nothing, because you couldn't detect the event happening.

So in terms of what I expect to see happen next... if I've seen myself win the lottery, then in 10 seconds, I expect to still see evidence that I won the lottery. Even if, for some reason, I care about it less, that is still what I see... no?

Comment author: Stuart_Armstrong 27 September 2009 05:12:48PM 1 point

See my other reformulations; here there is no "feeling of victory". Instead, you have scenarios where all but one of the trillion copies are spared and the others are killed. Then your expectation, if you didn't know what the other trillion copies had been told or shown, is that there is a 1-in-a-trillion chance that the you in 10 seconds will still remember evidence that he has won the lottery.

You can only say that the you in 10 seconds will remember winning the lottery with certainty because you know that all the other copies also remember winning the lottery. Their contributions bump it up to unity.
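The arithmetic here can be sketched as weighting each copy equally, so the anticipation of remembering the win is just the fraction of copies that remember it. This is a minimal illustrative sketch of that accounting (the function name and equal weighting are my assumptions, not from the thread):

```python
def p_remember_winning(copies_remembering, total_copies):
    """Chance that 'the you in 10 seconds' remembers winning the lottery,
    assuming each copy contributes equal anticipation weight."""
    return copies_remembering / total_copies

TRILLION = 10**12

# If only one copy out of a trillion would remember the win,
# anticipation is 1-in-a-trillion:
print(p_remember_winning(1, TRILLION))

# If every copy remembers winning, the contributions sum to unity:
print(p_remember_winning(TRILLION, TRILLION))
```

Each copy's 1-in-a-trillion contribution is negligible on its own; it is only because all trillion contributions point at the same memory that the total reaches certainty.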