Stuart_Armstrong comments on Dead men tell tales: falling out of love with SIA - Less Wrong

Post author: Stuart_Armstrong 18 February 2011 02:10PM · 2 points




Comment author: Stuart_Armstrong 18 February 2011 06:51:20PM 1 point [-]

This post doesn't destroy SIA. It just destroys the argument that I had found strongest in its favour.

Comment author: Manfred 18 February 2011 07:59:02PM 1 point [-]

Huh. I've always favored the principle of indifference (that equal information states should have equal probability) myself.

Comment author: entirelyuseless 11 January 2016 02:27:56PM *  0 points [-]

How exactly does it destroy that argument? It does look like this post is arguing about the question of what odds you should bet at, not about the question of what you think is likely the case. These are not exactly the same thing. I would be willing to bet any amount, at any odds, that the world will still exist 10 years from now, or 1000 years from now, but that doesn't mean that I am confident that it will. It simply means I know I can't lose that bet, since if the world doesn't exist, neither will I or the person I am betting with.

(I agree that the other post was mistaken, and I think it went from a 99% probability in A, B, and C, to a 50% probability in the remaining scenarios.)

Comment author: Stuart_Armstrong 12 January 2016 10:48:27AM 0 points [-]

I think my old post here has the core of the argument: http://lesswrong.com/lw/18r/avoiding_doomsday_a_proof_of_the_selfindication/14vy

But I no longer consider anthropic probabilities to have any meaning at all; see for instance https://www.youtube.com/watch?v=aiGOGkBiWEo

Comment author: entirelyuseless 12 January 2016 01:57:01PM 0 points [-]

Ok. I watched the video. I still disagree with that, and I don't think it's arbitrary to prefer SSA to SIA. I think that follows necessarily from the consideration that you could not have noticed yourself not existing.

In any case, whatever you say about probability, being surprised is something that happens in real life. And if someone did the Sleeping Beauty experiment on me in real life, but so that the difference was between 1/100,000 and 1/2, and then asked me if I thought the coin was heads or tails, I would say I didn't know. And then if they told me it was heads, I would not be surprised. That shows that I agree with the halfer reasoning and disagree with the thirder reasoning.

Whether or not it makes sense to put numbers on it, either you're going to be surprised at the result or not. And I would apply that to basically every case of SSA argument, including the Doomsday argument; I would be very surprised if 1,000,000 years from now humanity has spread all over the universe.
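[Editor's note: the halfer/thirder disagreement above comes down to which things you count: experiments or awakenings. A minimal simulation of the standard Sleeping Beauty protocol (one awakening on heads, two on tails; not part of the original thread) makes the two numbers concrete:]

```python
import random

def sleeping_beauty(trials=100_000, seed=0):
    """Simulate the standard Sleeping Beauty protocol:
    heads -> one awakening, tails -> two awakenings."""
    rng = random.Random(seed)
    heads_flips = 0        # experiments that came up heads
    heads_awakenings = 0   # awakenings where the coin was heads
    total_awakenings = 0
    for _ in range(trials):
        heads = rng.random() < 0.5
        if heads:
            heads_flips += 1
            heads_awakenings += 1
            total_awakenings += 1
        else:
            total_awakenings += 2
    halfer = heads_flips / trials                  # per-experiment frequency of heads
    thirder = heads_awakenings / total_awakenings  # per-awakening frequency of heads
    return halfer, thirder

halfer, thirder = sleeping_beauty()
print(f"halfer ~ {halfer:.3f}, thirder ~ {thirder:.3f}")
```

The halfer answer (~1/2) counts coin flips; the thirder answer (~1/3) counts awakenings. In the 1/100,000-vs-1/2 variant entirelyuseless describes, the same counting disagreement just becomes far more lopsided.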

Comment author: ChristianKl 13 January 2016 12:31:05PM 0 points [-]

> In any case, whatever you say about probability, being surprised is something that happens in real life.

As someone who has actually experienced, in real life, what it feels like to wake from an artificial coma with multiple days missing from memory, I think your naive intuition about what would surprise you has no basis.

Being surprised happens at the System 1 level, and System 1 has no notion of having been in an artificial coma.

Comment author: entirelyuseless 13 January 2016 12:32:28PM 0 points [-]

If System 1 has no notion of having been in an artificial coma, then there is no chance I would be surprised by either heads or tails, which supports my point.

Comment author: ChristianKl 13 January 2016 01:40:04PM 0 points [-]

No. System 1's model of the world is that the time that passed was just the time of a normal sleep between two days. Anything that deviates from that is highly surprising.

Comment author: Stuart_Armstrong 13 January 2016 11:59:53AM 0 points [-]

Yes, but if we have SB problems all over the place and were commonly exposed to them, what would our sense of surprise evolve to?