jimrandomh comments on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions - Less Wrong

Post author: MichaelGR 11 November 2009 03:00AM




Comment author: jimrandomh 12 November 2009 02:44:20AM 16 points

What is the probability that this is the ultimate base layer of reality?

Comment author: MichaelHoward 12 November 2009 11:24:19PM 0 points

And then... Really? What would be a fair estimate if you were someone not especially likely to be simulated, living in a not particularly critical time, and there was only, say, a trillionth as much potential computronium lying around?

Comment author: MichaelBishop 12 November 2009 04:13:59PM 0 points

Could you explain in more detail what this means?

Comment author: jimmy 12 November 2009 07:13:05PM 2 points

I think he means "as opposed to living in a simulation (possibly in another simulation, and so on)"

This seems to be one of those questions that seem like they should have answers, but actually don't.

If there's at least one copy of you in "a simulation" and at least one in "base-level reality", then you're going to run into the same problems as the Sleeping Beauty problem, the absent-minded driver problem, etc., whenever you deal with 'indexical probabilities'.

There are decision-theory answers, but the ones that work don't mention indexical probabilities. This does make the situation a bit harder than, say, the Sleeping Beauty problem, since you have to figure out how to weight your utility function over multiple universes.
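The ambiguity in these indexical probabilities can be made concrete with a short simulation. This is an illustrative sketch (not from the thread): in the standard Sleeping Beauty setup, a fair coin is flipped; on heads Beauty is woken once, on tails twice, with no memory of previous awakenings. Counting per experiment gives the "halfer" answer for tails (~1/2), while counting per awakening gives the "thirder" answer (~2/3) — same process, two defensible reference classes, two different numbers. The function names here are made up for the example.

```python
import random

def sleeping_beauty(trials=100_000, seed=0):
    """Simulate Sleeping Beauty and return P(tails) under two
    different ways of counting: per experiment vs. per awakening."""
    rng = random.Random(seed)
    tails_experiments = 0
    awakenings = 0
    tails_awakenings = 0
    for _ in range(trials):
        tails = rng.random() < 0.5
        if tails:
            tails_experiments += 1
            awakenings += 2        # tails: woken Monday and Tuesday
            tails_awakenings += 2
        else:
            awakenings += 1        # heads: woken Monday only
    per_experiment = tails_experiments / trials    # "halfer": ~0.5
    per_awakening = tails_awakenings / awakenings  # "thirder": ~2/3
    return per_experiment, per_awakening

halfer, thirder = sleeping_beauty()
print(f"per experiment: {halfer:.3f}, per awakening: {thirder:.3f}")
```

The simulation never disagrees with itself about what happens; the disagreement is entirely about which events to count, which is why jimmy's point stands — the question dissolves rather than resolves, and the decision-theoretic versions sidestep it by asking what to *do* instead of what to *believe*.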