magfrump comments on The Irrationality Game - Less Wrong

38 Post author: Will_Newsome 03 October 2010 02:43AM

Comment author: magfrump 03 October 2010 04:43:38AM -2 points [-]

When it is technologically feasible for our descendants to simulate our world, they will not do so, because it will seem cruel (conditional on friendly descendants, such as an FAI or successful uploads with gradual adjustments to architecture). I would be surprised if it were different, but not THAT surprised. (~70%)

Comment author: Eugine_Nier 03 October 2010 04:52:26AM 1 point [-]

Upvoted because I disagree with your first statement.

Assuming reasonably complex values of "simulate", i.e., Second Life doesn't count.

Comment author: Relsqui 03 October 2010 07:18:58AM *  0 points [-]

I agree with you up 'til the first comma.

ETA: ... the only comma, I guess.

Comment author: Will_Newsome 03 October 2010 05:12:03AM 0 points [-]

Upvoted for disagreement: postulating that most of my measure comes from simulations helps resolve a host of otherwise incredibly confusing anthropic questions.

Comment author: Jonathan_Graehl 03 October 2010 07:12:16AM 3 points [-]

I'm sure there's more to it than came across in that sentence, but that sounds like shaky ground for belief.

Comment author: Will_Newsome 03 October 2010 07:23:09AM 1 point [-]

Scientifically it's bunk but Bayesically it seems sound to me. A simple hypothesis that explains many otherwise unlikely pieces of evidence.

That said, I do have other reasons, but explaining the intuitions would not fit within the margins of my time.

Comment author: Jonathan_Graehl 03 October 2010 06:00:00PM 0 points [-]

I like thinking about being in a simulation, and since it makes no practical difference (unless you go crazy and decide it's a good idea to test every possible means of 'praying' to any interested and intervening simulator god), I don't think we need to agree on the odds that we are simulated.

However, I'd say that it seems impossible to me to defend any particular choice of prior probability for the simulation vs. non-simulation cases. So while it matters how well such a hypothesis explains the data, I have no idea if I should be raising p(simulation) by 1000db from -10db or from -10000000db. If you have 1000db worth of predictions following from a disjunction over possible simulations, then that's of course super interesting and amusing even if I can't decide what my prior belief is.
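The decibel bookkeeping here can be made concrete. In log-odds terms, a decibel is 10·log10(odds), and a Bayesian update is just addition in that space. A minimal sketch (the helper names `db_to_prob` and `update` are illustrative, not from the thread) shows why the prior dominates: 1000 db of evidence moves a −10 db prior to near-certainty, but barely dents a −10,000,000 db prior.

```python
import math

def db_to_prob(db):
    """Convert log-odds in decibels (10 * log10(odds)) to a probability."""
    odds = 10 ** (db / 10)
    return odds / (1 + odds)

def update(prior_db, evidence_db):
    """A Bayesian update in log-odds space is simple addition of decibels."""
    return prior_db + evidence_db

# 1000 db of evidence applied to two very different priors:
for prior_db in (-10, -10_000_000):
    posterior_db = update(prior_db, 1000)
    print(f"prior {prior_db} db -> posterior {posterior_db} db, "
          f"p = {db_to_prob(posterior_db)}")
```

From the −10 db prior, the posterior (990 db) is indistinguishable from 1; from the −10,000,000 db prior, it is indistinguishable from 0, which is exactly the sense in which the choice of prior, not the strength of the evidence, settles the question.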