
Jayson_Virissimo comments on Open thread, July 29-August 4, 2013 - Less Wrong Discussion

3 Post author: David_Gerard 29 July 2013 10:26PM




Comment author: ESRogs 30 July 2013 04:10:56AM 6 points [-]

I have a question about the Simulation Argument.

Suppose that it's some point in the future and we're able to run conscious simulations of our ancestors. We're considering whether or not to run such simulations.

We are also curious about whether we are in a simulation ourselves, and we know that learning that civilizations like ours run ancestor simulations would be evidence for the proposition that we ourselves are in a simulation.

Could the choice at this point whether or not to run a simulation be used as a form of acausal control over the probability that we ourselves are living in a simulation?

Comment author: Jayson_Virissimo 30 July 2013 05:08:27AM 4 points [-]

Taboo "acausal control."

Comment author: ESRogs 30 July 2013 09:57:29AM 4 points [-]

Hmm, okay, to put it another way -- if we avoid running ancestor simulations in order to maximize the probability that we are not in a simulation, is it then valid, on the basis of that choice, to increase our credence that we are not in a simulation?

Comment author: linkhyrule5 30 July 2013 10:42:27PM 2 points [-]

I think so. If we decided not to run a simulation, any would-be simulators analogous to us would also choose not to run one, so you've eliminated a bunch of worlds in which simulations are possible.

Comment author: JoshuaZ 31 July 2013 12:52:05AM 0 points [-]

Only if those simulators are extremely similar to us. A very minor difference might be enough to make them decide to run simulations after all.

Comment author: linkhyrule5 31 July 2013 01:07:49AM 1 point [-]

That is true, but irrelevant. Making the decision eliminates possible worlds in which we are simulations. Therefore we end up with fewer simulation-worlds in our total list of potential future worlds, and so our estimate of the probability that we are not in a simulation must increase.

Or, to put it in Bayesian terms: the likelihood ratio P(we chose not to run simulations | we're not in a simulation) / P(we chose not to run simulations) is greater than 1, so conditioning on that choice raises P(we're not in a simulation) above its prior.
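The update being claimed here can be sketched numerically. This is only an illustration -- the prior and the two likelihoods below are made-up numbers, not estimates anyone in the thread endorses; the point is just that whenever P(E | H) / P(E) > 1, the posterior exceeds the prior.

```python
# Illustrative Bayesian update for the thread's claim, with made-up numbers.
# Hypothesis H: "we are NOT in a simulation."
# Evidence E: "we chose not to run ancestor simulations."

prior_not_sim = 0.5            # assumed prior P(H)
p_choice_given_not_sim = 0.6   # assumed likelihood P(E | H)
p_choice_given_sim = 0.4       # assumed likelihood P(E | not H)

# Total probability of the evidence, P(E).
p_choice = (p_choice_given_not_sim * prior_not_sim
            + p_choice_given_sim * (1 - prior_not_sim))

# Bayes' theorem: P(H | E) = P(H) * P(E | H) / P(E).
posterior_not_sim = prior_not_sim * p_choice_given_not_sim / p_choice

# The likelihood ratio P(E | H) / P(E) is 0.6 / 0.5 > 1,
# so the posterior must exceed the prior.
assert posterior_not_sim > prior_not_sim
print(posterior_not_sim)
```

With these particular numbers the posterior comes out to 0.6, up from the 0.5 prior; the size of the shift depends entirely on how strongly the choice correlates with not being simulated.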

Comment author: JoshuaZ 31 July 2013 01:14:55AM 0 points [-]

Sure, but by how much? If the ratio is something like 2, or even 5 or 10, this isn't going to matter much.

Comment author: linkhyrule5 31 July 2013 02:30:00AM 0 points [-]

That's not the question.

if we avoid running ancestor simulations in order to maximize the probability that we are not in a simulation, is it then valid, on the basis of that choice, to increase our credence that we are not in a simulation?

That's the question, and the answer is "yes."

Comment author: Kaj_Sotala 31 July 2013 04:05:36PM 1 point [-]

Unless you round sufficiently small increases down to zero, which is what people generally do. If somebody asked me that, and I estimated that the difference in probability was 0.00000000001, then I would answer "no".

Comment author: linkhyrule5 31 July 2013 07:00:58PM 0 points [-]

That is granted. However, I'm also fairly sure (p = .75) that the probability isn't that small, because by deciding not to simulate a civilization yourself, you have greatly decreased the probability of being in an infinite descending chain. There remain singleton chance simulations and dynamic equilibria of nested simulations, but both are intuitively less dense in clones of your universe -- so you've ruled out a significant fraction of possible simulation-worlds by deciding not to run simulations yourself.