Jayson_Virissimo comments on Open thread, July 29-August 4, 2013 - Less Wrong Discussion
I have a question about the Simulation Argument.
Suppose that it's some point in the future, and we're able to run conscious simulations of our ancestors. We're considering whether or not to run such a simulation.
We are also curious about whether we are in a simulation ourselves, and we know that knowledge that civilizations like ours run ancestor simulations would be evidence for the proposition that we ourselves are in a simulation.
Could the choice at this point whether or not to run a simulation be used as a form of acausal control over the probability that we ourselves are living in a simulation?
Taboo "acausal control."
Hmm, okay, to put it another way: if we avoid running ancestor simulations for the purpose of maximizing the probability that we are not in a simulation, is it then valid, based on that very decision, to increase our credence that we are not in a simulation?
I think so. If we decided not to run a simulation, any would-be simulators analogous to us would also choose not to run a simulation, so you've eliminated a bunch of worlds where simulations are possible.
Only if those simulators are extremely similar to us. It may only take a very minor difference to decide to run simulations.
That is true, but irrelevant. Making the decision eliminates possible worlds in which we are simulations. Therefore we end up with fewer simulation-worlds out of our total list of potential future worlds, and thus our probability estimate of not being in a simulation must increase.
Or, to put it in Bayesian terms: the Bayes factor P(we chose not to run simulations | we're not in a simulation) / P(we chose not to run simulations) is greater than 1, so conditioning on our choice raises the posterior probability that we're not in a simulation above the prior.
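The update can be sketched numerically. Every number below is an illustrative assumption for the sake of the example, not a value anyone in the thread endorses:

```python
# Toy Bayesian update for the simulation argument.
# H = "we are NOT in a simulation"
# E = "we chose not to run ancestor simulations"
# All probabilities below are made-up illustrative assumptions.

prior_not_sim = 0.5        # P(H): prior that we are in base reality
p_e_given_not_sim = 0.9    # P(E|H): base-reality civilizations like ours decline to simulate
p_e_given_sim = 0.6        # P(E|not H): simulated civilizations decline (their simulators differed)

# Total probability of the evidence, by the law of total probability.
p_e = p_e_given_not_sim * prior_not_sim + p_e_given_sim * (1 - prior_not_sim)

# Posterior via Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
posterior_not_sim = p_e_given_not_sim * prior_not_sim / p_e

# The Bayes factor P(E|H)/P(E) exceeds 1, so the posterior rises above the prior.
bayes_factor = p_e_given_not_sim / p_e

print(bayes_factor)        # 1.2
print(posterior_not_sim)   # 0.6 (up from the 0.5 prior)
```

With these particular assumed numbers the decision moves credence in base reality from 0.5 to 0.6; with likelihoods closer together, the update shrinks toward nothing, which is exactly the point of the "by how much?" objection below.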
Sure, but by how much? If the ratio is something like 2 or even 5 or 10 this isn't going to matter much.
That's not the question.
That's the question, and the answer is "yes."
Unless you round sufficiently small increases down to zero, which is what people generally do. If somebody asked me that, and I estimated that the difference in probability was .00000000001, then I would answer "no".
That is granted. However, I'm also fairly sure (p = .75) that the probability isn't that small, because by deciding not to simulate a civilization yourself, you have greatly decreased the probability of being in an infinite descending chain. There remain one-off chance simulations and dynamic equilibria of nested simulations, but both of those are intuitively less dense in clones of your universe, so you've ruled out a significant fraction of possible simulation-worlds by deciding not to simulate yourself.
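The "descending chain" intuition can be made concrete with a deliberately crude toy model. Everything here is a hypothetical illustration, not a claim from the thread: assume each civilization independently runs exactly one ancestor simulation with probability q, and that our decision is perfectly correlated with every civilization in the chain.

```python
# Toy model of a nested ("descending chain") simulation hierarchy.
# Assumptions (illustrative only): one base world; each level runs exactly
# one ancestor simulation with probability q; our choice is perfectly
# correlated with the choice made at every level.

def p_simulated(q: float) -> float:
    """Fraction of civilizations in the chain that are simulated.

    The expected number of simulated descendants of the base world is the
    geometric sum q + q**2 + q**3 + ... = q / (1 - q), so the simulated
    fraction of all worlds is (q/(1-q)) / (1 + q/(1-q)), which simplifies to q.
    """
    assert 0 <= q < 1
    expected_descendants = q / (1 - q)
    return expected_descendants / (1 + expected_descendants)

print(p_simulated(0.5))  # 0.5: coin-flip simulators make half of all worlds simulated
print(p_simulated(0.0))  # 0.0: correlated refusal to simulate empties the chain entirely
```

Under this toy model, choosing not to simulate (q = 0) drives the chain's contribution to zero, which is the sense in which the decision "rules out" descending chains; the singleton-simulation and equilibrium scenarios mentioned above sit outside the model and would keep the true probability above zero.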