JGWeissman comments on Open Thread: September 2009 - Less Wrong
A simulation hypothesis such as "our universe is a simulation" is not falsifiable even given perfect knowledge of the universe at some point in time; the universe might have a definite beginning and end and be simulated perfectly the whole way through, leaving nothing observable to distinguish the two cases. Therefore, I'll use the following definition of the simulation hypothesis: "The best description of the universe as we are capable of observing it describes our observations as happening entirely within a simulation crafted by optimizing processes."
Let's assume for convenience that "the" prior over laws of physics is P, and call the distribution of universes that optimizing processes would simulate P'. The only necessary difference between P and P' is that P' is biased toward universes that are easy and/or useful to simulate. How easy a universe is to simulate in general can probably be estimated by how easy it is to simulate from within itself. We have quantum mechanics, but quantum computers have been late in coming, which suggests our universe is difficult to simulate. As for utility: evolution optimizes for things that themselves optimize for reproduction, but it also produces optimization toward essentially random targets. We can ignore the random targets and ask how useful our universe is for reproduction. My guess is that the universe, since it seems to involve a great deal of pointless computation, is not good for that.
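(As a rough sketch of how this comparison could be made precise, with notation I'm introducing purely for illustration: write L for the laws we actually observe and s for the hypothesis that we are simulated. Bayes' rule then gives

\[ P(s \mid L) = \frac{P(s)\,P'(L)}{P(s)\,P'(L) + \bigl(1 - P(s)\bigr)\,P(L)}, \]

so the update turns entirely on the ratio P'(L)/P(L): how much more or less likely our particular physics is under the "simulated" distribution than under the bare prior.)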
So, given the above, I'd estimate the probability as being... oh, how does 20% sound?
Now, of course, the other thing to look for in a simulated universe is simulation artifacts: things that seem not to follow the laws of physics, and behaviors that are only approximations to how things should behave. Suffice it to say, we haven't seen any of those.
Quantum computers are computers that use quantum superposition for parallel processing, and they are not required for simulating quantum mechanics. And our "classical" computers do in fact take advantage of quantum mechanics, since classical physics does not allow for solid-state transistors.
It seems that quantum computers are required for simulating quantum mechanics in sub-exponential time, though.
When discussing asymptotic algorithmic complexity, you should specify the parameter in which the problem size grows.
The usual default parameter is the number of bits it takes to write down the problem. It could also be the number of particles. Either one works in this case.
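(As a minimal illustration of why the classical cost is exponential in the number of particles — a sketch in Python, assuming 16 bytes per complex amplitude — the state vector of n two-level particles has 2^n complex amplitudes, so merely storing it, let alone evolving it, blows up quickly.)

    # Memory needed just to store the state vector of n two-level particles
    # (qubits), assuming 16 bytes per complex amplitude (complex128).
    def state_vector_bytes(n_particles: int) -> int:
        return 16 * 2 ** n_particles

    for n in (10, 30, 50):
        print(f"{n} particles: {state_vector_bytes(n):,} bytes")
    # 10 particles: ~16 KB; 30 particles: ~16 GiB; 50 particles: ~16 PiB.
    # Storage alone grows as 2^n, which is why brute-force classical
    # simulation is exponential in the number of particles.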
What quantum algorithm for simulating quantum mechanics takes sub-exponential time with respect to the number of particles?
I didn't have a particular algorithm in mind when I said that, but since you ask, I went and found this one.