mlionson comments on Many Worlds, One Best Guess - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I think I see where we are disagreeing.
Consider a quantum computer. If the laws of physics say that only our lack of knowledge limits the complexity of a superposition, and the logic of quantum computation says that greater complexity of superposition yields exponentially greater computational capacity for certain kinds of computation, then it will be quite possible for a quantum computer sitting on a desktop to perform more calculations per second than there are atoms in the universe. My quote above from David Deutsch makes that point. Only the limits of our current knowledge prevent it.
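The scale in Deutsch's point can be checked with simple arithmetic: an n-qubit superposition has 2^n amplitudes, so a register of only a few hundred qubits already involves more amplitudes than the roughly 10^80 atoms usually estimated for the observable universe. A minimal sketch (the 10^80 figure is the standard back-of-envelope estimate, not a number from this thread):

```python
# Back-of-envelope: how many qubits until a superposition has more
# amplitudes (2**n) than the ~10^80 atoms in the observable universe?
ATOMS_IN_UNIVERSE = 10**80  # common rough estimate (assumption, not from the thread)

n = 0
while 2**n <= ATOMS_IN_UNIVERSE:
    n += 1

print(n)  # smallest n with 2**n > 10^80 -- a few hundred qubits suffice
```

Running this gives n = 266, since 266 * log10(2) is just over 80.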
When we have larger quantum computers, children will be programming universes with all the richness and diversity of our own, and no one will be arguing about the reality of the multiverse. If the capacity for superposition is virtually limitless, then the exponential possibilities are virtually limitless; and so will be our capacity to measure “counterfactual” states that are further and further evolved, such as dead cats with lower body temperatures. Why will the body temperature be lower? Why will the cat in that universe not (usually) come back to life?
As you say, because of the laws of thermodynamics. As our knowledge grows, the exponential increase in the computational capacity of the quantum computer will parallel an exponential increase in our ability to measure states that are decohering from our own and are further evolved, using what you call the “Everett camera”. I say “decohering from” rather than “decoherent from” because there is never a time when these states are completely thermodynamically separated. And the state vector evolves unitarily; we would no more expect it to run backwards than you would expect your own cat at home to go from a dead to an alive state.
I am afraid that whether we use an Everett camera or the one supplied to us by evolution (our neuropsychological apparatus), we are always interpreting reality through the lens of our theories. These theories are often useful from an evolutionary perspective but nonetheless misleading: absent logic and experiment, we are likely to perceive that the world is flat. It is equally easy to miss the existence of the multiverse because of the ruse of positivism: “I didn’t see the needle penetrate the skin in your quantum experiment, so it didn’t, or (even worse!) can’t, happen.” But of course when we do this experiment with ordinary needles, we never truly see the needle go in, either.
I have enjoyed this discussion.
I do think exponential parallelism is a good description of QC, because any adequate causal model of a quantum computation will invoke an exponential number of nodes in the explanation of the computation's output. Even if we can't always take full advantage of the exponential number of calculations being performed, because of the readout problem, it is nonetheless only possible to explain quantum readouts in general by postulating that an exponential number of parallel calculations went on behind the scenes.
Here, of course, "causal model" is to be taken in the technical Pearl sense of the term, a directed acyclic graph of nodes each of whose values can be computed from its parent nodes plus a background factor of uncertainty that is uncorrelated to any other source of uncertainty, etc. I specify this to cut off any attempt to say something like "well, but those other worlds don't exist until you measure them". Any formal causal model that explains the quantum computation's output will need an exponential number of nodes, since those nodes have real, causal effects on the final probability distribution over outputs.
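To make the exponential-nodes point concrete, here is a minimal state-vector sketch of my own (plain Python, not anything from the thread): applying a single Hadamard gate to one qubit of an n-qubit register pairs up and updates amplitudes across the whole 2^n-entry vector, so a classical causal account of the output has no way to avoid tracking exponentially many values.

```python
import math

def hadamard(state, k, n):
    """Apply a Hadamard gate to qubit k of an n-qubit state vector.

    The vector holds 2**n amplitudes; the gate mixes each pair of
    amplitudes whose indices differ only in bit k, so every amplitude
    participates -- these are the exponentially many causal nodes.
    """
    s = 1 / math.sqrt(2)
    out = list(state)
    for i in range(2**n):
        if not (i >> k) & 1:          # i has bit k = 0; j is its partner with bit k = 1
            j = i | (1 << k)
            a, b = state[i], state[j]
            out[i] = s * (a + b)
            out[j] = s * (a - b)
    return out

n = 3
state = [0.0] * 2**n
state[0] = 1.0                        # start in |000>
for k in range(n):                    # H on every qubit: uniform superposition
    state = hadamard(state, k, n)

print([round(a, 3) for a in state])   # 2**n equal amplitudes of 1/sqrt(2**n)
```

With n qubits the list has 2**n entries; doubling the register squares the number of amplitudes the simulation must carry, which is exactly the readout-independent exponential cost the comment describes.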