Vladimir_Nesov comments on Why Many-Worlds Is Not The Rationally Favored Interpretation - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The theories actually used in particle physics can generally be obtained by starting with some classical field theory and then "quantizing" it. You go from something described by straightforward differential equations (the classical theory) to a quantum theory on the configuration space of the classical theory, with an uncertainty principle, probability amplitudes, and so forth. There is a formal procedure in which you take the classical differential equations and reinterpret them as "operator equations" describing relationships between the elements of the Schrodinger equation of the resulting quantum field theory.
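For concreteness, here is the textbook form of that procedure for a free scalar field (a standard illustration, not anything specific to the theories discussed below): the classical field equation is kept verbatim, but reinterpreted as an equation on field operators, supplemented by canonical commutation relations.

```latex
% Classical field equation (Klein--Gordon):
(\Box + m^2)\,\phi(x) = 0
% After quantization, the same equation is read as an operator equation
% on \hat{\phi}, together with equal-time commutation relations:
(\Box + m^2)\,\hat{\phi}(x) = 0,
\qquad
[\hat{\phi}(\mathbf{x},t),\,\hat{\pi}(\mathbf{y},t)]
  = i\hbar\,\delta^{3}(\mathbf{x}-\mathbf{y})
```

The differential equation survives unchanged; what changes is what kind of object it governs, and the commutator is where the uncertainty principle and probability amplitudes enter.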
Many-worlds, being a theory which says that the universal wavefunction is the fundamental reality, starts with a quantum perspective and then tries to find the observable quasi-classical reality somewhere within it. However, given that the quantum theories we actually use have not just a historical but a logical relationship to corresponding classical theories, you can start at the other end and try to understand quantum theory in basically classical terms, only with something extra added. This is what Hadley is doing. His hypothesis is that the rigmarole of quantization is nothing but the modification to probability theory required when a classical field theory is coupled to general relativity, because microscopic time-loops ("closed timelike curves") introduce certain constraints on the possible behavior of quantities which are otherwise causally disjoint ("spacelike separated"). To reduce it all to a slogan: Hadley's theory is that quantum mechanics = classical mechanics + loops in time.
There are lots of people out there who want to answer big questions in a simple way. Usually you can see where they go wrong. In Hadley's case I can't, nor has anyone else rebutted the proposal. Superficially it makes sense, but he really needs to exactly re-derive the Schrodinger equation somehow, and maybe he can't do that without a much better understanding (than anyone currently possesses) of "non-orientable 4-manifolds". To put it yet another way: he is saying that the Schrodinger equation is the appropriate approximate framework for describing the propagation of particles and fields on such manifolds.
Hadley's theory is one member of a whole class of theories according to which complex numbers show up in quantum theory because you're conditioning on the future as well as on the past. I am not aware of any logical proof that complex-valued probabilities are the appropriate formalism for such a situation. But there is an intriguing formal similarity between quantum field theory in N space dimensions and statistical mechanics in N+1 dimensions. It is as if, when you think about initial and final states of an evolving wavefunction, you should think about events in the intermediate space-time volume as having local classically-probabilistic dependencies both forwards and backwards in time - and these add up to chained dependencies in the space-like direction, as you move infinitesimally forward along one light-cone and then infinitesimally backward along another - and the initial and final wavefunctions are boundary conditions on this chunk of space-time, with two components (real and imaginary) everywhere corresponding to forward-in-time and backward-in-time dependencies.
This sort of idea has haunted physics for decades - it's in "Wheeler-Feynman absorber theory", in Aharonov's time-symmetric quantum mechanics (where you have two state vectors, one evolving forwards and one evolving backwards)... and to date it has neither been vindicated nor debunked, as a possible fundamental explanation of quantum theory.
Turning now to your final questions: perhaps it is a little clearer now that you do not need magic to avoid many worlds at the macro level; you need only an interpretation of micro-level superposition which does not involve two-things-in-the-one-place. According to these zigzag-in-time theories, micro-level superposition is a manifestation of a weave of causal/probabilistic dependencies oriented in two time directions, into the past and into the future. Like ordinary probability, it is mere epistemic uncertainty, only in an unusual formalism; in actuality the quantum object is only ever in one state or the other.
Now let's consider Bohm's theory. How does a quantum computer work according to Bohm? As normally understood, Bohm's theory says you have a universal wavefunction and a classical world, and the evolution of the classical world is guided by the wavefunction. So a Bohmian quantum computer works because the wavefunction is part of the theory. However, the conceptually interesting reformulation of Bohm's theory is one where the wavefunction is treated as just a law of motion, rather than as a thing in itself. The Bohmian law of motion for the classical world is that it follows the gradient of the complex phase in configuration space. But if you calculate that through, for a particular universal wavefunction, what you get is the classically local potential exhibited by the classical theory from which your quantum theory was mathematically derived, plus an extra nonlocal potential. The point is that Bohmians do not strictly need to posit wavefunctions at all - they can just talk about the form of that nonlocal potential. So, though no-one has done it, there is going to be a neo-Bohmian explanation of how a quantum computer works in which qubits don't actually go into superposition and the nonlocal dynamics somehow (paging Dr Aaronson...) gives you that extra power.
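The "gradient of the complex phase" remark can be made explicit with the standard polar decomposition of the wavefunction (textbook Bohmian mechanics, sketched here for a single particle; nothing beyond that is assumed):

```latex
% Polar decomposition of the wavefunction:
\psi = R\, e^{iS/\hbar}
% Bohmian guidance equation: the configuration follows the phase gradient:
\dot{\mathbf{q}} = \frac{\nabla S}{m}
% Substituting into the Schrodinger equation yields a classical
% Hamilton--Jacobi equation with one extra term, the "quantum potential":
\partial_t S + \frac{(\nabla S)^2}{2m} + V + Q = 0,
\qquad
Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R}
```

$V$ is the ordinary classically local potential; for many particles these equations live on configuration space and $Q$ depends on the positions of all the particles at once - that is the extra nonlocal potential referred to above.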
To round this out, I want to say that my personally preferred interpretation is none of the above. I'd prefer something like this so I can have my neo-monads. In a quasi-classical, space-time-based one-world interpretation, like Hadley's theory or neo-Bohmian theory, Hilbert space is not fundamental. But if we're just thinking about what looks promising as a mathematical theory of physics, then I think those options have to be mentioned. And maybe consideration of them will inspire hybrid or intermediate new theories.
I hope this all makes clear that there is a mountain of undigested complexity in the theoretical situation. Experiment has not validated many-worlds; it has validated quantum mechanics, and many-worlds is just one interpretation thereof. If the aim is to "think like reality", the epistemic reality is that we're still thinking it through and do not know which interpretation, if any, is correct.
Could you give a couple of keywords/entry points/references for the zig-zag thingie?
John Cramer and the transactional interpretation are by far the most prominent example. Wheeler-Feynman absorber theory was the historical precursor; also see "Feynman checkerboard". Mark Hadley I mentioned. Aharonov-Vaidman for the "two state vector" version of QM, which is in the same territory. Costa de Beauregard was another physicist with ideas in this direction.
This paper is just one place where a potentially significant fact is mentioned, namely that quantum field theory with an imaginary time coordinate (also called "Euclidean field theory" because the metric thereby becomes Euclidean rather than Riemannian) resembles the statistical mechanics of a classical field theory in one higher dimension. See the remark about how "the quantum mechanical amplitude" takes on "the form of a Boltzmann probability weight". A number of calculations in quantum field theory and quantum gravity actually use Euclideanized metrics, but just because the integrals are easier to solve there; then you do an analytic continuation back to Minkowski space and real-valued time. The holy grail for this interpretation, as far as I am concerned, would be to start with Boltzmann and derive quantum amplitudes, because it would mean that you really had justified quantum mechanics as an odd specialization of standard probability theory. But this hasn't been done and perhaps it can't be done.
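The Euclidean correspondence being alluded to is the standard Wick rotation (nothing specific to any of the interpretations above is assumed here): substituting imaginary time into the path-integral weight turns a quantum amplitude into something with the form of a Boltzmann factor.

```latex
% Minkowski path-integral weight for a field theory in N space dimensions:
e^{\,iS[\phi]/\hbar},
\qquad
S = \int dt\, d^{N}x\; \mathcal{L}
% Wick rotation t = -i\tau gives the Euclidean weight:
e^{-S_E[\phi]/\hbar},
\qquad
S_E = \int d\tau\, d^{N}x\; \mathcal{L}_E
% which has the form of a Boltzmann factor e^{-\beta H} for a classical
% statistical system in N+1 dimensions.
```

Running this correspondence in reverse - starting from a genuinely classical Boltzmann weight and deriving complex-valued amplitudes - is exactly the unfinished step described above.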
I think that you mean Euclidean rather than Minkowskian. Euclidean vs Riemannian has to do with whether spacetime is curved (Euclidean no, Riemannian yes), while Euclidean vs Minkowskian has to do with whether the metric treats the time coordinate differently (Euclidean no, Minkowskian yes). (And then the spacetime of classical general relativity, which answers both questions yes, is Lorentzian.)
Also, the book by Huw Price.
That is an excellent book even if one ignores the QM part. (In fact, I found that part the weakest, although perhaps I would understand it better now.)