This article should really be called "Patching the argumentative flaw in the Sequences created by the Quantum Physics Sequence".
There's only one big thing wrong with that Sequence: the central factual claim is wrong. I don't mean the claim that the Many Worlds interpretation is correct; I mean the claim that the Many Worlds interpretation is obviously correct. I don't agree with the ontological claim either, but I especially don't agree with the epistemological claim. It's a strawman which reduces the quantum debate to Everett versus Bohr - well, it's not really Bohr, since Bohr didn't believe wavefunctions were physical entities. Everett versus Collapse, then.
I've complained about this from the beginning, simply because I've also studied the topic and profoundly disagree with Eliezer's assessment. What I would like to see discussed on this occasion is not the physics, but rather how to patch the arguments in the Sequences that depend on this wrong sub-argument. To my eyes, this is a highly visible flaw, but it's not a deep one. It's a detail, a bug. Surely it affects nothing of substance.
However, before I proceed, I'd better back up my criticism. So: consider the existence of single-world retrocausal interpretations of quantum mechanics, such as John Cramer's transactional interpretation, which is descended from Wheeler-Feynman absorber theory. There are no superpositions, only causal chains running forward in time and backward in time. The calculus of complex-valued probability amplitudes is supposed to arise from this.
The existence of the retrocausal tradition already shows that the debate has been represented incorrectly; it should at least be Everett versus Bohr versus Cramer. I would also argue that when you look at the details, many-worlds has no discernible edge over single-world retrocausality:
- Relativity isn't an issue for the transactional interpretation: causality forwards and causality backwards are both local; it's the existence of loops in time that creates the appearance of nonlocality.
- Retrocausal interpretations don't have an exact derivation of the Born rule, but neither does many-worlds.
- Many-worlds finds hope of such a derivation in a property of the quantum formalism: the resemblance of density matrix entries to probabilities. But single-world retrocausality finds such hope too: the Born probabilities can be obtained from the product of ψ with ψ*, its complex conjugate, and ψ* is the time reverse of ψ (see the sketch just after this list).
- Loops in time just fundamentally bug some people, but splitting worlds have the same effect on others.
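To make the third bullet concrete, here is the suggestive identity I have in mind, in a minimal sketch. It is only an identity, not a derivation of the Born rule, and the reading of ψ* as a backwards-in-time wave assumes the usual situation (a spinless particle, real Hamiltonian) in which complex conjugation implements time reversal.

```latex
% Born probability as a product of a wave and its conjugate:
\[
  P(x) \;=\; \psi(x)\,\psi^*(x) \;=\; |\psi(x)|^2 .
\]
% Time reversal for the Schrodinger equation with a real Hamiltonian H:
% if psi(x,t) is a solution, so is its conjugate run backwards in time,
\[
  i\hbar\,\partial_t \psi(x,t) = H\,\psi(x,t)
  \quad\Longrightarrow\quad
  i\hbar\,\partial_t \big[\psi^*(x,-t)\big] = H\,\psi^*(x,-t),
\]
% which is why retrocausal readings gloss P = psi * psi-conjugate as
% "forward wave times backward wave".
```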
I am not especially an advocate of retrocausal interpretations. They are among the possibilities; they deserve consideration and they get it. Retrocausality may or may not be an element of the real explanation of why quantum mechanics works. Progress towards the discovery of the truth requires exploration on many fronts; that's happening, and we'll get there eventually. I have focused on retrocausal interpretations here just because they offer the clearest evidence that the big picture offered by the Sequence is wrong.
It's hopeless to suggest rewriting the Sequence; I don't think that would be a good use of anyone's time. But what I would like to have is a clear idea of the role that "the winner is ... Many Worlds!" plays in the overall flow of argument, in the great meta-sequence that is Less Wrong's foundational text; and I would also like a clear idea of how to patch the argument, so that it routes around this flaw.
The wiki states that "Cleaning up the old confusion about QM is used to introduce basic issues in rationality (such as the technical version of Occam's Razor), epistemology, reductionism, naturalism, and philosophy of science." So there we have it - a synopsis of the function that this Sequence is supposed to perform. Perhaps we need a working group that will identify each of the individual arguments and come up with a substitute for each one.
The framework of Wheeler-Feynman theory is just classical Maxwell electrodynamics with waves that converge on a charged particle as well as waves that spread from a charged particle. So it ought to be just as relativistic and local and deterministic as it usually is, except that now you're interested in solutions that have two oppositely directed arrows of time, rather than just one. (Remember that the equations themselves are time-symmetric, so "emission" of radiation can, in principle, run in either direction.)
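For readers who want the schematic form of that setup: in the Wheeler-Feynman picture the field attached to a given charge is the time-symmetric half-sum of the retarded and advanced solutions, and ordinary one-way radiation is recovered only after summing the response of all the absorbing charges in the universe. This is just the standard textbook statement, nothing new:

```latex
% Time-symmetric field of a single source in Wheeler-Feynman electrodynamics:
\[
  A^{\mu}_{\text{source}}
  \;=\; \tfrac{1}{2}\,A^{\mu}_{\text{ret}} \;+\; \tfrac{1}{2}\,A^{\mu}_{\text{adv}} .
\]
% With a perfect absorber, the summed response of all the other charges
% contributes (1/2)(A_ret - A_adv), so the field felt far from the source
% looks purely retarded, while the same response term acting back on the
% source supplies the radiation-damping force.
```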
In practice, Wheeler and Feynman hacked the theory to remove the self-interaction of particles (a particle absorbing its own emissions at an earlier or later time), because self-interaction produced incalculable infinite forces; but then they were unable to account for the Lamb shift, which does come from self-interaction; and then somehow Feynman made the leap to path integrals, and in the quantum framework the infinities of self-interaction could be dealt with through renormalization.
It may seem like a big leap from the classical to the quantum picture. But classical dynamics can be expressed as wave motion in configuration space via the Hamilton-Jacobi equation, and it's not a big step from the HJE to quantum mechanics. Also, doing anything practical with path integrals usually involves working with classical solutions to the equation of motion, which in the quantum theory have high amplitude, and then looking at corrections which come from neighboring histories.
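To spell out the step from the Hamilton-Jacobi equation to quantum mechanics: the HJE already treats classical mechanics as the propagation of a wave of action S over configuration space, and the standard Madelung/Bohm substitution shows how small the remaining gap is. A minimal single-particle sketch, straight from the textbooks:

```latex
% Classical Hamilton-Jacobi equation for one particle in a potential V:
\[
  \frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V \;=\; 0 .
\]
% Writing psi = R e^{iS/hbar} and substituting into the Schrodinger
% equation reproduces the HJE plus a single hbar-dependent correction
% (the "quantum potential"), together with a continuity equation for R^2:
\[
  \frac{\partial S}{\partial t} + \frac{(\nabla S)^2}{2m} + V
  \;-\; \frac{\hbar^2}{2m}\,\frac{\nabla^2 R}{R} \;=\; 0 ,
  \qquad
  \frac{\partial (R^2)}{\partial t}
  + \nabla\!\cdot\!\Big(R^2\,\frac{\nabla S}{m}\Big) \;=\; 0 .
\]
```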
It's quite conceivable that these quantum deviations from classicality may result from the interference of forward causality and retrocausality. Maybe the Wheeler-Feynman theory just needs some extra ingredient, like micro time loops from general relativity, in order to become consistent. We would be dealing with a single-world model which is locally causal but not globally causal, in the sense that the future would also be shaped by the distribution of micro time loops, and that distribution is not determined by the present state of the world. Our world would be one of an ensemble of self-contained, globally consistent "classical" histories, and the quantum probability calculus (including the Born rule) would just turn out to be how to do probability theory in a world where influences come from the future as well as from the past. For example, the Aharonov "two-state-vector formalism" might show up as the way to do statistical mechanics if you know yourself to be living in such an ensemble. There would be no ontological superpositions. Wavefunctions would just be "probability distributions with a future component".
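For concreteness, the kind of formula I have in mind from the two-state-vector tradition is the ABL rule, which assigns probabilities to an intermediate outcome given both a pre-selected (past) state and a post-selected (future) state. In an ensemble picture like the one just described, something of this shape would simply be conditional probability with a future boundary condition; I'm quoting the standard formula, not deriving anything:

```latex
% ABL rule: probability of the intermediate outcome a_j (projector P_j),
% given pre-selected state |psi> and post-selected state |phi>:
\[
  \Pr(a_j \mid \psi, \phi)
  \;=\;
  \frac{\big|\langle \phi \,|\, P_j \,|\, \psi \rangle\big|^{2}}
       {\sum_k \big|\langle \phi \,|\, P_k \,|\, \psi \rangle\big|^{2}} .
\]
% The associated "weak value" of an observable A is
% A_w = <phi| A |psi> / <phi|psi>.
```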
The status of these speculations is remarkably similar to the status of many worlds. The construction of an exact theory along these lines, with a clear explanation of how it connects to reality, remains elusive, but you can assemble suggestive facts from the quantum formalism to make it plausible, and there is a long tradition of people trying to make it work, one way or another: Wheeler and Feynman, John Cramer, Yakir Aharonov.
Practical QM contains the dualism of wavefunctions and classical observables. Many worlds reifies just the wavefunction and tries to find the observables in it. Retrocausality just keeps the classical part and tries to explain the wavefunction as something to do with forwards and backwards causality. Bohmian mechanics keeps the wavefunction and then fleshes out the classical part in a way governed by the wavefunction. Nomological Bohmian mechanics keeps the classical part of Bohmian mechanics, and replaces the wavefunction with an additional nonlocal potential in the classical equations of motion. If you could obtain that nonlocal potential from a local retrocausal theory, you would finally have an exact, single-world, deterministic explanation of quantum mechanics.
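To make those last few sentences concrete: in ordinary Bohmian mechanics the classical part is a set of particle positions guided by the wavefunction, and the nomological move amounts to rewriting the wavefunction's influence as a nonlocal potential in otherwise classical equations of motion. Schematically, in the standard formulation:

```latex
% Guidance equation of ordinary Bohmian mechanics (k-th particle):
\[
  \frac{d\mathbf{q}_k}{dt}
  \;=\; \frac{\hbar}{m_k}\,
        \operatorname{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)
        \!(\mathbf{q}_1,\dots,\mathbf{q}_N, t) .
\]
% Equivalent second-order ("quantum potential") form, which is the sort of
% structure a nomological reading keeps while eliminating the wavefunction:
\[
  m_k\,\ddot{\mathbf{q}}_k \;=\; -\nabla_k\,(V + Q),
  \qquad
  Q \;=\; -\sum_j \frac{\hbar^2}{2 m_j}\,\frac{\nabla_j^2 |\psi|}{|\psi|} .
\]
% Obtaining Q (or whatever plays its role) from a local retrocausal theory
% is the step that is still missing.
```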