Eliezer recently posted an essay on "the fallacy of privileging the hypothesis". What it's really about is the fallacy of privileging an arbitrary hypothesis. In the fictional example, a detective proposes that the investigation of an unsolved murder should begin by investigating whether a particular, randomly chosen citizen was in fact the murderer. Towards the end, this is likened to the presumption that one particular religion, rather than any of the other existing or even merely possible religions, is especially worth investigating.
However, in between the fictional and the supernatural illustrations of the fallacy, we have something more empirical: quantum mechanics. Eliezer writes that the many-worlds interpretation is the one - the rationally favored interpretation, the picture of reality which rationally should be adopted given the empirical success of quantum theory. He has said this before, and I have argued against it before, back when this site was just part of a blog. This site is about rationality, not physics, and the quantum case is not essential to the exposition of this fallacy; but given the regularity with which many-worlds metaphysics shows up in discussion here, perhaps it is worth presenting a case for the opposition.
We can do this the easy way, or the hard way. The easy way is to argue that many-worlds is merely not favored, because we are nowhere near being able to locate our hypotheses in a way which permits a clean-cut judgment about their relative merits. The available hypotheses about the reality beneath quantum appearances are one and all unfinished muddles, and we should let their advocates get on with turning them into exact hypotheses without picking favorites first. (That is, if their advocates can be bothered turning them into exact hypotheses.)
The hard way is to argue that many-worlds is actually disfavored - that we can already say it is unlikely to be true. But let's take the easy path first, and see how things stand at the end.
The two examples of favoring an arbitrary hypothesis with which we have been provided - the murder investigation, the rivalry of religions - both present a situation in which the obvious hypotheses are homogeneous. They all have the form "Citizen X did it" or "Deity Y did it". It is easy to see that for particular values of X and Y, one is making an arbitrary selection from a large set of possibilities. This is not the case in quantum foundations. The well-known interpretations are extremely heterogeneous. There has not been much of an effort made to express them in a common framework - something necessary if we want to apply Occam's razor in the form of theoretical complexity - nor has there been much of an attempt to discern the full "space" of possible theories from which they have been drawn - something necessary if we really do wish to avoid privileging the hypotheses we happen to have. Part of the reason is, again, that many of the known options are somewhat underdeveloped as exact theories. They subsist partly on rhetoric and handwaving; they are mathematical vaporware. And it's hard to benchmark vaporware.
In his latest article, Eliezer presents the following argument:
"... there [is] no concrete evidence whatsoever that favors a collapse postulate or single-world quantum mechanics. But, said Scott, we might encounter future evidence in favor of single-world quantum mechanics, and many-worlds still has the open question of the Born probabilities... There must be a trillion better ways to answer the Born question without adding a collapse postulate..."
The basic wrong assumption being made is that quantum superposition by default equals multiplicity - that because the wavefunction in the double-slit experiment has two branches, one for each slit, there must be two of something there - and that a single-world interpretation has to add an extra postulate to this picture, such as a collapse process which removes one branch. But superposition-as-multiplicity really is just another hypothesis. When you use ordinary probabilities, you are not rationally obligated to believe that every outcome exists somewhere; and an electron wavefunction really may be describing a single object in a single state, rather than a multiplicity of them.
A quantum amplitude, being a complex number, is not an ordinary probability; it is, instead, a mysterious quantity from which usable probabilities are derived. Many-worlds says, "Let's view these amplitudes as realities, and try to derive the probabilities from them." But you can go the other way, and say, "Let's view these amplitudes as derived from the probabilities of a more fundamental theory." Mathematical results like Bell's theorem show that this will require a little imagination - you won't be able to derive quantum mechanics as an approximation to a 19th-century type of physics. But we have the imagination; we just need to use it in a disciplined way.
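The difference between amplitudes and ordinary probabilities can be made concrete with a toy two-slit calculation. This is an illustrative sketch with made-up amplitude values, not a model of any real experiment; the point is only that the quantum rule adds amplitudes before squaring, while ordinary probability adds the squared terms directly.

```python
# Toy two-slit calculation: each slit contributes a complex amplitude
# to the same detector point. Values are illustrative, not physical.
a1 = 0.5 + 0j   # amplitude via slit 1
a2 = -0.5 + 0j  # amplitude via slit 2, opposite phase (e^{i*pi} = -1)

# Born rule: add the amplitudes first, then take the squared magnitude.
p_quantum = abs(a1 + a2) ** 2

# Ordinary probability: square each contribution, then add.
p_classical = abs(a1) ** 2 + abs(a2) ** 2

print(p_quantum)    # 0.0 -- complete destructive interference
print(p_classical)  # 0.5 -- no interference term
```

The interference term (here fully destructive) is what makes amplitudes behave unlike probabilities, and it is this arithmetic that a single-world program of the kind just described would have to recover from some more fundamental probabilistic theory.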
So that's the kernel of the argument that many worlds is not favored: the hypotheses under consideration are still too much of a mess to even be commensurable, and the informal argument for many worlds, quoted above, simply presupposes a multiplicity interpretation of quantum superposition. How about the argument that many worlds is actually disfavored? That would become a genuinely technical discussion, and when pressed, I would ultimately not insist upon it. We don't know enough about the theory-space yet. Single-world thinking looks more fruitful to me, when it comes to sub-quantum theory-building, but there are versions of many-worlds which I do occasionally like to think about. So the verdict for now has to be: not proven; and meanwhile, let a hundred schools of thought contend.
John Cramer and the transactional interpretation are by far the most prominent example. Wheeler-Feynman absorber theory was the historical precursor; also see "Feynman checkerboard". I mentioned Mark Hadley. See Aharonov-Vaidman for the "two state vector" version of QM, which is in the same territory. Costa de Beauregard was another physicist with ideas in this direction.
This paper is just one place where a potentially significant fact is mentioned, namely that quantum field theory with an imaginary time coordinate (also called "Euclidean field theory" because the metric thereby becomes Euclidean rather than Riemannian) resembles the statistical mechanics of a classical field theory in one higher dimension. See the remark about how "the quantum mechanical amplitude" takes on "the form of a Boltzmann probability weight". A number of calculations in quantum field theory and quantum gravity actually use Euclideanized metrics, but just because the integrals are easier to solve there; then you do an analytic continuation back to Minkowski space and real-valued time. The holy grail for this interpretation, as far as I am concerned, would be to start with Boltzmann and derive quantum amplitudes, because it would mean that you really had justified quantum mechanics as an odd specialization of standard probability theory. But this hasn't been done and perhaps it can't be done.
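The resemblance can be sketched with the standard Wick rotation; this is the textbook form, with S_E the usual Euclidean action, not anything specific to that paper's conventions:

```latex
% Real-time path-integral weight:
Z \;=\; \int \mathcal{D}\phi \, e^{iS[\phi]/\hbar}
% Substitute imaginary time, t = -i\tau (so that S = i S_E):
Z_E \;=\; \int \mathcal{D}\phi \, e^{-S_E[\phi]/\hbar}
% Compare the classical Boltzmann sum, with \hbar in the role of 1/\beta:
Z_{\mathrm{cl}} \;=\; \sum_{\mathrm{configs}} e^{-\beta E}
```

Running this from the bottom line up - starting with a Boltzmann distribution and recovering complex amplitudes - is the unproven step described above as the holy grail.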
I think that you mean "Euclidean rather than Minkowskian", not "Euclidean rather than Riemannian". Euclidean vs Riemannian has to do with whether spacetime is curved (Euclidean no, Riemannian yes), while Euclidean vs Minkowskian has to do with whether the metric treats the time coordinate differently (Euclidean no, Minkowskian yes). (And then the spacetime of classical general relativity, which answers both questions yes, is Lorentzian.)