Permutation City is an awesome novel that was written in 1994. Even though the author, Greg Egan, used a caricature of this community as a bad guy in a more recent novel, his work is still a major influence on a lot of people around these parts who have read it. It dissolves so many questions around uploading and simulation that it's hard for someone who has read the book to talk about simulationist metaphysics without wanting to reference the novel... but doing that runs into constraints imposed by spoiler etiquette.
So go read Permutation City if you haven't read it already because it's philosophically important and a reasonably fun read.
In the meantime, if you haven't already, you should also read A Fire Upon The Deep by Vernor Vinge (of "singularity"-coining fame), and then read Eliezer's fan fic The Finale of the Ultimate Meta Mega Crossover, which references both works in interesting ways to make substantive philosophical points and doesn't take too long to read.
In the comments below there will be discussion that has spoilers for all three works.
I appreciate that you chose to "raise the bar" here.
I agree that when we're seeking to 'interpret' a theory like QFT, which is Lorentz-invariant, we ought to postulate an ontology which respects this symmetry. However, from a certain perspective, I think it's obvious that it must be possible to paint a mathematical picture of a Lorentz-invariant "many worlds type" theory.
Let's assume that somehow or other, it's possible to develop QFT from axiomatic foundations, and in such a way that when we take the appropriate low-velocity, low-energy limit, we recover "wavefunctions" and the Schrödinger equation exactly as they appear in QM. As far as I know, QFT and (non-relativistic) QM are, broadly speaking, cut from the same cloth: Both of them make predictions through a process of adding up quantum amplitudes for various possibilities and then interpreting the square-norms as probabilities. Neither of them stipulates that there is a fact of the matter about which slit the electron went through, unless you augment them with 'hidden variables'. Neither of them can define what counts as a "measurement". In both theories, the only strictly correct way to compute the probabilities of the results of a second measurement, in advance of a first, is to do a calculation that takes into account all of the possible ways the first measuring device might 'interfere' with the stuff being measured. In practice we don't need to do this - in any remotely reasonable experiment, when some of the degrees of freedom become entangled with a macroscopic measuring device, we can treat them as having "collapsed" and assumed determinate values. But in principle, there's nothing in QM or QFT to rule out macroscopic superpositions (e.g. you can do a "two-slit experiment" with people rather than electrons).
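To make the 'entanglement does the work of collapse' point concrete, here's a toy calculation - my own sketch in Python, using nothing beyond textbook QM: when the two slit amplitudes add coherently you get interference fringes, but once a perfect which-path detector is entangled with the particle, summing the square-norms over the detector's orthogonal final states leaves only the incoherent sum, and the fringes vanish.

```python
# Toy two-slit calculation: coherent vs. which-path-detected amplitudes.
# The phases stand in for path-length differences at points on the screen.
import numpy as np

phases = np.linspace(0, 2 * np.pi, 9)

amp_top = 0.5 * np.exp(1j * 0.0)       # amplitude via the top slit
amp_bot = 0.5 * np.exp(1j * phases)    # amplitude via the bottom slit

# No detector: amplitudes add first, then we square -- interference fringes.
p_no_detector = np.abs(amp_top + amp_bot) ** 2   # oscillates between 0 and 1

# Perfect which-path detector: the joint state is
# |top>|D_top> + |bot>|D_bot> with <D_top|D_bot> = 0, so summing
# square-norms over the detector's states kills the cross term.
p_with_detector = np.abs(amp_top) ** 2 + np.abs(amp_bot) ** 2  # flat 0.5

print(p_no_detector)
print(p_with_detector)
```

Notice that no 'collapse' postulate appears anywhere in that calculation; the entanglement alone washes out the interference.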
The reason I'm pointing all of these things out is to motivate the following claim: A 'no collapse, no hidden variable' interpretation of QFT is every bit as natural as a 'no collapse, no hidden variable' interpretation of QM. By 'natural' I mean that unless you deliberately 'add something' to the mathematics, you won't get collapses or hidden variables. (The real problem for Everettians is that (prima facie) you won't get Born probabilities either! But we can talk about that later.)
Next, I claim that a 'no collapse, no hidden variable' theory (taken together with the metaphysical assumption that 'the entities and processes described are real, not merely instruments for computing probabilities') is obviously a 'many worlds' theory. This is because it implies that the man over there, listening to a Geiger counter, is constantly splitting into superpositions. Although his superposed selves are overwhelmingly likely to carry on their lives independently of each other, there's no limit to how many of them we may need to take account of in order to get our predictions right.
Finally, since the predictions of QFT are Lorentz-invariant, if it can be given a mathematical foundation at all then there must be some way to give it a Lorentz-invariant mathematical foundation.
Putting all of this together, I have my claim.
I'm "cheating", you'll say, because I haven't done any hard work - I haven't told you how one can have a Lorentz-invariant ontology of things resembling "wavefunctions". Regrettably, I'm not able to do that - if I could, I would - but for present purposes I don't think it's necessary.
Personally I'd just say that That Which Exists is whatever it is that, when supplemented with co-ordinates and a position basis, and viewed in the nonrelativistic limit, looks like a wavefunction. I don't know whether that has to take the form of a 'universal wavefunction'. (By the way, I don't think the position basis is an 'arbitrary choice' in the same way that a foliation of Minkowski space is arbitrary. This is because of the analogy with classical mechanics, where the elements of a basis correspond to points in a phase space, and changing basis is like changing co-ordinates in the phase space. But a phase space is a symplectic manifold, not an unstructured set. I'm guessing that in quantum theory too there must be some extra structure in the Hilbert space which implies that some bases (or more generally, some Hermitian operators) are "physical" and others not.)
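To spell out the analogy I'm gesturing at (this gloss is mine, and the quantum half of it is just the standard canonical-quantization story, not something I can derive): the classical phase space is not a bare set but carries a symplectic form, and the usual quantum counterpart of that structure is the canonical commutation relations, which already single out the position-momentum pairs as special:

```latex
% Classical phase space carries a symplectic form pairing positions
% with momenta -- the 'extra structure' beyond a bare set of points.
\omega = \sum_i \mathrm{d}p_i \wedge \mathrm{d}q_i

% The standard quantum counterpart: canonical commutation relations,
% which distinguish conjugate pairs over and above the bare Hilbert space.
[\hat{q}_i, \hat{p}_j] = i\hbar\,\delta_{ij}
```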
Anyway, I don't really have any idea what a maximally elegant mathematical presentation of the Underlying Reality would look like. I just think it's misleading to use the words "MWI is inconsistent with special relativity" when what you actually mean is "no-one has yet formulated an axiomatic presentation of QFT". Because the very moment we have the latter, we will (immediately, effortlessly) have a version of MWI that is consistent with SR, simply by making the same interpretative 'moves' that Everett made. (This is a point which the MWI FAQ tries to drive home.) And if we cannot put QFT on firm mathematical foundations, then all metaphysically realistic interpretations will suffer equally badly.
Finally, let's switch briefly to the outside view. I've never before seen a critique of MWI made along the lines that it presupposes a notion of absolute simultaneity. Now that could be because I just haven't been looking hard enough (though actually I have read a fair few attacks on MWI), or because I haven't understood what I've been reading (though I think I have, at least in outline), or because almost everyone who writes about this is distracted by whatever agendas and pet theories they have, and missing the more 'obvious' line of attack right under their noses; but I think it's much more likely that you've misunderstood the relation between MWI and the QM-style wavefunction, which isn't as close as you think.
Actually, the overall impression I had prior to this conversation with you is that compatibility with SR is one of MWI's greatest strengths (especially when compared with Bohm's theory).
Let's move on.
I'd have to do a lot of reading before I could answer this properly. From what I've seen so far, the critical concepts seem to be einselection and decoherence. The universe can be thought of as a collection of interacting 'systems' (such that the Hilbert space of the whole universe is the tensor product of the Hilbert spaces corresponding to each system). When a system interacts with its environment, its reduced density matrix changes exactly as if the environment were performing a 'measurement' on it. However, the environment is much more likely to perform certain 'measurements' than others. Those states which are sufficiently near to being eigenstates of a 'typical measurement' thus have a degree of stability not shared by an arbitrary superposition of such states. In this way, the environment selects a so-called "measurement basis" (which consists of what we can call "classical states").
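The simplest toy model of this that I can write down (my own minimal sketch in Python; real einselection involves a large environment and repeated interactions, not a single qubit) is: let one environment qubit 'measure' a system qubit via a CNOT, and watch the system's reduced density matrix lose its off-diagonal terms in the pointer basis.

```python
# Toy decoherence: an environment qubit "measures" a system qubit.
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)          # system: (|0> + |1>)/sqrt(2)
env = np.array([1, 0])                        # environment starts in |0>
joint = np.kron(plus, env)                    # product state, dimension 4

# CNOT with the system as control: the environment copies the pointer state.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
joint = cnot @ joint                          # (|00> + |11>)/sqrt(2): entangled

# Reduced density matrix of the system: partial trace over the environment.
rho = np.outer(joint, joint.conj()).reshape(2, 2, 2, 2)  # (s, e, s', e')
rho_sys = np.trace(rho, axis1=1, axis2=3)

print(rho_sys)
# [[0.5 0. ]
#  [0.  0.5]] -- the coherences are gone: locally, a classical mixture
```

The coherences don't vanish from the universe, of course - they've merely been exported into system-environment correlations, which is the whole point of decoherence.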
A 'world' is just a component of the universe's wavefunction, when decomposed with respect to this "measurement basis". As I understand it, this notion of 'world' coincides with (or at least is closely related to) the thermodynamic concept of a 'macrostate'. In particular, one can no more (and no less) give a mathematically rigorous definition of 'world' than one can of 'macrostate'. This is just an assumption on my part, but I don't think the "measurement basis" is strictly speaking a basis - it's more like a decomposition of the Hilbert space as a direct sum of subspaces which are themselves fairly large. The indeterminacy ("approximateness") in the notion of world arises from the indeterminacy of what counts as a 'typical' interaction between systems. There may be no "fact of the matter" about whether or not two subspaces ought to be 'merged together' (i.e. whether or not a particular property somewhere between the 'micro' and 'macro' scales deserves to be called 'macroscopic enough' to make the difference between two macrostates).
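On that reading, 'taking a component' is literally just projecting the state onto one of the subspaces, along these lines (a toy sketch; the 2-dimensional blocks are chosen by hand, precisely because - as I said - there's no canonical way to choose them):

```python
# Toy "worlds as components": project a state onto orthogonal subspaces
# (standing in for macrostates) and read off each component's weight.
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=6) + 1j * rng.normal(size=6)
psi /= np.linalg.norm(psi)                    # normalised 'universal' state

# Carve the 6-dimensional space into three 2-dimensional blocks by hand.
# The arbitrariness of this carving is exactly the indeterminacy above.
for k, (lo, hi) in enumerate([(0, 2), (2, 4), (4, 6)]):
    weight = np.linalg.norm(psi[lo:hi]) ** 2  # squared norm of the projection
    print(f"world {k}: weight = {weight:.3f}")

# Because the blocks are orthogonal and exhaustive, the weights sum to 1.
```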
Possibly I understand your outlook better now. For you, worlds are like hills in a landscape. You're on one of them, there's a local maximum, but there's no need to demarcate exactly where one hill ends and another begins. What is real is the landscape.
You don't experience a whole world, just a small part of one, so you are not epistemically compelled to think of whole worlds as sharply bounded entities. However, if you consider your own place in this ontology, you're not just near a local maximum or on a local maximum, you are a local maximum.