My understanding is that pilot wave theory (i.e., Bohmian mechanics) explains all of quantum physics with no weirdness like "superposition collapse" or "every particle interaction creates n parallel universes which never physically interfere with each other". It is not fully "local", but who cares?
Is there any reason at all to expect some kind of multiverse? Why is the multiverse idea still heavily referenced (e.g., in acausal trade posts)?
Edit April 11: I challenge the properly physics-brained people here (I'm just a Q poster myself) to prove my guess wrong: can you get the Born rule with clean hands this way?
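For concreteness, here is my understanding of the standard equivariance argument (a sketch; check the details yourself). It shows why an ensemble that starts at psi^2 stays at psi^2, which is exactly why the interesting question is whether other starting distributions relax toward it:

```latex
% Pilot-wave guidance and equivariance (sketch; hbar and m kept explicit).
% Writing \psi = R e^{iS/\hbar}, each particle moves with velocity
\[
  v(x,t) = \frac{\nabla S}{m} = \frac{\hbar}{m}\,\operatorname{Im}\frac{\nabla\psi}{\psi}.
\]
% The Schrodinger equation implies a continuity equation for |\psi|^2, and an
% ensemble density \rho carried along by the same v obeys the same equation:
\[
  \partial_t |\psi|^2 + \nabla\cdot\big(|\psi|^2\, v\big) = 0,
  \qquad
  \partial_t \rho + \nabla\cdot(\rho\, v) = 0.
\]
% Hence \rho = |\psi|^2 at one time implies \rho = |\psi|^2 at all times
% (equivariance). Getting the Born rule "with clean hands" means showing
% that \rho \neq |\psi|^2 also evolves toward |\psi|^2.
```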
They also implicitly claim that in order for the Born rule to work [under pilot wave], the particles have to start the sim following the psi^2 distribution. I think this is just false, and that e.g. a wide normal distribution will converge to psi^2 over time as the system evolves (for a non-adversarially-chosen system). I don't know how to check this analytically. Has someone checked this? Am I looking at this right?
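Here's roughly how I'd check it numerically. This is a minimal sketch: the box model, the initial Gaussian, and all the numerical parameters are arbitrary choices of mine (and I believe, though haven't verified, that this kind of "quantum relaxation" simulation is what Valentini and collaborators have published on). Guide an ensemble that starts away from psi^2 and watch whether its coarse-grained histogram drifts toward psi^2:

```python
import numpy as np

# Sketch: does a non-psi^2 ensemble relax toward |psi|^2 under Bohmian guidance?
# Model: particle in a 1D box [0, 1], hbar = m = 1, psi = equal-weight
# superposition of the first M energy eigenstates with random phases.
M = 8
rng = np.random.default_rng(0)
phases = rng.uniform(0.0, 2.0 * np.pi, M)
energies = np.array([(n * np.pi) ** 2 / 2.0 for n in range(1, M + 1)])

def psi(x, t):
    """Superposition wavefunction at positions x and time t."""
    total = np.zeros_like(x, dtype=complex)
    for n in range(1, M + 1):
        mode = np.sqrt(2.0) * np.sin(n * np.pi * x)
        total += mode * np.exp(1j * (phases[n - 1] - energies[n - 1] * t))
    return total / np.sqrt(M)

def velocity(x, t, eps=1e-5):
    """Bohmian guidance velocity v = Im(psi'/psi) via central differences."""
    dpsi = (psi(x + eps, t) - psi(x - eps, t)) / (2.0 * eps)
    return np.imag(dpsi / psi(x, t))

# Non-equilibrium initial ensemble: a narrow Gaussian, deliberately not psi^2.
N = 20_000
x = np.clip(rng.normal(0.3, 0.05, N), 1e-3, 1.0 - 1e-3)

# Midpoint integration. Velocities blow up near nodes of psi, so a serious
# check needs adaptive stepping there; this crude version just clips to the box.
dt, T = 5e-4, 1.0
for step in range(int(T / dt)):
    t = step * dt
    xm = np.clip(x + 0.5 * dt * velocity(x, t), 1e-3, 1.0 - 1e-3)
    x = np.clip(x + dt * velocity(xm, t + 0.5 * dt), 1e-3, 1.0 - 1e-3)

# Compare the coarse-grained ensemble density with |psi|^2 at the final time.
bins = np.linspace(0.0, 1.0, 41)
hist, _ = np.histogram(x, bins=bins, density=True)
centers = 0.5 * (bins[:-1] + bins[1:])
print("mean |rho - psi^2|:", np.mean(np.abs(hist - np.abs(psi(centers, T)) ** 2)))
```

If the printed distance shrinks as T and the number of modes grow, that's evidence for my guess; if it plateaus well above zero for generic phases, I'm wrong.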
Edit April 9: Well, pilot wave vs. many worlds turns out to be a holy-war topic. People have pointed out excellent non-holy-war material:
- Perhaps an infinite universe alone gives you the same philosophical conclusions/feels as many worlds? Who has already thought that idea through?
- Some of the stuff Wikipedia mentions relating to the "many universes, different constants" idea (level 2 here) sounds like it might actually have a little rigor. How would I tell? (These things are optimized by the publishing system to sound like they have rigor.)
The amount of calculation isn't the concern here so much as the number of bits used to specify that calculation. There's no law forcing the two theories' programs to be the same length: Copenhagen can simply spend bits on rules that MWI doesn't need at all.
In particular, I mentioned earlier that Copenhagen has to have rules for when measurements occur and what basis they occur in. Where does MWI incur a similar cost? What does MWI have to specify that Copenhagen doesn't, using up the same number of bits of source code?
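To make the asymmetry concrete, here's a toy sketch (the function names and structure are mine, and line count is only a crude stand-in for real description length). Both simulators share the same unitary core; only Copenhagen carries extra parameters saying when to collapse and in which basis:

```python
import numpy as np
from scipy.linalg import expm

def evolve(state, hamiltonian, dt):
    """One shared unitary step: state -> exp(-i H dt) @ state (hbar = 1)."""
    return expm(-1j * hamiltonian * dt) @ state

def run_mwi(state, hamiltonian, dt, steps):
    """MWI: the whole program is repeated unitary evolution. All 'branches'
    live inside the state vector; nothing more needs to be specified."""
    for _ in range(steps):
        state = evolve(state, hamiltonian, dt)
    return state

def run_copenhagen(state, hamiltonian, dt, steps, measure_at, basis, rng):
    """Copenhagen: the same unitary core PLUS a collapse rule. The extra
    parameters (measure_at: when; basis: in what basis, as orthonormal
    columns) are the extra bits the theory pays for; run_mwi never
    mentions them."""
    for step in range(steps):
        state = evolve(state, hamiltonian, dt)
        if step in measure_at:
            amps = basis.conj().T @ state        # amplitudes in that basis
            probs = np.abs(amps) ** 2            # Born rule
            outcome = rng.choice(len(probs), p=probs / probs.sum())
            state = basis[:, outcome].copy()     # collapse onto the outcome
    return state
```

The point is structural: delete the collapse branch and run_copenhagen becomes run_mwi, but there is nothing you can delete from run_mwi to recover the collapse rule for free.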
Like, yes, an expected-value-maximizing agent with a utility function similar to ours might have to do some computations that involve identifying worlds, but the complexity of the utility function doesn't count against the complexity of any particular physical theory. And an expected-value maximizer is naturally going to try to identify its zone of influence, which will look like a particular subset of worlds in MWI. But that happens automatically exactly because the thing is an EV-maximizer, not because the laws of physics incurred extra complexity in order to single out worlds.