In the first half of the 14th century, the Franciscan friar and logician William of Occam proposed a heuristic for deciding between alternative explanations of physical observables. As William put it: "Entities should not be multiplied without necessity". Or, as a reformulation often attributed to Einstein put it 600 years later: "Everything should be made as simple as possible, but not simpler".
Occam's Razor, as it became known, was enthusiastically adopted by the scientific community and remains the unquestioned criterion for deciding between alternative hypotheses to this day. In my opinion, its success is traceable to two characteristics:
o Utility: OR is not a logical deduction. Neither is it a statement about which hypothesis is most likely. Instead, it is a procedure for selecting the theory which makes further work as easy as possible. And by facilitating work, we can usually advance further and faster.
o Combinability: OR is fully compatible with each of the epistemological stances which have been adopted within science from time to time (empiricism, rationalism, positivism, falsifiability, etc.)
It is remarkable that such a widely applied principle is exercised with so little thought to its interpretation. I thought of this recently upon reading an article claiming that the multiverse interpretation of quantum mechanics is appealing because it is so simple. Really?? The multiverse explanation proposes the creation of an infinitude of new universes at every instant. To me, that makes it an egregiously complex hypothesis. But if someone decides that it is simple, I have no basis for refutation, since the notion of what it means for a theory to be simple has never been specified.
What do we mean when we call something simple? My naive notion is to begin by counting parts and features. A milling device made up of two stones, one stationary and one mobile, fitted with a stick for rotation by hand becomes more complex when we add devices to capture and transmit water power for setting the stone in motion. And my mobile phone becomes more complex each time I add a new app. But these notions don't serve to answer the question of whether Lagrange's formulation of classical mechanics, based on action, is simpler than the equivalent formulation by Newton, based on his three laws of forces.
Isn't it remarkable that scientists, so renowned for their exactitude, have been relying heavily on so vague a principle for 700 years?
Can we do anything to make it more precise?
That's not how algorithmic information theory works. The output tape is not a factor in the complexity of a program; only the length of the program counts.
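The point can be illustrated concretely. True Kolmogorov complexity is uncomputable, but compressed size is a standard computable stand-in for "length of the shortest program that produces this output". The sketch below (my own illustration, using Python's zlib as the proxy compressor; nothing here comes from the original discussion) shows that a vast but regular output has a tiny description, while a same-sized noisy output does not:

```python
import random
import zlib

# A huge but trivially describable output: a million zero bytes.
# Its "program" is essentially "print '0' a million times".
regular = b"0" * 1_000_000

# A same-sized pseudo-random output (fixed seed for reproducibility).
random.seed(42)
noisy = bytes(random.randrange(256) for _ in range(1_000_000))

# Compressed size approximates description length, not output length.
len_regular = len(zlib.compress(regular))
len_noisy = len(zlib.compress(noisy))

print(f"regular: {len_regular} bytes compressed")
print(f"noisy:   {len_noisy} bytes compressed")
```

Both outputs are a million bytes long, yet the regular one compresses to roughly a kilobyte while the noisy one barely shrinks at all: on this measure, the size of what a theory generates says nothing about the size of the theory itself.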
The size of the universe is not a postulate of QFT or of General Relativity. One could derive what a universe containing only two particles would look like using QFT or GR. It's not a fault of the theory that the universe actually contains ~10^80 particles†.
People used to think the solar system was the extent of the universe. Just over a century ago, the Milky Way Galaxy was thought to be the extent of the universe. Then it grew by a factor of over 100 billion when we found that there were that many galaxies. That doesn't mean that our theories got 100 billion times more complex.
If you take the Many Worlds interpretation and decide to follow the perspective of a single particle as though it were special, Copenhagen is what falls out. You're left having to explain what makes that perspective so special.
† Now we know that the observable universe may be only a tiny fraction of the universe at large, which may be infinite. In fact, there are several different types of multiverse that could exist simultaneously.