So I've been trying to read the Quantum Physics sequence. I think I've understood about half of it; I've been rushed, and haven't really sat down and worked through the math. So I apologize in advance for any mistakes I make here.
It seems like classical mechanics with discretized time is really easy to simulate on a computer: every step, you just calculate the forces, update the velocities from the accelerations, then update the positions from the new velocities (see the sketch below).
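For concreteness, something like this is what I have in mind. It's just a minimal sketch; the pairwise-gravity force law and the fixed timestep are placeholders I picked for illustration:

```python
import numpy as np

def step(positions, velocities, masses, dt):
    """One discrete timestep of classical point-particle dynamics
    (semi-implicit Euler): force -> velocity -> position."""
    forces = compute_forces(positions, masses)
    velocities = velocities + dt * forces / masses[:, None]
    positions = positions + dt * velocities
    return positions, velocities

def compute_forces(positions, masses, G=1.0):
    """Pairwise Newtonian gravity, as a stand-in force law."""
    n = len(positions)
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = positions[j] - positions[i]
            dist = np.linalg.norm(r)
            forces[i] += G * masses[i] * masses[j] * r / dist**3
    return forces

# e.g. two bodies orbiting in the z = 0 plane
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.array([[0.0, -0.5, 0.0], [0.0, 0.5, 0.0]])
m = np.array([1.0, 1.0])
pos, vel = step(pos, vel, m, dt=0.01)
```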
Then when you move to relativity, it suddenly seems a lot harder to implement. Whereas classical mechanics falls out of a simple update loop almost for free, it seems to me that you would have to build the relativistic outcomes into the simulation explicitly, while the classical outcomes emerge implicitly from the update rule.
The same thing seems to happen, only more so, with quantum physics. Continuous wave functions seem far harder to handle than discrete particles (a rough comparison of the storage involved is sketched below). Similarly, the whole business of "no individual particle identity" seems more difficult, although as I think about it now, I suppose that could be because the whole concept of particles is naive.
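To make the gap concrete, here is a back-of-the-envelope comparison (the choice of two-level systems and the particle counts are just my own illustration): a classical state of N particles is a handful of numbers per particle, while the joint wave function of N two-level systems needs 2^N amplitudes.

```python
def classical_state_size(n_particles, dims=3):
    """Numbers needed to specify positions and velocities classically."""
    return 2 * dims * n_particles

def quantum_state_size(n_particles):
    """Complex amplitudes needed for the joint wave function of
    n two-level systems (qubits): one per basis state."""
    return 2 ** n_particles

for n in (10, 30, 50):
    print(n, classical_state_size(n), quantum_state_size(n))
# 10 ->  60 numbers vs           1,024 amplitudes
# 30 -> 180 numbers vs   1,073,741,824 amplitudes
# 50 -> 300 numbers vs        ~1.1e15 amplitudes
```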
It doesn't seem like the computational rules must simply get harder as we learn more physics, though. After all, thermal physics got a lot easier to compute once we started using the ideal gas model.
Also, it's not as though ever-improving theories must be ever more difficult to implement on a computer. Imagine that we lived inside Conway's Game of Life. We would figure out all kinds of high-level physics, which would probably be far more complex than the underlying B3/S23 rule we would eventually discover (a rule simple enough to write down in a few lines; see the sketch below).
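To show just how little "fundamental physics" there is in that rule, here is a minimal sketch of one Game of Life update (the grid representation and the wrap-around boundary are my own choices):

```python
def life_step(grid):
    """One B3/S23 update: a dead cell with exactly 3 live neighbours is
    born; a live cell with 2 or 3 live neighbours survives."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c] == 1:
                new[r][c] = 1 if neighbours in (2, 3) else 0
            else:
                new[r][c] = 1 if neighbours == 3 else 0
    return new
```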
It feels like which physics actually happens to be implemented shouldn't much affect how computation works in general. After all, we live in a quantum universe, and classical physics is still the simpler one to compute.
Is there any value to this speculation?
There are a variety of ways of measuring how complex something is. Three of them seem relevant here. The first: given a sequence a_i of integers, what is the smallest classical Turing machine which, on input i, outputs a_i? This is really tough in general, and quickly leads to undecidability issues. The related question "what is the size of the smallest Turing machine which outputs a string S" defines the Kolmogorov complexity of S, and it is uncomputable in the general case. One can instead look at an empirical version and ask about the smallest machine known so far, but as I understand it this turns out not to be that useful.
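As a toy illustration of the empirical flavour of this (my own example, with no real theoretical standing): the length of a compressed encoding gives a crude upper bound on a string's description length, without any claim to being the true minimum.

```python
import os
import zlib

def description_upper_bound(data: bytes) -> int:
    """Length of a zlib-compressed encoding: a loose upper bound on the
    description length of `data`, nothing like the true Kolmogorov complexity."""
    return len(zlib.compress(data))

print(description_upper_bound(b"ab" * 500))       # highly regular: compresses to a handful of bytes
print(description_upper_bound(os.urandom(1000)))  # near-random: doesn't compress (slightly longer, in fact)
```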
More useful measures look not at the Turing machine's size but at how much tape space or how much time it uses. You may have heard of this sort of thing in the context of P ?= NP. In this context, the set of decision problems which a quantum computer can solve reliably in polynomial time is BQP. We do have specific problems that seem to be in BQP but that do not appear to be solvable in polynomial time on a classical computer (factoring is the standard example). Although this has not been formally proven, recent work by Scott Aaronson leads one to suspect the stronger statement that BQP includes problems outside what is called the polynomial hierarchy, a much broader collection of classical complexity classes. If that is the case, then quantum computation must be very hard to simulate classically.
Note that when talking about BQP one is getting essentially a single bit of information out, so the computation can always be done with finite resources (in fact it lies in PSPACE, and is thus bounded by exponential time). If you want to predict more of the system's behavior than that, you need a lot more resources, and fully predicting even fairly basic behavior could in principle require unbounded resources.
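As a rough illustration of why extracting more than a single bit gets expensive (this is my own brute-force toy, not a claim about how real simulations are done): exact classical simulation keeps all 2^n amplitudes around, and even a single one-qubit gate has to touch every one of them.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 unitary `gate` to qubit `target` of an n-qubit state
    vector of length 2**n_qubits. Every amplitude gets visited."""
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    # tensordot leaves the target axis at the front; move it back into place
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 20                                   # already over a million amplitudes
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                           # |00...0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_single_qubit_gate(state, hadamard, target=0, n_qubits=n)
```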