So I've been trying to read the Quantum Physics sequence. I think I've understood about half of it; I've been rushed, and haven't really sat down and worked through the math. So I apologize in advance for any mistakes I make here.
It seems like classical mechanics with quantized time is really easy to simulate with a computer: every step, you calculate the force, update the velocity from the force, then update the position from the velocity.
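The loop above can be sketched in a few lines. This is a minimal illustration, not a serious integrator; the mass-on-a-spring force law and all the parameter values are arbitrary choices for the example:

```python
# Classical mechanics with quantized time: semi-implicit Euler integration
# of a mass on a spring (illustrative choice of system).

def step(x, v, dt, k=1.0, m=1.0):
    """Advance one time step: force -> velocity -> position."""
    f = -k * x            # Hooke's-law force (example force law)
    v = v + (f / m) * dt  # update velocity from the force
    x = x + v * dt        # update position from the new velocity
    return x, v

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = step(x, v, dt=0.01)
```

Updating velocity before position (rather than the other way around) keeps the energy from drifting over long runs, but the "force, then velocity, then position" structure is the same either way.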
Then when you change to relativity, it seems like it's suddenly a lot harder to implement. Whereas classical mechanics is easy on a computer, it seems to me that you would have to set up a system where the outcomes of relativity are explicitly stated, while the classical outcomes are implicit.
The same thing seems to occur, except more so, with quantum physics. Continuous wave functions seem far harder to handle than discrete particles. Similarly, the whole thing about "no individual particle identity" seems more difficult, although as I think of it now, I suppose this could be because the whole concept of particles is naive.
It doesn't seem like the computation rules simply get harder as we learn more physics. After all, trying to do thermal physics got a lot easier when we started using the ideal gas model.
Also, it's not the case that ever-improving theories must be ever more difficult to implement on a computer. Imagine that we lived inside Conway's Game of Life. We would figure out all kinds of high-level physics, which would probably be way more complex than the B3/S23 rule we would eventually discover.
It feels like the actual implemented physics shouldn't much affect how computation works. After all, we live in a quantum universe and classical physics is still simpler to compute.
Is there any value to this speculation?
I presume that you refer to the difference between point particles and fields. The wave function in QM is essentially a classical field, where for every point in space you have a set of numbers describing the quantum state. This state evolves in time according to the time-dependent Schrodinger equation, until "something classical" happens, at which point you have to throw dice to pick one of the preferred states (also known as the wave-function collapse, or the MWI world split).
The part of QM which is just the evolution of the Schrodinger equation is computationally similar to modeling diffusion or heat flow: the equation structure is very similar, with complex numbers instead of reals. The "measurement" part (calculating an outcome) is nearly trivial: just pick one of the outcomes (or one of the worlds, if you are a fan of MWI) according to the Born rule.
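Both parts can be sketched in a few lines. This is a hedged illustration, not production numerics: a free particle on a 1-D grid with hbar = m = 1, a Gaussian initial state, and grid/step sizes picked arbitrarily. Crank-Nicolson is one standard way to keep the evolution unitary; the "measurement" at the end is a single draw from the Born-rule distribution |psi|^2:

```python
# (1) Evolve a 1-D wave function under the time-dependent Schrodinger equation.
# (2) "Measure" by sampling one grid point from the Born-rule probabilities.
import numpy as np

n, dx, dt = 200, 0.1, 0.005
x = np.arange(n) * dx
psi = np.exp(-((x - 10.0) ** 2) / 2.0).astype(complex)  # Gaussian packet
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)           # normalize

# Hamiltonian H = -(1/2) d^2/dx^2 via a finite-difference Laplacian
lap = (np.diag(np.full(n - 1, 1.0), -1) - 2.0 * np.eye(n)
       + np.diag(np.full(n - 1, 1.0), 1)) / dx ** 2
H = -0.5 * lap

# Crank-Nicolson step: (1 + i H dt/2) psi_new = (1 - i H dt/2) psi_old
A = np.eye(n) + 0.5j * dt * H
B = np.eye(n) - 0.5j * dt * H
for _ in range(100):
    psi = np.linalg.solve(A, B @ psi)

# Born rule: pick one outcome with probability |psi|^2
p = np.abs(psi) ** 2
p /= p.sum()
outcome = np.random.choice(n, p=p)
```

Note how closely the structure mirrors a diffusion solver: swap the complex `1j * dt` for a real time step and the same code integrates the heat equation.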
While it is true that there are many shortcuts to solving the Schrodinger equation numerically, and a set of these shortcuts is what tends to be studied in most QM courses, there is no substitute for numerical evolution in the general case.
Quantum computing is a rather different beast from the regular quantum physics, just like classical computing is different from classical physics: computing is an abstraction level on top, it lets you think only about the parts of the problem that are relevant, and not worry about the underlying details. Quantum computing is no more about quantum mechanics than the classical computer science is about the physics of silicon gates.
One final point. It is a general observation that understanding of any scientific topic is greatly enhanced by teaching it, and teaching it to the ultimate idiot savant, a computer, is bound to probe your understanding of the topic extensively. So, if you want to learn QM, get your hands dirty with one of the computational projects like this one, and if you want to learn quantum computing, write a simulation of Shor's algorithm or something similar.