This is why chemists tend to build their models up from physics and why biologists do the same with chemistry.
Chemistry via physics doesn't really work without quantum mechanics. This is why chemistry didn't exist until the last 150 years or so; everything before that was just alchemy. Am I getting this right?
And of course, the field has also been slowed down by the nature of calculating the wave function, which is intractable for anything but very simple systems. That's why biology couldn't exist until the invention of supercomputers in the 1970s enabled researchers to approximate the wave functions of organic molecules.
There seems to be a confusion between, and I'll borrow LW terminology here, epistemic reductionism and instrumental reductionism. If you reject the former, Daniel Dennett will jump out of the supplies closet and kick you in the face. Epistemic reductionism is physics students deriving classical equations from quantum mechanics as an exercise.
Instrumental reductionism, on the other hand, is only one tool for actually getting things done and, in practice, many situations involving large, complex systems are better tackled by simpler models that selectively ignore some or all of the "microfoundations" in favor of observing high-level patterns. It is nice, but not required, to be able to prove the accuracy of the high-level rules in terms of low-level laws.
Consider, for instance, Conway's Game of Life. If you have an otherwise empty field containing just a glider gun, do you need to model the state of the field by iterating the entire thing? No, just look at the period of the gun and the speed of the gliders and you can predict the state with much simpler calculations.
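To make the shortcut concrete, here is a minimal sketch. It relies on two well-known facts about the Gosper glider gun: it emits one glider every 30 generations, and a glider travels one cell diagonally every 4 generations (speed c/4). The emission offset used below is an illustrative assumption, not a property taken from any particular grid setup.

```python
# High-level model of a glider gun: instead of iterating the full Life
# grid, use the gun's period and the glider's speed directly.

GUN_PERIOD = 30    # generations between emitted gliders (Gosper gun)
GLIDER_SPEED = 4   # generations per diagonal cell (speed c/4)

def num_gliders(t, first_emission=15):
    """Gliders emitted by generation t.

    The first_emission offset is an assumption for illustration; the
    real value depends on how the gun's initial state is drawn.
    """
    if t < first_emission:
        return 0
    return (t - first_emission) // GUN_PERIOD + 1

def lead_glider_offset(t, first_emission=15):
    """Diagonal displacement (in cells) of the earliest glider at time t."""
    if t < first_emission:
        return 0
    return (t - first_emission) // GLIDER_SPEED

print(num_gliders(300))          # 10
print(lead_glider_offset(315))   # 75
```

Two integer divisions replace hundreds of full-grid updates, which is exactly the point: the high-level rules (period, speed) are all the model you need for this question.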
TL;DR version: Don't care about microfoundations. Care about tractability and accuracy. Just because a system can be reduced does not mean a reductionist analysis is useful.
I don't think you are disagreeing with me at all. You pretty much sum my point up with this:
Instrumental reductionism, on the other hand, is only one tool for actually getting things done and, in practice, many situations involving large, complex systems are better tackled by simpler models that selectively ignore some or all of the "microfoundations" in favor of observing high-level patterns. It is nice, but not required, to be able to prove the accuracy of the high-level rules in terms of low-level laws.
This sums it up even better:
TL;DR version: Don't care about microfoundations. Care about tractability and accuracy. Just because a system can be reduced does not mean a reductionist analysis is useful.
The only major thing I can think of that you might disagree with is that microfoundations tend to increase accuracy.
FWIW I wasn't talking about epistemic reductionism at all.
I don't think you are disagreeing with me at all.
I don't think so, either, except possibly to quibble about an analogy. Sorry if I wasn't clear on that. I was more attempting to discuss what seems to be the confusion you were responding to and make a more general point, that worrying about high-level models reducing to low-level models is potentially misguided if it's just reductionism for its own sake.
You realize, of course, that no-one's going to get to the TL;DR version unless they didn't think it was too long, and already read it? ;)
I sometimes skim lengthy posts. If something, particularly near the end, catches my eye I go back and reread the whole thing properly. Your mileage may vary.
Fair enough. I guess I was wondering whether it would work better to put it at the top. For some reason I got distracted and forgot to actually mention that in my previous comment. Sorry.
Reductionism isn't necessary for everyday use, but it is preferred because it can tell you when things might change. Trends and formulae derived from simpler, well-tested concepts are more stable.
Take, for example, the long-term trend of world economic growth. It is a good day-to-day expectation that the world economy is growing; however, if we could reduce it to a function of population growth, energy supply growth, and new technology, we would have a better idea of when we might stop getting the fairly reliable historic growth.
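A toy illustration of that point, with made-up numbers and a deliberately crude assumption that the component growth rates simply add up: the trend model predicts the same growth no matter what, while the reduced model flags a slowdown as soon as one input stalls.

```python
# Toy comparison: trend extrapolation vs. a "reduced" growth model.
# All rates are percent per year and entirely illustrative.

def trend_forecast(historic_rate):
    # The naive day-to-day expectation: tomorrow looks like yesterday.
    return historic_rate

def reduced_forecast(pop_growth, energy_growth, tech_growth):
    # Crude assumption for illustration: components add linearly.
    return pop_growth + energy_growth + tech_growth

print(trend_forecast(3.0))                # 3.0, regardless of inputs
print(reduced_forecast(1.0, 1.5, 0.5))    # 3.0, matches the historic trend
print(reduced_forecast(1.0, 0.0, 0.5))    # 1.5, energy stall predicts slowdown
```

The trend model can only ever say "more of the same"; the reduced model, even this crude one, tells you *which* input to watch.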
The main reason microeconomics does not [need microfoundations] is that it is built from pretty solid intuitions about how individuals act.
Compare and contrast:
Which is a better foundation for microeconomics?
If you replace von Mises' intuitions with the particular intuitions neoclassical economics is built from (to the extent that they differ), then it depends on the particular question you are trying to answer. Market activity is approximated reasonably well by the rationality assumption in a variety of cases. Kahneman and Tversky's evidence that humans are irrational is certainly strong, but in many cases trying to incorporate this reduces tractability to such an extent that it isn't worth it, or at least we don't know how to incorporate it. A good heuristic is to use rationality for long-run phenomena and, when possible, irrationality for the short run.
Even more so, Gary Becker proved in 1962 that you don't need rationality for many of the basic principles of microeconomics to hold. All you need is for each person to have a maximum budget--a noncontroversial assumption if there ever was one.
Many different kinds of non-utility-maximizing behavior and maximizing behavior across nonstandard preferences (sticky actions, bounded rationality, etc) still produce the key results.
in many cases trying to incorporate [irrationality] reduces tractability to such an extent that it isn't worth it, or at least we don't know how to incorporate it.
This is true, but it's also worth emphasising that in many cases, we do have reasonably tractable micro models that incorporate irrationality [ETA: I should instead have said nonstandard preferences; not all of these are necessarily irrational], and they do get used. (I'm not suggesting you disagree with this, I just don't want to give casual non-economist readers the impression that the discipline as a whole blithely ignores such things.)
I posted this comment in reply to a post by David Henderson over at econlog, but first some context.
Mathew Yglesias writes:
To which a commenter replies:
I won't reproduce the whole thing, click through to the comment to see a decent summary of the Lucas Critique if you aren't aware of it already.
Henderson, over at econlog, replies:
And without further ado, here's my response:
ack... I should edit my comments better before posting them (notice the use of square brackets).
edit: some minor formatting