Nornagest comments on How minimal is our intelligence? - Less Wrong
I don't think 'epigenetic' means what you think it means. But anyway: yes, there is anthropological evidence of that sort (covered in Pinker's Better Angels and in something of Diamond's, IIRC), and height and mortality are generally believed to correlate with health, and presumably therefore with IQ.
The problem is that this is a problem for every theory of civilization formation: if early farming was so much worse than hunter-gathering that we can tell just from the fossils, why did civilization ever get started? There must have been something compelling or self-sustaining about it (network effects, perhaps).
So, suppose it takes less IQ to maintain a basic civilization than to start one from scratch (as I already suggested in my Africa example), and suppose civilization has some self-reinforcing property that keeps it in existence even when superior alternatives exist (as it seems it must, given the poorer health of early farmers compared to hunter-gatherers sans civilization).
Then what happened was: over a very long period hunter-gatherers slowly accumulated knowledge and tools, and IQs rose from better food or perhaps sexual selection or whatever, until finally, more or less simultaneously, multiple civilizations arose in multiple regions, whereupon the farmer effect reduced their IQ, but not by enough to overcome the self-sustaining-civilization effect. And then history began.
I tend to think of this by analogy with gene-centered evolution. Just as natural selection selects for genes which are particularly good at reproducing themselves without any special regard for the well-being of their carriers, cultural evolution selects for similarly potent memetic systems without any particular regard for the well-being of the people propagating them.
From skeletal evidence, forager lifestyles seem on average a lot healthier, but they also require much lower population densities. You can fit a lot more people per unit area with an agriculturalist lifestyle: if skeletal proxies are to be believed, they'll individually be weaker, sicker, and shorter-lived, but they'll be populous enough that the much rarer foragers will have trouble displacing them. Cycle that over a few thousand years and eventually civilization ends up ruling the world, with the few remaining foragers pushed into little enclaves where agriculture is unsustainable for one reason or another. Defections from one lifestyle to the other did occasionally happen, but historically they don't seem to have been common.
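To make the density arithmetic concrete, here is a toy logistic-growth sketch in Python. Every number in it (growth rate, carrying capacities, starting populations) is invented purely for illustration rather than drawn from any anthropological data; the point is only that a small farming population with a much higher carrying capacity ends up vastly outnumbering a healthier forager population that is already at its ceiling.

```python
# Toy model: one region, two lifestyles. Foragers start at their (low)
# carrying capacity; farmers start as a tiny minority but can pack far
# more people per unit area. All parameters are made up for illustration.

def simulate(generations=200):
    foragers, farmers = 1000.0, 10.0
    k_foragers, k_farmers = 1000.0, 20000.0  # carrying capacities per region
    r = 0.05                                 # per-generation growth rate
    for _ in range(generations):
        foragers += r * foragers * (1 - foragers / k_foragers)
        farmers += r * farmers * (1 - farmers / k_farmers)
    return foragers, farmers

foragers, farmers = simulate()
print(f"After 200 generations: ~{foragers:.0f} foragers, ~{farmers:.0f} farmers")
```

Even with identical growth rates and no fighting at all, the farmers end up outnumbering the foragers nearly twenty to one in this sketch; add any tendency of denser populations to absorb or push out sparser ones, and displacement follows.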
The tricky part of this model seems to be figuring out how forager populations self-limit without lowering quality of life to agriculturalist levels. I'm not anthropologist enough to have a definitive answer to this, but I'd speculate that forager resource acquisition isn't as linearly dependent on population as agriculture is: put too many people in a given area and you end up scaring off game, overconsuming food plants, et cetera. Over time I'd expect this to inform territorial behavior and intuitions about optimal group size. Violence is probably also part of the answer.
Or, at least, they end up becoming irrelevant for the same reasons the agriculturalists won in the first place. If Roanoke disappeared because all of the settlers decided to ditch the farm and live as Indians, there were still far more Europeans coming than the few who defected, and the new colonists could support a much higher population density than the ones who went native.