Gwern suggested that, if it were possible for civilization to have developed when our species had a lower IQ, then we'd still be dealing with the same problems, but we'd have a lower IQ with which to tackle them. Or, to put it another way, it is unsurprising that living in a civilization has posed problems that our species finds difficult to tackle, because if we were capable of solving such problems easily, we'd probably also have been capable of developing civilization earlier than we did.
How true is that?
In this post I plan to look in detail at the origins of civilization, with an eye to how much its timing depended directly upon the IQ of our species, rather than upon other factors.
Although we don't have precise IQ test numbers for our immediate ancestral species, the fossil record is good enough to give us a clear idea of how brain size has changed over time, and we do have archaeological evidence of approximately when various technologies (such as pictograms, or using fire to cook meat) became common.
The First City
About 6,000 years ago (4000 BCE), Ur was a thriving trading village on the flood plain near the mouth of the river Euphrates in what is now called southern Iraq and what historians call Sumeria.
By 3000 BCE it was the heart of a city-state with a built-up core covering 37 acres, and it would go on over the following thousand years to lead the Sumerian empire, raise a great brick ziggurat to its patron moon god, and become the largest city in the world (65,000 people concentrated in 54 acres).
It was eventually doomed by desertification and soil salination caused by its own success (over-grazing and land clearing), but by then cities had spread throughout the fertile crescent of rivers at the intersection of the European, African and Asian land masses.
Ur may not have been the first city, but it was the first one we know of that wasn't part of a false dawn - one whose culture and technologies did demonstrably spread to other areas. It was the flashpoint.
We don't know for certain what it was about the culture surrounding the dawn of cities that made that particular combination of trade, writing, specialisation, hierarchy and religion communicable, when similar cultures from previous false dawns failed to spread. We can trace each of those elements to earlier sources; none of them were original to Ur. So perhaps it was a case of a critical mass achieving a self-sustaining reaction.
What we can look at is why the conditions that allowed a village to become a city large enough for such a critical mass of developments to accumulate occurred at that time and place.
From Village to City
Motivation aside, the chief problem with sustaining large numbers of people together in a small area over several generations, while keeping them healthy enough for the population to grow without continual immigration, is ensuring access to a scalable, renewable, predictable source of calories.
To be predictable means surviving famine years, which requires crops that can be stored for several years, such as large-seeded grasses (wheat, barley and millet), and good facilities to store them in. It also means surviving pestilence, which requires having a variety of such crops. To be scalable and renewable means supplying water and nutrients to those crops on an ongoing basis, which requires irrigation and fertiliser from domesticated animals (if you don't have handy regular floods).
Having large mammals available to domesticate, which can provide fertiliser and traction (pulling ploughs and harrows), certainly makes things easier, but doesn't seem to have been a large factor in the timing of the rise of civilisation, or particularly dependent upon the IQ of the human species. Research suggests that domestication may have been driven as much by the animals' own behaviour as by human intention, with those animals daring to approach humans more closely getting first choice of discarded food.
Re-planting seeds to ensure plants to gather in following years, which led to low-nutrition grasses adapting into grains with high protein concentrations in the seeds, does seem to be a mainly intentional human activity: we can trace most of the gain in seed size of such plant species to locations where humans had transitioned from the palaeolithic hunter-gatherer culture (from about 2.5 million years ago to about 10,000 years ago) to the neolithic agricultural culture (from about 10,000 years ago onwards).
Good grain storage seems to have developed incrementally, starting with crude stone silo pit designs around 9500 BCE and progressing by 6000 BCE to customised buildings with raised floors and sealed ceramic containers which could store 80 tons of wheat in good condition for 4 years or more. (Earthenware ceramics date to 25,000 BCE and earlier, though the potter's wheel, useful for mass production of regular storage vessels, dates to the Ubaid period.)
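To get a rough sense of what 80 tons of stored wheat means, here is a back-of-envelope sketch. The calorie density and daily ration below are assumed round figures, not numbers from the archaeological record:

```python
# Rough scale of an 80-ton granary, assuming wheat at ~3,400 kcal/kg
# (an approximate figure) and ~2,000 kcal per person per day.

wheat_kg = 80_000                 # 80 tons of stored wheat
kcal_per_kg = 3400                # assumption: typical for wheat grain
kcal_per_person_day = 2000        # assumption: round daily ration

person_days = wheat_kg * kcal_per_kg / kcal_per_person_day
print(f"~{person_days / 365:,.0f} person-years of calories")
```

On those assumptions, a single such store represents a few hundred person-years of food, which is what makes surviving a failed harvest plausible for a village-sized population.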
The main key to the timing of the transition from village to city seems to have been not human technology but the confluence of climate and biology. Jared Diamond points the finger at the geography of the region: the fertile crescent farmers had access to a wider variety of grains than anywhere else in the world, because the area links three major land masses and has access to the species of all three. The Mediterranean climate has a long dry season with a short period of rain, which made it ideal for growing grains (which are much easier to store for several years than, for instance, bananas). And everything kicked off when the climate stabilised after the most recent ice age ended, about 12,000 years ago.
Ice Ages
Strictly speaking, we're actually talking about the end of a "glacial period" rather than the end of an entire "ice age". The timeline goes:
200,000 years ago - 130,000 years ago : glacial period
130,000 years ago - 110,000 years ago : interglacial period
110,000 years ago - 12,000 years ago : glacial period
12,000 years ago - present : interglacial period
So the question now is: why didn't humanity spawn civilisation in the fertile crescent 130,000 years ago, during the last interglacial period? Why did it happen in this one? Did we get significantly brighter in the meantime?
It isn't, on the face of it, an implausible idea. A hundred thousand years is long enough for evolutionary change to happen, and maybe inventing pottery or becoming farmers did take more brain power than humanity had back then. Or, if not IQ, perhaps it was some other mental change, such as attention span, or the capacity to obey written laws, or to live as a specialist in a hierarchy.
But there's no evidence that this is the case, nor is there a need to hypothesise it, because there is at least one genetic change we do know about during that time period that is, by itself, sufficient to explain the lack of civilisation 130,000 years ago. And it has nothing to do with the brain.
Brains, Genes and Calories
Using the San Bushpeople as a guide to the palaeolithic diet, hunter-gatherer culture was able to support an average population density on the order of one person per square mile. Not that they ate badly, as individuals. Indeed, they seem to have done better than the early Neolithic farmers. But they had to be free to wander to follow nomadic food sources, and they were limited by access to food that the human body could use to create docosahexaenoic acid (DHA), a fatty acid required for human brain development. Originally humans got this from fish living in the lakes and rivers of central Africa. However, about 80,000 years ago, we developed a gene variant that let us synthesise the same acid from other sources, freeing humanity to migrate away from the wet areas, past the dry north of the continent, and out into the fertile crescent.
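To put that density gap in perspective, here is a minimal sketch comparing the rough figures quoted in this post; both are order-of-magnitude approximations, not measurements:

```python
# Order-of-magnitude comparison of population densities, using the
# approximate figures quoted in this post.

ACRES_PER_SQ_MILE = 640

# Hunter-gatherer: on the order of one person per square mile.
hunter_gatherer_density = 1 / ACRES_PER_SQ_MILE  # people per acre

# Ur at its peak: ~65,000 people concentrated in ~54 acres.
ur_density = 65_000 / 54  # people per acre

print(f"Hunter-gatherer: {hunter_gatherer_density:.4f} people/acre")
print(f"Ur at its peak:  {ur_density:.0f} people/acre")
print(f"Ratio: roughly {ur_density / hunter_gatherer_density:,.0f}:1")
```

Even allowing for very loose inputs, the gap between the two ways of life is five to six orders of magnitude, which is why the calorie supply problem dominates everything else.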
But there is a link between diet and brain. Although the human brain represents only 2% of body weight, it receives 15% of cardiac output, accounts for 20% of total body oxygen consumption, and uses 25% of total body glucose. Brains are expensive, in terms of calories consumed. And although neither brain size nor glucose-consuming brain activity is linearly related to individual IQ, they are linked at the species level.
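To put a number on "expensive": a minimal sketch, assuming a typical adult basal metabolic rate of roughly 1,600 kcal/day (a round-number assumption, not a figure from the post) combined with the ~20% share quoted above:

```python
# Rough daily energy cost of the human brain, using the ~20% share of
# the resting energy budget quoted above and an assumed basal
# metabolic rate of ~1,600 kcal/day (a round-number assumption).

basal_metabolic_rate_kcal = 1600  # assumption: typical adult BMR
brain_share = 0.20                # ~20%, as quoted in the post

brain_kcal_per_day = basal_metabolic_rate_kcal * brain_share
print(f"Brain at rest: ~{brain_kcal_per_day:.0f} kcal/day "
      f"from ~2% of body mass")
```

Roughly 300 kcal/day just to idle the brain is a substantial fixed cost in an environment where every calorie has to be hunted or gathered.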
IQ is polygenic, meaning that many different genes are relevant to a person's potential maximum IQ. (Note: there are many non-genetic factors that may prevent an individual reaching that potential.) Algernon's Law suggests that genes affecting IQ which still have multiple alleles common in the human population are likely to carry a cost associated with the alleles that tend to increase IQ; otherwise those alleles would already have displaced their competitors. In the same way that an animal species which can grow a fur coat in response to cold weather is more adaptable than one whose genes strictly determine a thick fur coat at all times, whether the weather is cold or hot, the polygenic nature of human IQ gives human populations the ability to adapt on the time scale of just a few generations, increasing or decreasing the average IQ of the population as the environment changes to reduce or increase the penalties of the trade-offs carried by particular IQ-contributing alleles. In particular, if the trade-off for some of those alleles is increased energy consumption, and we look at a population of humans moving from an environment where calories are the bottleneck on how many offspring can be produced and survive to an environment where calories are more easily available, then we might expect to see something similar to the Flynn effect.
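The population-level dynamic being described can be made concrete with a toy single-locus selection model. This is purely illustrative: the fitness values below are invented assumptions, not estimates from the genetics literature.

```python
# Toy single-locus selection model: allele A raises IQ but costs extra
# calories. When calories are scarce the cost dominates; when calories
# are abundant the penalty shrinks and A spreads. All fitness values
# are invented for illustration.

def next_freq(p, fitness_A, fitness_a):
    """One generation of haploid selection: new frequency of allele A."""
    mean_fitness = p * fitness_A + (1 - p) * fitness_a
    return p * fitness_A / mean_fitness

def simulate(p0, fitness_A, fitness_a, generations):
    p = p0
    for _ in range(generations):
        p = next_freq(p, fitness_A, fitness_a)
    return p

p0 = 0.10  # starting frequency of the costly, IQ-raising allele A

# Calorie-scarce environment: the metabolic cost outweighs the benefit.
scarce = simulate(p0, fitness_A=0.98, fitness_a=1.00, generations=100)

# Calorie-rich environment: the cost is negligible and the benefit wins.
rich = simulate(p0, fitness_A=1.02, fitness_a=1.00, generations=100)

print(f"After 100 generations: calorie-scarce {scarce:.3f}, "
      f"calorie-rich {rich:.3f}")
```

With these made-up numbers, a 2% fitness edge takes the allele from 10% to roughly 45% of the population within a hundred generations, while a 2% penalty drives it down to about 1.5%: a change in the calorie environment alone is enough to move the population average within a few thousand years.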
Summary
There is no cause to suppose that, even if the human genome 100,000 years ago had contained the full set of IQ-related alleles present in our genome today, humans would have developed civilisation much sooner.
Comment Navigation Aid
link - DuncanS - animal vs human intelligence
link - DuncanS - brain size & brain efficiency
link - JaySwartz - adaptability vs intelligence
link - RichardKennaway - does more intelligence tend to bring more societal happiness?
link - mrglwrf - Ur vs Uruk
link - NancyLebovitz - does decreased variance of intelligence tend to bring more societal happiness?
link - fubarobfusco - victors writing history
link -- consequentialist treatment of library burning
link -- the average net contribution to society of people working in academia
link - John_Maxwell_IV - independent development of civilisation in the Americas
link - shminux - How much of our IQ is dependent upon docosahexaenoic acid?
link - army1987 - implications for the Great Filter
link - Vladimir_Nesov - genome vs expressed IQ
link - Vladimir_Nesov - Rhetorical nitpick
link - Vaniver - IQ & non-processor-speed components of problem solving
link - JoshuaZ - breakthroughs don't tend to require geniuses in order to be made
link - Desrtopa - cultural factors
If the gene for the synthesis of docosahexaenoic acid arose 80kya, and the current interglacial period began 12kya, that still leaves four thousand years between the end of the glacial period and the beginning of city-based civilization, which, keep in mind, is a long time.
If the civilization developments had followed within a hundred years or so of the necessary biological and environmental factors coming into place, I wouldn't be so skeptical that our intelligence already exceeded the minimum necessary to produce those developments. But we already had domesticated grazing animals thousands of years before the foundation of Ur, and grains earlier than that. Don't forget that when we're dealing with cultural rather than biological evolution, a millennium is no longer a relative eyeblink.
Humans are relatively conformist, and we often have a hard time translating abstract or revolutionary ideas into practice. It seems likely that many humans had ideas for things resembling civilization, or things that could've led to the development of a civilization, before the first actual civilization existed, in the same way that more people dream about starting businesses than actually start businesses.
Paul Graham seems to think that local culture plays a huge role in startup success. Now consider that even the cultures Paul Graham considers pretty bad are still A...