This recent blog post strikes me as an interesting instance of a common phenomenon. The phenomenon looks like this: an intellectual, working within the assumption that the world is not mad (an assumption not generally found outside the Anglo-American Enlightenment intellectual tradition), notices that some feature of the world would only make sense if the world were mad. This intellectual responds by denouncing as silly one of the few features of this vale of tears that is, while not intelligently designed, at least structured by generalized evolution rather than by entropy. The key line in the post is:
"Conversely in all those disciplines where we have reliable quantitative measurements of progress (with the obvious exception of history) returning to the original works of past great thinkers is decidedly unhelpful."
I agree with the above statement, and find that the post makes a compelling argument for it. My only caveat is that we essentially never have quantitative measures of progress. Even in physics, when one regards not the theory but the technique of actually doing physics, tools and modes of thought rise and fall for reasons of fashion, and once-widespread techniques that remain useful fall into disuse.
Other important techniques, like the ones used to invent calculus in the first place, are never adequately articulated by those who use them and thus never come into general use. One might argue that Newton didn't use any technique to invent calculus, just a very high IQ or some other unusual set of biological traits. This, however, doesn't explain why a couple of people invented calculus at about the same time and place, especially given the low population of that time and place compared to the population of China over the many centuries when China was much more civilized than Europe.
It seems likely to me that in cases like the invention of calculus, looking at the use of such techniques can contribute to their development in at least crude form. By analogy, even the best descriptions of how to do martial arts are inadequate to provide expertise without practice, but experience watching experts fight is a valuable complement to training for the relatively inept. If one wants to know the Standard Model, sure, study it directly, but if one wants to actually understand how to do the sorts of things that Newton did, one would be advised to read him, Feynman, and yes, Plato too, as Plato also did things which contributed greatly to the development of thought.
Anyone who has ever had a serious intellectual following is worth some attention. Repeating errors is the default, so it's valuable to look at ideas that were once taken seriously but are now recognized as errors. This is basically the converse of studying past thinkers to understand their techniques.
Outside of physics, the evidence for progress is far weaker. Many current economists think that today we need to turn back to Keynes to find the tools that he developed but which were later abandoned or simply never caught on. A careful reading of Adam Smith and of Ben Franklin reveals them using tools which only caught on centuries after they published, such as economic models of population growth that would have predicted the "demographic transition" which surprised almost all demographers just recently. Likewise, much in Darwin is part of contemporary evolutionary theory but was virtually unknown to evolutionary biologists half a century ago.
As a practical matter, a psychologist who knows the work of William James as well as that of B.F. Skinner, or an economist who knows Hayek and Smith as well as Samuelson or Keynes, is always more impressive than one who knows only the 'modern' field as 'modern' was understood by the previous generation. Naive induction strongly suggests that, like all previous generations of social scientists, today's social scientists who specialize in contemporary theories will be judged by the next generation, who will have an even more modern theory, to be inferior to their more eclectic peers. Ultimately one has to look at the empirical question of the relative per-capita intellectual impressiveness of people who study only condensations and people who study original works. To me, the latter looks much, much greater in most fields; OK, in every field that I can quickly think of except for astronomy.
To the eclectic scholar of scholarly madness, progress is real. This decade's sludge contains a few gems that weren't present in the sludge of any previous decade. To the person who assumes that fields like economics or psychology effectively condense the findings of previous generations as background assumptions to today's work, however, progress means replacing one pile of sludge with another fashionable sludge-pile of similar quality. And to those few whom the stars bless with the coworkers of those who study stars? Well, I have only looked at astronomy as through a telescope. I haven't seen the details on the ground. That said, for them maybe, just maybe, I can endorse the initial link. But then again, who reads old books of astronomy?
No, I'm not suggesting that. That may be what Okeymaker is suggesting; I'm not quite clear on his/her distinction either. What I was originally addressing, however, was komponisto's assertion that "high IQ" is merely "high processing speed and copious amounts of RAM", which I denied, pointing out that "high processing speed and copious amounts of RAM" alone would surely not have been enough to invent calculus, and that "creative thinking" (whatever that means) is required as well. In essence, I was arguing that "high IQ" should be defined as more than simply "high processing speed and copious amounts of RAM", but should include some tertiary or possibly even quaternary component to account for thinking of the sort Newton must have performed to invent calculus. This suggested definition of IQ seems more reasonable to me; after all, if IQ were simply defined as "high processing speed and copious amounts of RAM", I doubt researchers would have had so much trouble testing for it. Furthermore, it's difficult to imagine tests like Raven's Progressive Matrices (which are often used in IQ testing) being completed by dint of sheer processing speed and RAM.
Note that the above paragraph contains no mention of the words "natural", "innate", or any synonyms. The distinction between "natural" thinking and "synthetic" thinking (I guess that would be the word? I was trying to find a good antonym for "natural") was not what I was trying to get at with my original comment; indeed, I suspect that the concept of such a distinction may not even be coherent. Furthermore, conditional on such a distinction existing, I would not sort "creative thinking" into the "synthetic" category of thinking; as I noted in my original comment, no one taught Newton the algorithm he used to invent calculus. It was probably opaque even to his own conscious introspection, probably taking the form of a brilliant flash of insight or something like that, after which he just "knew" the answer, without knowing how he "knew". This sort of thinking, I would say, is so obviously spontaneous and untaught that I would not hesitate to classify it as "natural", if, that is, the concept is indeed coherent.
It sounds as though you may be confused because you have been considering Okeymaker's and my positions to be one and the same. In light of this, I think I should clarify that I simply offered my comment as a potential explanation of what Okeymaker meant by "creative thinking"; no insight was meant to be offered on his/her distinction between "natural" thinking and "synthetic" thinking.
This shows that you didn't understand what I was arguing, because you are in fact agreeing with me.
The structure of my argument was:
(1) People say that high IQ is the reason Newton invented calculus.
(2) However, high IQ is just high pro...