This recent blog post strikes me as an interesting instance of a common phenomenon. The phenomenon looks like the following: an intellectual, working within the assumption that the world is not mad (an assumption not generally found outside the Anglo-American Enlightenment intellectual tradition), notices that some feature of the world would only make sense if the world were mad. This intellectual responds by denouncing as silly one of the few features of this vale of tears that, while not intelligently designed, is at least structured by generalized evolution rather than by entropy. The key line in the post is
"Conversely in all those disciplines where we have reliable quantatative measurements of progress (with the obvious exception of history) returning to the original works of past great thinkers is decidedly unhelpful."
I agree with the above statement and find that the post makes a compelling argument for it. My only caveat is that we essentially never have quantitative measures of progress. Even in physics, when one regards not the theory but the technique of actually doing physics, tools and modes of thought rise and fall for reasons of fashion, and once-widespread techniques that remain useful fall into disuse.
Other important techniques, like the ones used to invent calculus in the first place, are never adequately articulated by those who use them and thus never come into general use. One might argue that Newton didn't use any technique to invent calculus, just a very high IQ or some other unusual set of biological traits. This, however, doesn't explain why a couple of people, Newton and Leibniz, invented calculus at about the same time and place, especially given the low population of that time and place compared to the population of China over the many centuries when China was much more civilized than Europe.
It seems likely to me that in cases like the invention of calculus, watching such techniques in use can contribute to one's developing them, at least in crude form. By analogy, even the best descriptions of how to do martial arts are inadequate to provide expertise without practice, but experience watching experts fight is a valuable complement to practice for the relatively inept. If you want to know the Standard Model, sure, study it directly, but if you want to actually understand how to do the sorts of things that Newton did, you would be well advised to read him, Feynman, and yes, Plato too, since Plato also did things which contributed greatly to the development of thought.
Anyone who has ever had a serious intellectual following is worth some attention. Repeating errors is the default, so it's valuable to look at ideas that were once taken seriously but are now recognized as errors. This is basically the converse of studying past thinkers to understand their techniques.
Outside of physics, the evidence for progress is far weaker. Many current economists think that today we need to turn back to Keynes to find tools that he developed but which were later abandoned or simply never caught on. A careful reading of Adam Smith and of Ben Franklin reveals them to have used tools which caught on only centuries after they published, such as economic models of population growth which would have predicted the "demographic transition" that surprised almost all demographers just recently. Likewise, much in Darwin is part of contemporary evolutionary theory but was virtually unknown to evolutionary biologists half a century ago.
As a practical matter, a psychologist who knows the work of William James as well as that of B.F. Skinner, or an economist who knows Hayek and Smith as well as Samuelson or Keynes, is always more impressive than one who knows only the 'modern' field as 'modern' was understood by the previous generation. Naive induction strongly suggests that, like all previous generations of social scientists, today's social scientists who specialize in contemporary theories will be judged by the next generation, armed with an even more modern theory, to be inferior to their more eclectic peers. Ultimately one has to look at the empirical question of the relative per-capita intellectual impressiveness of people who study only condensations and of people who study original works. To me, the latter looks much, much greater in most fields; OK, in every field that I can quickly think of except astronomy.
To the eclectic scholar of scholarly madness, progress is real. This decade's sludge contains a few gems that weren't present in the sludge of any previous decade. To the person who assumes that fields like economics or psychology effectively condense the findings of previous generations into the background assumptions of today's work, however, progress means replacing one pile of sludge with another fashionable sludge-pile of similar quality. And to those few whom the stars bless with work among those who study the stars? Well, I have only looked at astronomy as if through a telescope; I haven't seen the details on the ground. That said, for them maybe, just maybe, I can endorse the initial link. But then again, who reads old books of astronomy?
This seems (to me) pretty unlikely to be the case. "High processing speed and copious amounts of RAM" would allow more efficient execution of a particular algorithm... but where does that algorithm come from in the first place? One notes that no one taught Newton the "algorithm for inventing calculus". The true algorithm he used, as you pointed out, is likely to have been implemented at a lower level of thought than that of conscious deliberation; if he were still alive today and you asked him how he did it, he might shrug and answer, "I don't know", "It just seemed obvious", or something along those lines. So where did the algorithm come from? I very much doubt that processing speed and RAM alone are enough to come up with a working algorithm good enough to invent calculus from scratch within a single human lifespan, no matter what substrate said algorithm is run on. (If they were, so-called "AI-complete" problems such as natural language processing would plausibly be much easier to solve.) There is likely some additional aspect of intelligence (pattern-recognition, possibly?) that makes it possible for humans to engage in creative thinking of the sort Newton must have employed to invent calculus; to use Douglas Hofstadter's terminology, "I-mode", not "M-mode". "High IQ", then, would refer not only to increased processing speed and working memory, but also to increased pattern-recognition skills. (Raven's Progressive Matrices, anyone?)