More relevant: many textbooks are straightforwardly badly written, to the point that the thirty-year-old conference papers in the citations are actually more accurate. Another factor which the classics-are-screened-off-by-moderns argument may miss is the degree to which poor work reduces the value of a reference.
Would this be a fair summary?
Old books can be useful, but for the old books in a field to be essential reading today, something must have gone badly wrong with the field. Some fields have indeed gone badly wrong.
Yep, except that I'm saying that virtually all fields have gone badly enough wrong for old books to be useful.
That doesn't seem so for mathematics, physics, chemistry... the hard sciences in general. It may be an ornament to one's education to read Euclid, Newton, and Einstein, but it is not necessary. The books that endure in these fields are the exceptionally good textbooks rather than the original works.
In order to recognize systemic errors of your own era, it is useful to return to a time before the current dominant paradigm was in effect. Even better if you can find first-hand accounts of when the current paradigm started becoming fashionable and was regarded as strange and alien.
I disagree with a few points
1) Most people do not have an enormous amount of time to read, so the question is always whether one should NOT read something current and read a classic instead.
2) People who do have lots of time end up reading both current and classic material, which is probably why you find those who read the classics superior; it's just that they are more into it.
3) Academics advise reading the classics among other reasons because they were advised the same way, and chose the same way, so choice-supportive bias plays a role there.
4) In addition, they prefer that their students read something they are already familiar with rather than something they themselves would have to become acquainted with in order to judge. It's easier to judge Hegel than Bostrom.
5) Very motivated people tend to lose motivation when not allowed to have their own ideas, and with time they become meme-copies of the classic authors. In part this happens because they are obligated to read Plato, Aristotle, etc., and end up losing faith in the intellectual world. High young achievers such as Feynman, Eliezer, Russell, Kripke, Wittgenstein and others take deep pride in having been outsiders in their studying meth...
Typo - you want "vale of tears", not "veil".
(I'm now on record with several comments like this one. Please let me know if they annoy. It's a quirk of mine that egregious misspellings bias me toward thinking less of the writer and the writing, but it seems to be a widely shared one.)
I think this post overstates the case a bit. My general impression is that the scientific method "wins" even in economics and that later works are better than earlier works.
Now it might be true that the average macroeconomist of today understands less than Keynes did, but I'd be hard pressed to say that the best don't understand more. Moreover, there are really great distillers. In macro, for example, Hicks distilled Keynes into something that I would consider more useful than the original.
Nonetheless, I think it is correct that someone should be ...
An excerpt from the Amazon description of Plausible Reasoning: "This work might have been called 'How to Become a Good Guesser'."
Polya's How to Solve It is a great little text he wrote for teachers and students of mathematics. Polya's Mathematics and Plausible Reasoning is even better. There are lots of great problem-solving techniques for non-mathematicians in there too. I recommend it to everyone; it's the best example I've ever seen of someone writing down their techniques.
Edit: cleared up the reference of the quote, it originally look...
One might argue that Newton didn't use any technique to invent calculus, just a very high IQ or some other unusual set of biological traits.
That would be a non-explanation in any case. However high Newton's IQ may have been, his brain was still operating by lawful processes within the physical universe. By the sheer improbability of inventing calculus by chance, there is bound to exist some general technique used by Newton for doing things like inventing calculus, for all that that technique may have been opaque to Newton's own conscious introspection. ...
I dispute that this is a non-explanation. Besides referring to concepts whose existence has already been confirmed by other means, it makes a testable prediction about the degree to which abilities should run in genetic families as opposed to student lineages.
It's a question of which data you're interested in explaining. I'm more interested in understanding the mechanism of how Newton invented calculus than in explaining the (comparatively uninteresting) fact that most other people didn't. (If you want to program an AI to invent calculus, crying "IQ!" isn't going to help.)
[ETA: To be more explicit: the vague hypothesis that "Newton had a high IQ" adequately explains why, given that calculus was invented, Newton was one of the two people to have invented it. But it does a much less effective job of explaining why it was invented in the first place, by anybody.]
(As it happens, most of the world's intellectual power has in fact been spread via students rather than children.)
If academic lineages are due to an ability that teachers have to identify talent, this ability is extremely common and predicts achievement FAR better than IQ tests can. I am struck by the degree to which the financial world fails to identify talent with anything like similar reliability.
Also, the above theory is inconsistent with the extreme intellectual accomplishments of East Asians, and previously Jews, within European culture and failure of those same groups to produce similar intellectual accomplishments prior to such cultural admixture.
I read old books of astronomy and I found it very helpful for understanding new books of astronomy.
Outside of physics, the evidence for progress is far weaker.
The economic growth of the last few decades suggests that some people, somewhere, are gradually getting more things right more often. Those genomes aren't sequencing themselves. Or have I misunderstood you?
progress means replacing one pile of sludge with another fashionable sludge-pile of similar quality.
The methods available to test these various hypotheses seem to have more of an impact on their prominence than any objective measure of truth. Classical mechanics conformed to observations and could be confirmed by various tests. This led to widespread adoption until observations were made that did not fit the theories. Often the theories are available and cover various possible outcomes, all justified by the intuition offered by the current, yet u...
This somewhat echoes The Value of Nature and Old Books. Sometimes older books can be quite effective at explaining things that do not depend on the latest research -- the books by e.g. Knuth, Feynman, and Abelson/Sussman are good examples, and I would heartily recommend those, even if there are newer works on similar subjects.
I'd like to quote this argument from here:
Distillation works best in very exact sciences, such as physics and mathematics. If you rely on distillation for an inexact science, you will do best at capturing its exact parts. You will be left with a systematic bias, and knowledge gap, regarding its inexact parts.
Likewise, much in Darwin is part of contemporary evolutionary theory but was virtually unknown by evolutionary biologists half a century ago.
I disagree with the statement that evolutionary biology isn't making clear progress. I'm guessing you're talking about punctuated equilibrium, which was part of Darwin's On the Origin of Species (albeit not by that name), deemphasized by later evolutionary biologists, and later assertively brought back by Gould et al. However, this hypothesis is only vacillating in and out of 'style' because it 1) has scientifi...
I have to admit that personally I don't see a golden thread in the post. What was the core argument? As far as I understood it, the post reasons about the "relative per-capita intellectual impressiveness of people who study only condensations and people who study original works".
Which is, to be honest, just a mockup. Who cares about "impressiveness" while studying? Why should one optimize for "impressiveness" in one's studies?
Personally, I think that original works carry a lot of baggage. For example, the language is older, the theories sometimes incredibly outdated...
...Books are sometimes written about "The Great Ideas Of The Past", sometimes about some great thinker of former times, and the public reads these books written by someone else, but not the works of "The Great Man or Woman" himself/herself.
There is nothing that so greatly recreates the mind as the works of the old classic writers. Directly one has been taken up, even if it is only for half-an-hour, one feels as quickly refreshed, relieved, purified, elevated, and strengthened as if one had refreshed oneself at a mountain stream.
One can ...
"A good analysis book doesn't summarize Newton; it digests his insights and presents them as part of a grander theory."
Exactly. And I want to be in charge of doing that for myself, so I suppose I'll continue to read original sources.
Finally, I just want to say: surely you don't disagree that there is something different about what happens in physics versus what happens in astrology, do you? I don't care about deep principled distinctions here; just at a purely practical level, physics (and the other sciences) let us make strictly more things now than they did 10, 50, or 100 years ago.
The notion of progress I had in mind is much much weaker than yours. I just mean that sometimes we discover shit that we find very useful (transistor technology) and that the useful consequences of sci...
Ultimately, however, the aim of my post was to establish that there isn't some kind of important knowledge best gained through the reading of original sources. The target of my argument was the frequently given argument that somehow spurning these great original works puts you at some kind of 'objective' disadvantage in terms of learning/knowledge relative to those who do. Sure these are fuzzy terms and I think most of them aren't even really meaningful but the idea the advocates of this position have in mind is that somehow reading literature classics a...
Very weakly related to the post: I surprised Eliezer Yudkowsky last October with a quote showing off Galileo's rationality.
You make decent points about the lack of evidence for 'progress' in methodology. I think it's quite possible that we don't significantly improve the process by which we go from the current best theory to its successor. Of course, to make sense of this notion you would need a more precise notion of what it means to have a better methodology for generating scientific theories. I mean, the first natural way to do this might be to somehow try and measure the percent of the physical world we can explain/predict from initial conditions (many complications with...
This recent blog post strikes me as an interesting instance of a common phenomenon. The phenomenon looks like the following: an intellectual, working within the assumption that the world is not mad (an assumption not generally found outside of the Anglo-American Enlightenment intellectual tradition), notices that some feature of the world would only make sense if the world were mad. This intellectual responds by denouncing as silly one of the few features of this vale of tears that, while not intelligently designed, is at least structured by generalized evolution rather than by entropy. The key line in the post is
I agree with the above statement, and find that the post makes a compelling argument for it. My only caveat is that we essentially never have quantitative measures of progress. Even in physics, when one regards not the theory but the technique of actually doing physics, tools and modes of thought rise and fall for reasons of fashion, and once-widespread techniques that remain useful fall into disuse.
Other important techniques, like the ones used to invent calculus in the first place, are never adequately articulated by those who use them and thus never come into general use. One might argue that Newton didn't use any technique to invent calculus, just a very high IQ or some other unusual set of biological traits. This, however, doesn't explain why a couple of people invented calculus at about the same time and place, especially given the low population of that time and place compared to the population of China over the many centuries when China was much more civilized than Europe.
It seems likely to me that in cases like the invention of calculus, looking at the use of such techniques can contribute to their development in at least crude form. By analogy, even the best descriptions of how to do martial arts are inadequate to provide expertise without practice, but experience watching experts fight is a valuable complement to training by the relatively inept. If one wants to know the Standard Model, sure, study it directly, but if you want to actually understand how to do the sorts of things that Newton did, you would be advised to read him, Feynman and yes, Plato too, as Plato also did things which contributed greatly to the development of thought.
Anyone who has ever had a serious intellectual following is worth some attention. Repeating errors is the default, so it's valuable to look at ideas that were once taken seriously but are now recognized as errors. This is basically the converse of studying past thinkers to understand their techniques.
Outside of physics, the evidence for progress is far weaker. Many current economists think that today we need to turn back to Keynes to find the tools that he developed but which were later abandoned or simply never caught on. A careful reading of Adam Smith and of Ben Franklin reveals them to use tools which did catch on centuries after they published, such as economic models of population growth which would have predicted the "demographic transition" that surprised almost all demographers just recently. Likewise, much in Darwin is part of contemporary evolutionary theory but was virtually unknown by evolutionary biologists half a century ago.
As a practical matter a psychologist who knew the work of William James as well as that of B.F. Skinner or an economist who knows Hayek and Smith as well as Samuelson or Keynes is always more impressive than one who knows only the 'modern' field as 'modern' was understood by the previous generation. Naive induction strongly suggests that like all previous generations of social scientists, today's social scientists who specialize in contemporary theories will be judged by the next generation, who will have an even more modern theory, to be inferior to their more eclectic peers. Ultimately one has to look at the empirical question of the relative per-capita intellectual impressiveness of people who study only condensations and people who study original works. To me, the latter looks much much greater in most fields, OK, in every field that I can quickly think of except for astronomy.
To the eclectic scholar of scholarly madness, progress is real. This decade's sludge contains a few gems that weren't present in the sludge of any previous decade. To the person who assumes that fields like economics or psychology effectively condense the findings of previous generations as background assumptions to today's work, however, progress means replacing one pile of sludge with another fashionable sludge-pile of similar quality. And to those few whom the stars bless with coworkers who study the stars? Well, I have only looked at astronomy as if through a telescope; I haven't seen the details on the ground. That said, for them maybe, just maybe, I can endorse the initial link. But then again, who reads old books of astronomy?