I don't think there's a single defining point of difference, but I tend to think of it as the difference between the traditional social standard of having beliefs you can defend and the stricter individual standard of trying to believe as accurately as possible.
The How to Have a Rational Discussion flowchart is a great example of the former: the question addressed there is whether you are playing by the rules of the game. If you are playing by the rules and can defend your beliefs, great, you're OK! This is how we are built to reason.
X-rationality emphasizes having accurate beliefs over having defensible beliefs. If you fail to achieve a correct answer, it is futile to protest that you acted with propriety. Instead of asking "does this evidence allow me to keep my belief or oblige me to give it up?", it asks "what is the correct level of confidence for me to have in this idea given this new evidence?"
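That "correct level of confidence" question is just Bayes' theorem applied to belief updating. As a minimal sketch (the numbers are purely illustrative, not from anything Eliezer wrote):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior probability of hypothesis H after observing
    evidence E, via Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    # Total probability of seeing the evidence at all:
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Example: a hypothesis held at 30% credence; the new evidence is three
# times as likely if the hypothesis is true as if it is false.
posterior = bayes_update(prior=0.3, p_e_given_h=0.6, p_e_given_not_h=0.2)
print(posterior)  # 0.5625
```

The point of the contrast: the traditional question ("keep or give up the belief?") is binary, while the Bayesian question returns a graded answer — here the evidence moves you from 30% to about 56%, neither vindicated nor refuted.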
Eliezer uses "Traditional Rationality" to mean something like "Rationality, as practised by scientists everywhere, especially the ones who read Feynman and Popper". It refers to the rules that scientists follow.
A surely incomplete list of deficiencies:
In some ways, Eliezer is too hard on Traditional Rationalists (TRists). In the "wild and reckless youth" essay, which you cite, he focuses on how TR didn't keep him from privileging a hypothesis and wasting years of his life on it.
But TR, as represented by people like Sagan and Feynman, does enjoin you to believe things only on the basis of good evidence. Eliezer makes it sound like you can believe whatever crazy hypothesis you want, as long as it's naturalistic and in-principle-falsifiable, and as long as you don't expect others to be convinced until you deliver good evidence. But there are plenty of TRists who would say that you ought not to be convinced yourself until your evidence is strong.
However, Eliezer still makes a very good point. This injunction doesn't get you very far if you don't know the right way to evaluate evidence as "strong", or if you don't have a systematic method for synthesizing all the different pieces of evidence to arrive at your conclusion. This is where TR falls down. It gives you an injunction, but it leaves too many of the details of how to fulfill the injunction up to gut instinct. So, Eliezer will be contributing something very va...
I just started listening to THIS (perhaps 15min of it on my drive to work this morning), and EY has already mentioned a little about traditional rationality vs. where he is now with respect to reading Feynman. I'm not sure if he'll talk more about this, but Luke's page does have as a bullet point of the things covered:
Eliezer’s journey from ‘traditional rationality’ to ‘technical rationality’
so perhaps he'll continue in detail about this. Offhand, all I can specifically remember is that at one point he encountered someone who thought that multiple routes...
One relevant attempt at a definition:
I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training."
In one essay, Eliezer seems to be saying that Traditional Rationality was too concerned with process, whereas it should have been concerned with winning. In other passages, it seems that the missing ingredient in the traditional version was Bayesianism (a la Jaynes). Or sometimes, the missing ingredient seems to be an understanding of biases (a la Kahneman and Tversky).
All of those are problems with traditional rationality, and Eliezer has critiqued traditional rationality for all of them. Traditional rationality should have helped Eliezer more than i...
It's not an agenda in the sense of a political agenda (though it does have some connections to political ideas), nor a conspiracy, nor a consciously intended and promoted agenda.
But, they have a bunch of unconscious ideas -- a particular worldview -- which informs how they approach their research and, because they do not use the rigor of science which prevents such things, their worldview/agenda biases all their results.
The proper rigor of science includes things like describing the experimental procedure in your paper so mistakes can be criticized and it can be repeated without introducing unintended changes, and having a "sources of error" section where you discuss all the ways your research might be wrong. When you leave out standard parts of science like those, and other more subtle ones, you get unscientific results. The scientific method, as Feynman explained, is our knowledge about how not to fool ourselves (i.e. it prevents our conclusions from being based on our biases). When you don't use it, you get wrong, useless and biased results by default.
One of the ways these papers go wrong is that they don't pay enough attention to the correct interpretation of the data. Even if the data were not itself biased -- which they openly admit it is -- their interpretation would be A) problematic and B) not argued for by the data itself (interpretations of data are never argued for by the data itself, but must be considered as a separate and philosophical issue!)
If you try hard enough, you can get people to make mistakes. I agree with that much. But what mistake are the people making? That's not obvious, but the authors don't seriously discuss the matter. For example, how much of the mistake people make is due to miscommunication -- that they read the question they are asked as having a meaning a bit different from the literal meaning the researchers consider the one true meaning? The possibility that the entire phenomenon they were observing, or part of it, is an artifact of communication rather than of biases about probability is simply not addressed. Many other issues of interpretation of the results aren't addressed either.
They simply interpret the experimental data in a way in line with their biases and unconscious agendas, and then claim that empirical science has supported their conclusions.
It's not an agenda in the sense of a political agenda (though it does have some connections to political ideas), nor a conspiracy, nor a consciously intended and promoted agenda.
But, they have a bunch of unconscious ideas -- a particular worldview -- which informs how they approach their research
Yes, I agree, and the ideas are not all unconscious either. What do you think the worldview is? I'm guessing the worldview has ideas in it like animals create knowledge, but not so much as people, and that nature (genes) influence human thought leading to biase...
In several places in the sequences, Eliezer writes condescendingly about "Traditional Rationality". The impression given is that Traditional Rationality was OK in its day, but that today we have better varieties of rationality available.
That is fine, except that it is unclear to me just what the traditional kind of rationality included, and it is also unclear just what it failed to include. In one essay, Eliezer seems to be saying that Traditional Rationality was too concerned with process, whereas it should have been concerned with winning. In other passages, it seems that the missing ingredient in the traditional version was Bayesianism (a la Jaynes). Or sometimes, the missing ingredient seems to be an understanding of biases (a la Kahneman and Tversky).
In this essay, Eliezer laments that being a traditional rationalist was not enough to keep him from devising a Mysterious Answer to a mysterious question. That puzzles me because I would have thought that traditional ideas from Peirce, Popper, and Korzybski would have been sufficient to avoid that error. So apparently I fail to understand either what a Mysterious Answer is or just how weak the traditional form of rationality actually is.
Can anyone help to clarify this? By "Traditional Rationality", does Eliezer mean to designate a particular collection of ideas, or does he use it more loosely to indicate any thinking that is not quite up to his level?