I don't think there's a single defining point of difference, but I tend to think of it as the difference between the traditional social standard of having beliefs you can defend and the stricter individual standard of trying to believe as accurately as possible.
The How to Have a Rational Discussion flowchart is a great example of the former: the question addressed there is whether you are playing by the rules of the game. If you are playing by the rules and can defend your beliefs, great, you're OK! This is how we are built to reason.
X-rationality emphasizes having accurate beliefs over having defensible beliefs. If you fail to achieve a correct answer, it is futile to protest that you acted with propriety. Instead of asking "does this evidence allow me to keep my belief or oblige me to give it up?", it asks "what is the correct level of confidence for me to have in this idea given this new evidence?"
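To make that contrast concrete, here is a minimal sketch (my own illustration, not anything from Eliezer's posts) of what answering the second question looks like as a Bayes-rule update; the function and the numbers are hypothetical, chosen only to show the shape of the calculation:

```python
# Instead of a yes/no verdict on whether a belief is still "defensible",
# Bayes' rule returns an updated level of confidence given new evidence.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """P(hypothesis | evidence) via Bayes' rule."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Hypothetical numbers: start 80% confident, then observe evidence that is
# twice as likely if the belief is false as if it is true.
print(posterior(prior=0.8, p_evidence_if_true=0.3, p_evidence_if_false=0.6))
# -> 0.666..., i.e. the belief is weakened but not simply "kept" or "given up".
```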
Eliezer uses "Traditional Rationality" to mean something like "Rationality, as practised by scientists everywhere, especially the ones who read Feynman and Popper". It refers to the rules that scientists follow.
A surely incomplete list of deficiencies:
In some ways, Eliezer is too hard on Traditional Rationalists (TRists). In the "wild and reckless youth" essay, which you cite, he focuses on how TR didn't keep him from privileging a hypothesis and wasting years of his life on it.
But TR, as represented by people like Sagan and Feynman, does enjoin you to believe things only on the basis of good evidence. Eliezer makes it sound like you can believe whatever crazy hypothesis you want, as long as it's naturalistic and in-principle-falsifiable, and as long as you don't expect others to be convinced until you deliver good evidence. But there are plenty of TRists who would say that you ought not to be convinced yourself until your evidence is strong.
However, Eliezer still makes a very good point. This injunction doesn't get you very far if you don't know the right way to evaluate evidence as "strong", or if you don't have a systematic method for synthesizing all the different pieces of evidence to arrive at your conclusion. This is where TR falls down. It gives you an injunction, but it leaves too much of the details of how to fulfill the injunction up to gut instinct. So, Eliezer will be contributing something very valuable if he can spell out those details.
I just started listening to THIS (I got through perhaps 15 minutes of it on my drive to work this morning), and EY has already mentioned a little about traditional rationality versus where he is now with respect to reading Feynman. I'm not sure if he'll talk more about this, but Luke's page does list, among the things covered:
Eliezer’s journey from ‘traditional rationality’ to ‘technical rationality’
so perhaps he'll continue in detail about this. Off hand, all I can specifically remember is that at one point he encountered some who thought that multiple routes...
One relevant attempt at a definition:
I will be using "extreme rationality" or "x-rationality" in the sense of "techniques and theories from Overcoming Bias, Less Wrong, or similar deliberate formal rationality study programs, above and beyond the standard level of rationality possessed by an intelligent science-literate person without formal rationalist training."
In one essay, Eliezer seems to be saying that Traditional Rationality was too concerned with process, whereas it should have been concerned with winning. In other passages, it seems that the missing ingredient in the traditional version was Bayesianism (a la Jaynes). Or sometimes, the missing ingredient seems to be an understanding of biases (a la Kahneman and Tversky).
All of those are problems with traditional rationality, and Eliezer has critiqued traditional rationality for all of them. Traditional rationality should have helped Eliezer more than it did.
There are a variety of issues going on here. Manfred pointed out many of them. Another issue is that you've had an influx of users, all of whom are arguing for essentially the same set of positions, not doing it very well, and with a bit of rudeness thrown in. One of the three is being particularly egregious, and I suspect that there may be some spill-over in attitude from that user's behavior towards how people are voting about you. I will note that in the threads responding to the various Popperian criticisms, various LW regulars are willing to say when another LWian has said something they think is wrong. It might help to distinguish yourselves if you were willing to point out when you think the others are wrong. For example, you haven't posted at all in this thread. Do you agree with everything he has said there? If you disagree, will you say so, or do you feel a need to stay silent to protect a fellow member of your tribal group?
For what it is worth, I'm not a Bayesian. I think that Bayesianism has deep problems, especially surrounding 1) the question of where priors come from, and 2) the difficulty of meaningfully making Bayesian estimates about abstract systems. I've voiced those concerns before here, and many of those comments have been voted up. Indeed, I recently started a subthread discussing a problem with the Solomonoff prior approach which has been voted up.
I agree with curi that the Conjunction Fallacy does not exist. But if I disagreed I would say so - Popperians don't hold back from criticizing each other. If my criticism hit its mark, then curi would change his mind; I know that because I participate in Popperian forums that curi participates in. That said, most Popperians I know think along similar lines; I see more disagreement among Bayesians about their philosophy here.
Your thread is about a technical issue, and I think Bayesians are more comfortable discussing that sort of thing.
In several places in the sequences, Eliezer writes condescendingly about "Traditional Rationality". The impression given is that Traditional Rationality was OK in its day, but that today we have better varieties of rationality available.
That is fine, except that it is unclear to me just what the traditional kind of rationality included, and it is also unclear just what it failed to include. In one essay, Eliezer seems to be saying that Traditional Rationality was too concerned with process, whereas it should have been concerned with winning. In other passages, it seems that the missing ingredient in the traditional version was Bayesianism (a la Jaynes). Or sometimes, the missing ingredient seems to be an understanding of biases (a la Kahneman and Tversky).
In this essay, Eliezer laments that being a traditional rationalist was not enough to keep him from devising a Mysterious Answer to a mysterious question. That puzzles me because I would have thought that traditional ideas from Peirce, Popper, and Korzybski would have been sufficient to avoid that error. So apparently I fail to understand either what a Mysterious Answer is or just how weak the traditional form of rationality actually is.
Can anyone help to clarify this? By "Traditional Rationality", does Eliezer mean to designate a particular collection of ideas, or does he use it more loosely to indicate any thinking that is not quite up to his level?