I've spent so much time in the cogsci literature that I know the LW approach to rationality is basically the mainstream cogsci approach to rationality (plus some extra stuff about, e.g., language), but... do other people not know this? Do people one step removed from LessWrong — say, in the 'atheist' and 'skeptic' communities — not know this? If this is causing credibility problems in our broader community, it'd be relatively easy to show people that Less Wrong is not, in fact, a "fringe" approach to rationality.
For example, here's Oaksford & Chater in the second chapter of the (excellent) new Oxford Handbook of Thinking and Reasoning, the one on normative systems of rationality:
Is it meaningful to attempt to develop a general theory of rationality at all? We might tentatively suggest that it is a prima facie sign of irrationality to believe in alien abduction, or to will a sports team to win in order to increase their chance of victory. But these views or actions might be entirely rational, given suitably nonstandard background beliefs about other alien activity and the general efficacy of psychic powers. Irrationality may, though, be ascribed if there is a clash between a particular belief or behavior and such background assumptions. Thus, a thorough-going physicalist may, perhaps, be accused of irrationality if she simultaneously believes in psychic powers. A theory of rationality cannot, therefore, be viewed as clarifying either what people should believe or how people should act—but it can determine whether beliefs and behaviors are compatible. Similarly, a theory of rational choice cannot determine whether it is rational to smoke or to exercise daily; but it might clarify whether a particular choice is compatible with other beliefs and choices.
From this viewpoint, normative theories can be viewed as clarifying conditions of consistency… Logic can be viewed as studying the notion of consistency over beliefs. Probability… studies consistency over degrees of belief. Rational choice theory studies the consistency of beliefs and values with choices.
They go on to clarify that by probability they mean Bayesian probability theory, and by rational choice theory they mean Bayesian decision theory. You'll get the same account in the textbooks on the cogsci of rationality, e.g. Thinking and Deciding or Rational Choice in an Uncertain World.
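To make that "consistency" framing concrete, here is a toy sketch (my own illustration, not anything from the handbook): coherence over degrees of belief is just the probability axioms applied to an exhaustive set of mutually exclusive outcomes, and a choice counts as consistent with your beliefs and values when no alternative action has higher expected utility. The function names and the umbrella example below are made up for illustration.

```python
# A minimal sketch of the "consistency" view of rationality:
# Bayesian probability as coherence over degrees of belief, and Bayesian
# decision theory as consistency of beliefs and values with choices.

def is_coherent(credences, tol=1e-9):
    """Simple coherence check: degrees of belief over mutually exclusive,
    exhaustive outcomes must lie in [0, 1] and sum to 1."""
    return (all(0.0 <= p <= 1.0 for p in credences.values())
            and abs(sum(credences.values()) - 1.0) < tol)

def expected_utility(action, credences, utility):
    """Expected utility of an action, given degrees of belief over outcomes."""
    return sum(p * utility[(action, outcome)] for outcome, p in credences.items())

def consistent_choice(choice, actions, credences, utility):
    """A choice is consistent with beliefs and values (in the Bayesian sense)
    if no alternative action has strictly higher expected utility."""
    eu = {a: expected_utility(a, credences, utility) for a in actions}
    return eu[choice] >= max(eu.values())

# Toy example: should I carry an umbrella, given a 0.3 credence in rain?
credences = {"rain": 0.3, "no_rain": 0.7}
utility = {
    ("umbrella", "rain"): 0, ("umbrella", "no_rain"): -1,     # mild hassle
    ("no_umbrella", "rain"): -10, ("no_umbrella", "no_rain"): 0,
}

print(is_coherent(credences))                                  # True
print(consistent_choice("umbrella", ["umbrella", "no_umbrella"],
                        credences, utility))                   # True: EU = -0.7 vs -3.0
```

Nothing in this sketch tells you what to believe about rain or how much to hate getting wet; it only checks whether your credences hang together and whether your choice fits them, which is exactly the point Oaksford & Chater are making.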
Pinker seems to be missing the same major point that Gigerenzer et al. continually miss, a point made by those in the heuristics and biases tradition from the beginning (e.g. Baron 1985): the distinction between normative, descriptive, and prescriptive rationality. In a paper I'm developing, I explain:
Or, here is Baron (2008):
What does mainstream academic prescriptive rationality look like? I get the sense that's where Eliezer invented a lot of his own stuff, because "mainstream cogsci" hasn't done much prescriptive work yet.