Applause lights. You should really read the sequences.
It took me a moment to realize you were writing a parody. I'm not sure whether that moment suggests EHeller is, in fact, on to something.
Anyway, on the original comment - yes, there was a little bit of tu quoque involved. How could I not? It was just too deliciously ironic. Even when accusing someone else of failing to formalize and test their ideas, it's easy to fail at formalizing and testing your own. It's not meant (entirely) as a tu quoque - just as a warning that it really is easy to fall for this, and that even consciously thinking about testability isn't enough to actually get people to make explicit predictions. So, I decided to spend a few seconds actually trying to dissect the claim, and ask what sort of testable predictions we can derive from "There also appears to be an unspoken contempt for creating novel work."*
The obvious signs would be a significant number of downvotes on posts that deal with original work, or disparaging statements toward anyone presenting work for the first time on LW. Undue or unreasonable skepticism toward novel claims, perhaps, above and beyond what is warranted by their novelty. I have no idea how to formalize this - and, in fact, the more I look at the statement, the more I am convinced it really is vague and untestable. I dunno - anyone else want to take a crack at it? EHeller, do you have something more precise that you were trying to get at?
It was an interesting exercise - even if the claim turns out to be less meaningful or reducible than I thought, it's good exercise in noticing that.
*Of course, there are good reasons why one might not want to spend time and effort formalizing and testing an idea. The statement "Lots of conjecture that such-and-such behavior may be signaling, and such-and-such belief is a result of such-and-such bias, with little discussion of how to formalize and test the idea" is weakened not only because it's imprecise, but also because formalizing and testing an idea genuinely takes effort and energy - it's not always worth it to test every idea. The entire point of having general concepts about biases is that you can quickly identify problems without having to do the math by hand each time.
The obvious signs would be a significant number of downvotes on posts that deal with original work, or disparaging statements toward anyone presenting work for the first time on LW. Undue or unreasonable skepticism toward novel claims, perhaps, above and beyond what is warranted by their novelty.
I rather doubt that. It's probably just you confusing the map with the territory.
Okay, okay, I'm done. >>
I've spent so much time in the cogsci literature that I know the LW approach to rationality is basically the mainstream cogsci approach to rationality (plus some extra stuff about, e.g., language), but... do other people not know this? Do people one step removed from Less Wrong — say, in the 'atheist' and 'skeptic' communities — not know this? If this is causing credibility problems in our broader community, it'd be relatively easy to show people that Less Wrong is not, in fact, a "fringe" approach to rationality.
For example, here's Oaksford & Chater in the second chapter of the (excellent) new Oxford Handbook of Thinking and Reasoning, the one on normative systems of rationality:
Is it meaningful to attempt to develop a general theory of rationality at all? We might tentatively suggest that it is a prima facie sign of irrationality to believe in alien abduction, or to will a sports team to win in order to increase their chance of victory. But these views or actions might be entirely rational, given suitably nonstandard background beliefs about other alien activity and the general efficacy of psychic powers. Irrationality may, though, be ascribed if there is a clash between a particular belief or behavior and such background assumptions. Thus, a thorough-going physicalist may, perhaps, be accused of irrationality if she simultaneously believes in psychic powers. A theory of rationality cannot, therefore, be viewed as clarifying either what people should believe or how people should act—but it can determine whether beliefs and behaviors are compatible. Similarly, a theory of rational choice cannot determine whether it is rational to smoke or to exercise daily; but it might clarify whether a particular choice is compatible with other beliefs and choices.
From this viewpoint, normative theories can be viewed as clarifying conditions of consistency… Logic can be viewed as studying the notion of consistency over beliefs. Probability… studies consistency over degrees of belief. Rational choice theory studies the consistency of beliefs and values with choices.
They go on to clarify that by probability they mean Bayesian probability theory, and by rational choice theory they mean Bayesian decision theory. You'll get the same account in the textbooks on the cogsci of rationality, e.g. Thinking and Deciding or Rational Choice in an Uncertain World.