Will_Newsome comments on Scope Insensitivity - Less Wrong

Post author: Eliezer_Yudkowsky, 14 May 2007 02:53AM

Comment author: Will_Newsome, 18 November 2011 12:22:45PM, 6 points

So the thing that vegetarians aren't thinking about strengthens their argument.

This is only somewhat related, as it is less true of overtly political domains, but I am confused by how often seemingly reasonable methods support naively counterintuitive conclusions over naively intuitive ones, and yet the naively intuitive conclusions ultimately win; that is, bullet biting loses to traditionalism. Mathematical or statistical arguments, for example, even solid-seeming ones, often lose in practice because they leave out important considerations that the brain's automatic algorithms do not miss.

Ironically, this is especially true in the heuristics and biases literature, where normative math is often misunderstood and experimental results are often misinterpreted. The weakness of these findings undermines the most commonly cited support for the "the world is mad" hypothesis, leaving no alternative wide-scale explanation for any perceived widespread irrationality. A lack of incentives for "rationality" in various domains remains a blanket explanation, but it can explain almost anything, and it rests on a notion of rationality that may or may not be well supported. In general, any behavior can be explained away as a response to a set of incentives that does not include objective truth.

If conclusions reached via common human intuitions or epistemic practices are generally more valid than their cited supporting arguments suggest, and if uncommon epistemic practices often lead to conclusions less valid than those practices seem to warrant, then those who use uncommon epistemic practices may be wise to be more wary of their uncommon conclusions, and more curious about possible explanations of common conclusions, than they would otherwise be. Scientism/falsificationism, Bayesianism, skepticism, and similar philosophically inspired memeplexes are examples of sources of uncommon epistemic practices.