"In a sufficiently mad world, being sane is actually a disadvantage"
– Nick Bostrom
Followup to: What is rationality?
A canon of work on "rationality" has built up on Less Wrong; in What is rationality?, I listed most of the topics and paradigms that have been used extensively here, including simple calculation and logic[1], probability theory, cognitive biases, the theory of evolution, analytic philosophical thinking, and microeconomics. I defined "Rationality" as the ability to do well on hard decision problems, often abbreviated to "winning": choosing actions that cause you to do very well.
However, I think that the rationality canon here on Less Wrong is not very good at causing the people who read it to actually do well at most of life's challenges. This is therefore a criticism of the LW canon.
If the standard by which to judge methods is whether they give you the ability to do well on a wide range of hard real-life decision problems, with a wide range of terminal values being optimized for, then Less-Wrong-style rationality fails, because the people who read it seem mostly to succeed only at the goal that most others in society would label "being a nerd".[2] We don't seem to have a broad range of people pursuing and winning at a broad range of goals (though there are a few exceptional people here).
Although the equations of probability theory and expected utility do not state that you have to be a "Spock rationalist" to use them, in reality I see more Spock than Kirk. I myself am not exempt from this critique.
What, then, is missing?
The problem, I think, is that the original motivation for Less Wrong was the bad planning decisions that society as a whole takes.[3] When society acts, it tends to benefit most under what I would call the Planning model of winning, where reward is a function of the accuracy of beliefs and the efficacy of explicitly reasoned plans.
But individuals within a society do not get their rewards solely based upon the quality of their plans: we are systematically rewarded and punished by the environment around us on the basis of:
- Our personality traits and other psychological factors such as courage, happiness set-point, self-esteem, etc.
- The group we are a member of, especially our close friends and associates.
- Our skill in dealing with people, which we might call "emotional intelligence".
- The shibboleths we display, the signals we send out (especially signaling-related beliefs) and our overall style.
The Less Wrong canon therefore pushes the people who read it to concentrate mostly on the wrong kinds of thought processes. The "planning model" of winning is useful for thinking about what people call analytical skill, which is in turn useful for solitary challenges that involve a detailed, mechanistic environment that you can manipulate. Games like Alpha Centauri and Civilization come to mind, as do computer programming, mathematics, science, and some business problems.
Most of the goals that most people hold in life cannot be achieved by this kind of analytic planning alone, but the ones that can (such as learning how to code, or how to do math or physics) are heavily overrepresented on LW. The causality probably runs both ways: people whose main skills are analytic are attracted to LW because the existing discussion is heavily focused on "nerdy" topics, and the kinds of posts that get written tend to focus on problems that fall into the planning model because that is what the posters like thinking about.
[1]: Simple calculation and logic are not usually mentioned on LW, probably because most people here are sufficiently well educated that these skills are almost completely automatic for them. In effect, it is a solved problem for the LW community. But out in the wider world, the sanity waterline is much lower: most people cannot avoid simple logical errors such as affirming the consequent (concluding A from "if A then B" and B), and cannot solve simple Fermi problems.
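To make the latter concrete, here is a minimal sketch of the back-of-the-envelope arithmetic a Fermi problem calls for, using the classic "how many piano tuners are there in Chicago?" question; every figure in it is a deliberately rough, made-up round number, not data:

```python
# A rough Fermi estimate: how many piano tuners work in Chicago?
# Every number here is a made-up round figure; only the order of
# magnitude of the answer is meant to be taken seriously.

population = 3_000_000                 # rough city population
people_per_household = 2               # rough average household size
piano_fraction = 1 / 20                # fraction of households with a piano
tunings_per_piano_per_year = 1         # each piano tuned about once a year
tunings_per_tuner_per_year = 2 * 250   # ~2 tunings/day, ~250 working days

pianos = population / people_per_household * piano_fraction
tunings_needed = pianos * tunings_per_piano_per_year
tuners = tunings_needed / tunings_per_tuner_per_year

print(f"Pianos: ~{pianos:,.0f}")        # ~75,000
print(f"Piano tuners: ~{tuners:,.0f}")  # ~150, i.e. on the order of 100
```

The point is not the particular numbers, but the habit of decomposing an unfamiliar quantity into factors you can each guess to within a factor of a few.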
[2]: I am not trying to pass judgment on the goal of being an intellectually focused, not-conventionally-socializing person: if that is what a person wants, then from their axiological point of view it is the best thing in the world.
[3]: Not paying any attention to futurist topics like cryonics or AI, which matter a lot; making dumb decisions about how to allocate charity money; and making relatively dumb decisions about how to allocate resources efficiently so as to make the distribution of human experiences better overall.
I wouldn't want to join a community that did those things, or which uncritically praised a community that did. Still, I think that even if the seduction community were an undifferentiated mass of irrationality, it would be worth discussing here for the same reasons that we talk about religion and astrology.
Personally, when I see people being successful in a certain domain (or believing that they are successful), yet holding some obviously irrational beliefs, my interest is piqued. If these people are successful, is that despite their irrational beliefs, or could it be because of those beliefs? Could it be that some of the beliefs of PUAs work even though they are not true?
I don't understand why other rationalists wouldn't be wondering the same things, even when confronted with the negative aspects of pickup. As I've argued in the past here and here, pickup relates to many rationality topics:
Perhaps I've been committing the "typical mind fallacy" by assuming that, just because these links between pickup and rationality are obvious to me, they are also obvious to others.
We appear to have a topic that has a lot of connections to rationality, some of which have been discussed here with a lot of approval, judging by upvotes. There are also people who discuss this topic in a non-rigorous way that causes feelings of repugnance in many observers. In my view, the relevance of pickup to rationality and the philosophy of science is so great that we would be throwing the baby out with the bathwater to discourage discussion of the topic. The solution is to discuss the topic in a rigorous way, with the connections to rationality made clear. When the topic is discussed in a non-rigorous and repugnance-causing way, the appropriate recourse is the reply button and the downvote button.
(Building on this earlier comment of mine.)
I appreciate your list of connections between PUA and rationality, because it's gotten me closer to working out why I don't see PUA as having a special connection to rationality.
I think it's because I find the connections you suggest generic. Most of them, I reckon, would hold for any subculture with a sufficiently active truth-seeking element, such as (picking a few examples out of thin air, so they may not be good examples, but I hope they communicate my point) poker, art valuation, or trading card gaming.