There's too much individualism in the current LessWrong rationality. I remember a folk tale I read, describing the adventures of two individuals named something like Solves-Problems-By-Himself and Asks-Others-For-Help. Given the task of preserving meat from rotting, the former shielded the meat from the sun with large leaves and dripped water on it. The latter gave away the meat in exchange for an identical piece delivered at the end of the contest.
It was sort of cultural-shock jarring to me when I read it, because "obviously" producing the "identical" piece shouldn't be counted as having preserved the original. But we have too many lone-hero-genius stories, and not enough "so-and-so was stumped so he asked his sister" sort of stories.
Afaik, LW is a spin-off from work on FAI. Since FAI needs to be gotten right the first time, it isn't surprising if LW is oriented towards planning.
One thing I haven't seen discussed is the process of generating new ideas. I was thinking about this as a result of the lurkers thread-- some people said they didn't post because anything they thought of had already been said.
I'm a lurker, and have read more posts than comments, but it seems that there actually is attention paid here to how to increase one's emotional intelligence. How to influence one's own emotions, how to help emotions correspond to reality, how to forget them when it's necessary. The same issue which (to my limited knowledge) Roman philosophers were very concerned with.
Maybe it's not explicitly applied, in a "How to Win Friends and Influence People" style, but I think the population here (myself included) tends to be turned off by that style. This blog's style is the type of style someone like me can understand and apply. But I think we need to distinguish between style (which is optimized for nerds) and substance (which is quite human and universal, as I understand, and not at all confined to making "society" better at the expense of the challenges of one's own life).
In other words, if LW isn't helping me win, then it's I who am doing something wrong.
As I wrote earlier on LW:
If you ask me, the term "instrumental rationality" has been subject to inflation. It's not supposed to mean better achieving your goals, it's supposed to mean better achieving your goals by improving your decision algorithm itself, as opposed to by improving the knowledge, intelligence, skills, possessions, and other inputs that your decision algorithm works from. Where to draw the line is a matter of judgment but not therefore meaningless.
Skills other than rationality matter a lot, and a rational person will seek to learn those skills (to the extent that they're sufficiently useful/easy), and it isn't implausible that those skills should be discussed on LW, but that doesn't mean there's something wrong with our conception of rationality.
ETA: I guess you could argue that there are different skills involved in being rational about nerd topics and being rational about non-nerd topics, and we haven't focused enough on the latter.
The group we are a member of, especially our close friends and associates.
I think who you know is probably the most important element of social and financial success. To win more, rationalists need to help each other along, for example by hiring and mentoring each other.
Seriously though, people always underrate how important this is.
I think you make some great points in this post:
Our skill in dealing with people, which we might call "emotional intelligence".
There used to be more activity in this regard, with a lot of people writing about the pick-up community and pick-up artists. Unfortunately there were complaints about women being objectified, and after that you didn't read much about this topic anymore.
Another thing is that I have the impression that LW is becoming more and more about signaling (I got this from Robin Hanson's writings) rationality as opposed to actually working on it. That is, what counts is making an elaborate post with sophisticated reasoning in order to impress, regardless of whether it can or will actually be implemented. Maybe useful if you are building a GAI, not so much if you want to improve yourself.
Not more and more. You're just becoming more socially aware. You've taken the Red Pill, now the trick is to learn to live with what you see, without becoming embittered. Because, as the name suggests, LessWrong has slightly less signalling bullshit relative to information than average for humans. That's the best you can expect, now make the most of it.
I stopped reading his stuff when I realized it was having a negative effect on how I think of women, sexuality, and my own sexual identity. (I am a hetero male).
I'm not sure that higher status (as claimed in the OP) is quite the right way to phrase it but it seems to me that women are treated differently and I am also aware that I do it myself. Some observations:
There's probably others but I hope this gives you some things to consider.
I wouldn't want to join a community of people, mostly men, describing Women as an undifferentiated mass distinguished by an array of mental flaws to be exploited for personal gain - that's a sign of some hardcore irrationality, to make claims which are that self-aggrandizing and that easily refuted.
I wouldn't want to join a community that did those things, or which uncritically praised a community that did. Still, I think that even if the seduction community were an undifferentiated mass of irrationality, it would be worth discussing here for the same reasons that we talk about religion and astrology.
Personally, when I see people being successful in a certain domain (or believing that they are successful), yet holding some obviously irrational beliefs, my interest is piqued. If these people are successful, is that despite their irrational beliefs, or could it be because of those beliefs? Could it be that some of the beliefs of PUAs work even though they are not true?
I don't understand why other rationalists wouldn't be wondering the same things, even when confronted with the negative aspects of pickup. As I've argued in the past here and here, pickup relates to many rationalit...
Would these subjects not interest you, or is your worry that discussion of them would get too far off-topic to a degree that is bad?
The second I think. (I feel about the same for topics in which I have shown interest, so it's not about my level of interest.)
If I wanted to force a conversation about a particular subculture or hot-button topic not obviously related to rationality, and I were called out on it, I could probably contrive a defensible list of ways my desired subject relates to rationality. For example, I took your list of bullet points for PUA and adapted most of them to race and IQ (a subject I'm more familiar with):
It's just hard. Which is why it's usually a bad idea to go there.
Agree with first quoted sentence. Disagree with second one.
In my view, LessWrong should be a place where we rationally attempt to discuss subjects that would be too controversial to discuss anywhere else. On LessWrong, we can hold arguments in such discussions to higher standards of scrutiny than anywhere else.
I don't agree with the "it's hard, so we should give up" approach to discussing controversial subjects on LessWrong. Controversial, mind-killing subjects are exactly where rationalist scrutiny is most needed.
I broadly agree with the feminist project and think they have done more good than harm. I also have the following criticisms:
Feminists too often mistake the complex, dynamic and context-dependent way status/power actually works for an oversimplified "patriarchy" where men as a class oppress women as a class.
This means feminism is much more sensitive to sexism against women and will routinely miss or play down sexism against men. This wouldn't be a problem except that feminism has sort of universalist aspirations; in practice it often acts more like a special interest group.
Feminism sometimes advocates political roles that can be oppressive, in much the same way gender roles can be. This is partly why the movement has had trouble embracing transgendered people, BDSM, porn stars, sex workers etc. (And why so-called 'radical' feminists still can't accept these groups.)
Despite talking a lot about intersectionality, the core feminist institutions are more like a voice for Western, white, upper-middle-class women than for women as a whole. (A criticism I feel kind of like a dick making, as I am all of those things plus a man, but it's true.)
The movement isn't a good p
I agree that we can do better.
I have a thought on these studies that give evidence for unequal intelligence between the sexes (or races.) They can have very scary, emotional connotations. They used to scare me. Then I thought about it a bit and asked "What am I scared of?" And I realized that I was scared that, if these genetic inequalities were real, I'd have to be a sexist or racist.
But think for a minute. Suppose the "worst-case scenario" were true. Suppose women really did have worse brains than men, for genetic reasons. What would be my logical response?
It occurred to me that the only responsible way to react to such news would be to treat it as a disease to be cured. And then start working on biology to fix it. I am not an anti-Semite because I'm aware that Tay-Sachs disease affects Ashkenazic Jews.
If there were genetic differences between sexes or races, I'd be less likely to favor affirmative action at the college or employment level, because it wouldn't be effective. The injustice would be biological, not social, and it would be best fixed biologically.
The real reason people are scared of genetic differences is the naturalistic fallacy. Jus...
I agree of course that females should be welcome and treated with respect,
This was the issue. The way PUA was being discussed made some women here feel unwelcome and disrespected.
If the site's purpose is rationality should it matter if there is a majority of males?
Of course not. No one expects there to ever be anything but a majority of males. But the community would be better off if the ratio wasn't as skewed as it is. Some reasons:
Gender diversity means experience diversity and neuro-diversity; these things let us catch blind spots. The fact that we are men means there will be experiences we aren't aware of, and it is helpful to have people with those experiences around to fill in the gaps. This of course goes for all kinds of socially significant diversity.
Women, on average, appear to be less confrontational and aggressive in their discussions here (I don't know if this is learned or innate). People with such demeanors are good to have around as the rest of us appear to get our egos caught up in arguments a lot.
One ostensible goal of this site is to help spread rationality. Alienating large segments of the potential convert pool is a bad idea.
The general consen
But at the same time I wonder why you would be so opposed to it?
Because politics is the mind-killer and almost every single conversation about pickup artistry immediately becomes infested with politically-charged claims. It's like that discussion about the correlation between race and intelligence that went to hell in a handbasket not long ago. If there be mines, don't go for a walk.
A good example is negotiation skills - I can't (offhand) recall a post discussing those directly.
Negotiation is generally regarded as one of the "soft" skills, and so often disregarded by thinkers of a more analytical stripe - yet we live in a world where negotiating with others who may not be as rational as you are can be a very fruitful way of advancing your personal goals.
When society acts, it tends to benefit most when it acts in what I would call the Planning model of winning, where reward is a function of the accuracy of beliefs and the efficacy of explicitly reasoned plans.
I'm not sure I agree with this. In fact I'm not quite sure what it means altogether. (What would I believe if I did in fact disagree with it?)
Could you try and clarify the contrast you're drawing here?
Spending lots of time thinking about concepts like cryonics, the Great Filter, the self-indication assumption, Omega, etc. does not lead directly to traditionally desirable life outcomes.
If we wanted to be more traditionally successful, we would have more posts on what could be termed "quotidian" rationality, topics like investing, career planning, fitness, fashion, relationships and so on. But there are many other sites/magazines/books about that stuff; it's unclear how the rationalist viewpoint could help figure out a better (for example) diet...
I definitely agree with this. A lot of my criticisms of the general kinds of things that get discussed around here (which I used to voice more often on OB) disappeared when I saw the sorts of problems they were being applied to. The "planning model of rationality" works remarkably well when applied to problems where you get to plan. My initial criticism of Bayesian methods for making decisions under uncertainty was that it doesn't work very well for most of our decisions, things like "Which path should I take across the room to retrieve m...
I think discussion of talent is generally lacking from rationality. Some clearly very irrational people are extremely successful. Sometimes it is due to luck, but even then it is usually the case that a large amount of talent was necessary to enter the lottery. With my particular combination of talents, no amount of learning the arts of rationality is going to turn me into a golfer like Tiger Woods or a media mogul like Rupert Murdoch.
The closest Roko's list comes to this sort of thing is microeconomics, which includes comparative advantage. Taking ...
Rational implementation is what we need more of. I wouldn't say planning is the "wrong kind of thought process". I'd say we have an abundance of planning tactics and a shortage of implementation tactics. Once you decide how to deal with your in-laws, how do you stay cool enough to actually do it?
The numerous posts on Akrasia are a big step in the implementation direction, though. We could use more vivid classifications of implementation problems like that, and techniques to deal with them.
"In a sufficiently mad world, being sane is actually a disadvantage"
Being sane within the usual limitations of a single person is not enough. But a much saner civilization facing a mad civilization has a big advantage per se. Guess who will likely win in a clash!
A sufficiently sane transhuman could deal with a mad civilization. The power is the sanity accumulated. Better, the rationality accumulated.
When reading the title my response was "Nothing, but there are all sorts of potential problems in the stuff that you are implicitly adding to it". The term is used here as a symbol representing a bunch of cultural mores and attitudes that are distinct from what is contained in a definition of the word. If you must use 'rationality' to describe the problems you mention here, then at least give it a capital 'R'. Much as, of the two most significant political parties in Australia, the more conservative is the 'Liberal' Party, 'Freedom Fighters' do all sorts of things not necessarily optimised for furthering freedom.
"In a sufficiently mad world, being sane is actually a disadvantage"
– Nick Bostrom
Followup to: What is rationality?
A canon of work on "rationality" has built up on Less Wrong; in What is rationality?, I listed most of the topics and paradigms that have been used extensively on Less Wrong, including: simple calculation and logic[1], probability theory, cognitive biases, the theory of evolution, analytic philosophical thinking, microeconomics. I defined "Rationality" to be the ability to do well on hard decision problems, often abbreviated to "winning" - choosing actions that cause you to do very well.
However, I think that the rationality canon here on Less Wrong is not very good at causing the people who read it to actually do well at most of life's challenges. This is therefore a criticism of the LW canon.
If the standard to judge methods by is whether they give you the ability to do well on a wide range of hard real-life decision problems, with a wide range of terminal values being optimized for, then Less-Wrong-style rationality fails, because the people who read it seem to mostly only succeed at the goal that most others in society would label as "being a nerd".[2] We don't seem to have a broad range of people pursuing and winning at a broad range of goals (though there are a few exceptional people here).
Although the equations of probability theory and expected utility do not state that you have to be a "Spock rationalist" to use them, in reality I see more Spock than Kirk. I myself am not exempt from this critique.
What, then, is missing?
The problem, I think, is that the original motivation for Less Wrong was the bad planning decisions that society as a whole takes[3]. When society acts, it tends to benefit most when it acts in what I would call the Planning model of winning, where reward is a function of the accuracy of beliefs and the efficacy of explicitly reasoned plans.
But individuals within a society do not get their rewards solely based upon the quality of their plans: we are systematically rewarded and punished by the environment around us by:
The Less Wrong canon therefore pushes people who read it to concentrate on mostly the wrong kind of thought processes. The "planning model" of winning is useful for thinking about what people call analytical skill, which is in turn useful for solitary challenges that involve a detailed mechanistic environment that you can manipulate. Games like Alpha Centauri and Civilization come to mind, as do computer programming, mathematics, science and some business problems.
Most of the goals that most people hold in life cannot be solved by this kind of analytic planning alone, but the ones that can (such as how to code, do math or physics) are heavily overrepresented on LW. The causality probably runs both ways: people whose main skills are analytic are attracted to LW because the existing discussion on LW is very focused on "nerdy" topics, and the kinds of posts that get written tend to focus on problems that fall into the planning model because that's what the posters like thinking about.
[1]: Simple calculation and logic are not usually mentioned on LW, probably because most people here are sufficiently well educated that these skills are almost completely automatic for them. In effect, it is a solved problem for the LW community. But out in the wider world, the sanity waterline is much lower. Most people cannot avoid simple logical errors such as affirming the consequent, and cannot solve simple Fermi problems.
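As a concrete illustration of the kind of Fermi problem meant here, consider the classic "how many piano tuners are there in Chicago?" estimate. Every input below is a rough assumption of mine for illustration, not real data; the point is only that multiplying a few defensible guesses gets you within an order of magnitude:

```python
# A minimal Fermi-estimate sketch: piano tuners in Chicago.
# All inputs are rough assumptions, chosen only to illustrate the method.
population = 3_000_000                  # rough Chicago population
people_per_household = 2.5
piano_fraction = 1 / 20                 # assumed share of households with a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_year = 2 * 250    # ~2 tunings/day over ~250 working days

households = population / people_per_household
pianos = households * piano_fraction
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(round(tuners))  # → 120, i.e. on the order of 10^2
```

The exact numbers don't matter; what matters is the habit of decomposing an unknown quantity into factors you can bound, which is exactly the skill the footnote claims most people lack.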
[2]: I am not trying to cast judgment on the goal of being an intellectually focused, not-conventionally-socializing person: if that is what a person wants, then from their axiological point of view it is the best thing in the world.
[3]: Not paying any attention to futurist topics like cryonics or AI, which matter a lot; making dumb decisions about how to allocate charity money; making relatively dumb decisions about how to efficiently allocate resources to make the distribution of human experiences better overall.