FormallyknownasRoko comments on Rationality is Not an Attractive Tribe - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I'm not sure I like the dichotomy between "the common man" and "us." (Surely some people on LW are "common" with reference to their income or their education level.)
The thing is, scientific knowledge, at least, used to be a lot more widespread. I once found my grandmother's 8th grade science workbook (she was a child in the 1930's) and it was shockingly advanced! And very practical: they were diagramming wells and septic tanks. And we're not talking about a child of privilege here. She was a small-town girl, the daughter of a seamstress. In those days the Army didn't have to contract so many engineering firms because ordinary soldiers, if they'd been to high school, knew a little engineering already.
Honestly, I think science vs. religion, rationality as an ideology, is no way to communicate with people who aren't already attracted to that way of thinking. But you can educate people so that they grow up doing things where rationality is important: math, science, engineering, and to some extent argumentative writing and speaking. If you know how to do that at a 1930s-era level, then in some sense it doesn't matter if your church teaches fundamentalist religion: you'll compartmentalize it, say the words dutifully, and recognize that you can't expect day-to-day life to look like that. (As opposed to making actual financial decisions on the assumption that the Rapture will come within ten years. Yes, people really do that. They should stop.)
There's no real barrier to entry into thinking and behaving sensibly. You can do that no matter what your IQ is. And science is a lot more accessible than we assume today (I think poor education quality and low work ethic are more of a barrier than IQ).
Be careful to separate one's desire to signal how politically-correct one is from genuine epistemological issues, like how to carve reality at its natural joints. Even if some people here are of average IQ, it may still be natural to identify "LWer" with "smart", in the same sense one identifies "bird" with "fly".
When you are trying to have a discussion whose output is supposed to be a plan of action (or to contribute to such), it is silly to input signalling-motivated statements into the discussion; the result of planning using statements which were emitted for signalling purposes is that the plan is likely to fail.
Ok, come to think of it, LWers have a higher IQ than the general population. That's kind of not in question.
But this article was basically saying that rationalism is unpopular because there's an IQ barrier. Most people with high IQs aren't rational either, though. You can't say "we have a rationalist worldview because we're smart" if most smart people don't.
On political correctness -- it's relevant in almost every discussion. I'd only cut it out entirely in cases when something serious was staked on my clear judgment and honest communication. If you're on the crew of a sinking ship, and you think you know how to save it ... then yeah, you'd better cut the crap. But almost every other conversation also has a social purpose. Signaling isn't something you just turn off except on the rare occasions you meet a mundane.
So by default, talk on LW should include some fraction of statements whose purpose is to signal political correctness? It seems to me that PC is a great way to make yourself irrational and horribly biased. Good for social signalling, useless for the martial art of rationality. Kind of like coming to the Dojo wearing high heels.
Some people here would say we are on a sinking ship (spaceship earth) and that this discussion is explicitly about how to save the ship by making the crew more sane. ;-) But that may be stretching the metaphor.
Well, put it this way: I think that saying "the reason most people don't agree with us is that they're just not smart enough" is a bit of a jerk thing to say. It doesn't represent us well and it's not nice. If you're absolutely convinced this is true (and I think there's not enough evidence for that), you should be much more circumspect in how you say it. Yes, I want to be pro-nice and anti-jerk.
LW does not behave like the crew of a sinking ship. As a matter of observation, it just doesn't function that way. It's partly a social or discussion forum.
Re. "the reason most people don't agree with us is that they're just not smart enough"...totally aside from the question of whether this sort of sentiment is liable to be offputting to a lot of people, I've very often wondered whether anyone who holds such a sentiment is at all worried about the consequences of an "Emperor's New Clothes" effect.
What I mean by "Emperor's New Clothes" effect is that, regardless of what a person's actual views are on a given subject (or set of subjects), there's really nothing stopping said person from just copying the favored vocabulary, language patterns, stated opinions, etc., of those they see as the cleverest/most prominent/most respectable members of a community they want to join and be accepted in.
E.g., in self-described "rationalist" communities, I've noted that lots of people involved (a) value intelligence (however they define it) highly, and (b) appear to enjoy being acknowledged as clever themselves. The easiest way to do this, of course, is to parrot others that the community of interest clearly thinks are the Smartest of the Smart. And in some situations I suspect the "parroting" can occur involuntarily, just as a result of reading a lot of the writing of someone you like, admire, or respect intellectually, even if you may not have any real, deep understanding of what you are saying.
So my question is...does anyone even care about this possibility? Or are "communities" largely in the business of collecting members and advocates who can talk the talk, regardless of what their brains are actually doing behind the scenes?
I suspect answers vary.
For my own part: if hordes of people who aren't really rationalists start adopting, for purely signaling reasons, the trappings of epistemic hygiene... if they start providing arguments in defense of their positions and admitting when those arguments are shown to be wrong, for example, not because of a genuine desire for the truth but merely because of a desire to be seen that way... if they start reliably articulating their biases and identifying the operation of biases in others, merely because that's the social norm... if they start tagging their assertions with confidence indicators and using those indicators consistently without actually having a deep-rooted commitment to avoiding implicitly overstating or understating their confidence... and so on and so forth...
...well, actually, I'd pretty much call that an unadulterated win. Sign me up for that future, please.
OTOH, if hordes of people just start talking about how smart and rational they are and how that makes them better than ordinary people, well, that's not worth much to me.
Why do I have to be "absolutely convinced"? Can't I give it a 90% credence and still say it? Or are we playing anti-epistemology and applying higher standards of evidence to statements that we find emotionally uncomfortable? Geez I leave for a few months and come back to find TEXTBOOK EXAMPLES of irrationality passing for LW debate! ;-)
Ok, at this point I say "oops."
I basically posted without thinking too carefully; I knew I didn't like something about Alexandros' post. (Now that he's rephrased it, it's starting to sound more plausible.) And I'm sorry if I came on too strong or insulted him personally.
Straight-up rational epistemology, where you're only concerned with truth, is ... a little unnatural to me, I have to admit. Doing it all the time (as opposed to just on special occasions when you heroically overcome your bias) would be a very different life.
Upvoted for honesty (we need more of this kind of thing, I think)
I think you have a good point about the image of LW -- we must be careful about how we present ourselves. That is something I tend to forget: there is, in fact, a need for a good image as well as for good rationality.
I think at this point I should clarify that the article didn't (intend to) say "the reason most people don't agree with us is that they're just not smart enough" but rather that for people without the capability to contribute to cutting-edge maths, science, AI research and the like, our worldview is not that exciting, and they may therefore refuse to be convinced. Notice that it implies possible irrationality for many of the people who have joined thus far, as they commonly belong to the classes that our worldview values to a large extent. This includes myself.
Is it a nice thing to say? I personally do not feel comfortable bringing this stuff up and would prefer it if things were different. Perhaps the fact that I feel this way made this thought stand out as more urgent to discuss than many others that I have not bothered to post. In any case, this is the reality I perceive, and I've tried to be as inoffensive as possible while at the same time phrasing a coherent point. If anyone else is capable of expressing the core of this message in a less divisive way, they're welcome to do so.
OK, clarified, it makes more sense. I just extracted the wrong main point.
It seems to me that whether LW should include signaling statements or not, it clearly does. Changing that will require some pretty heroic efforts.
But I've only been here a few weeks. You've been around and actively engaged for a while (albeit with what seems like an abrupt hiatus), so I'm interested in your judgment here.
To try and get a little more concrete: suppose you had to classify all the posts and comments on LW as +/- MAR (useful/useless for the martial art of rationality) and +/- SSA (useful/useless to signal social affiliations... including but not limited to "PC" in the mainstream sense).
What ratio of +MAR:-MAR do you think you'd end up with? What ratio of +SSA:-SSA? What ratio of +MAR:+SSA?
Are there easily identifiable subsets of posts for which you think you'd end up with ratios importantly different than those? (For example, posts by particular authors, top-level posts vs. comments, or whatever.)
What ratios would you expect from a community that was actively trying to "save a sinking ship"?
To be more concrete about value for "Martial Art of Rationality" I think one would need a scalar measure, rather than a Yes/No. If you chose a Yes/No measure, your results would be very much dependent on where you drew the line. To answer the question you are asking in terms of ratios of +MAR:+SSA is really to talk about the distribution of posts in terms of rationality versus signalling.
I think that LW contains a few very very good posters, a cadre of very good posters, a few oddballs and a sea of people who sorta-kinda understand some stuff. And in many cases, it does contain posts and comments which shun rationality in order to thump the table in favour of some particular ideology or just general political correctness, which I have run foul of once or twice.
The problem, as I see it, is that if ideology gets to rule the roost on a whole host of important topics, then for those topics, LW becomes just another irrational, tribal internet echo chamber, where affirming the great chosen ideology becomes the most important task. However, I have been convinced that the benefit of having LW existing AT ALL outweighs this cost. To be plain, I think that the LW consensus is actually (factually) wrong about many things that real people deal with in the real world, but this is outweighed by the fact that LW is the only place in the world concerned with rationality.
Voted up for citing posts rather than posters for lack of rationality.
I think I've seen some table-thumping for political incorrectness as well as for political correctness.
So what you are saying here is that ideology is always irrational? Since this is a community blog devoted to refining rationality, why don't you address the particular points you believe do constitute the irrational consensus? Or are you saying that some individuals here know that their ideology is irrational yet shun rationality in favor of it? That could be better phrased as there are people here who follow selfish goals and argue based on matters of taste. But how do you know that those people are aware of it, that they do not honestly believe that their disguised ideology is actually rationality?
I would love to know which kind of posts and comments, and in particular what consensus, you are referring to. This is very important to me, so if you don't want to make it public I would like you to send me a short private message.
[Linked comment] AND [linked comment] are examples of the kind of thing that I would regard as the problem. Other examples are even more inflammatory, but basically they all boil down to:
Person1: X is a true fact about the world
Person2: But saying X is mean to {political correctness brownie points group Y}, and besides, you can't be absolutely sure it's true/I won't believe it until you provide an impossibly high degree of evidence/we should stop talking about it or people will think we are mean!
The result is typically that LW can recite lots of rationalist principles, but when it comes to applying them to a significant number of real-world problems, LW is clueless.
Now one might reasonably argue that we don't need to be right about everything. Sure, LW is mired in PC BS about X, Y, and Z, but topics A, B, C, D, E, ... which are also important are not subject to PC irrationality pressure. To an extent I buy this argument. However, reality is not a disconnected series of isolated topics: if you're wrong about X, Y, and Z you might make incorrect inferences about all sorts of other things.
For some of us, being perceived as nice is one of the most important ways we can help ourselves in real life. And the best way to be consistently perceived as nice is to be constantly concerned with the niceness of the statements one makes, and seriously try to avoid giving offense. If I became blunt and plain-spoken it would hurt me in real life. It would not be worth it to me. Except in very rare situations (such as if I'm personally responsible for saving lives, and I have to be deliberately rude to do it.)
In the interests of rationality, I'll refrain in future from criticizing un-PC statements because they're "not nice." I don't want to confuse anyone. But I can't make those statements myself -- that comes at a cost I won't pay.
Do you ordinarily find that you have difficulty maintaining different registers (1) for different contexts?
If so, I sympathize and wish you luck in overcoming that difficulty. It is an enormously useful skill: as you say, being perceived as nice is valuable, and different communities perceive different kinds of behavior as nice, so you do best to learn to signal appropriately for different contexts (2).
If you don't have difficulty with this, though, then your comment puzzles me. If you agree with FKARoko that LW norms support "un-PC" (3) posts, or in any event ought to, then what are you concerned about... what's the cost? Conversely, if you don't agree with him, why refrain from criticizing un-PC statements... what confusion?
==
(1) I mean "register" in the linguistics sense.
(2) Unless, of course, you spend all your time in only one community.
(3) Caveat: I don't really understand what "PC" means; I'm using the term because it's the term you and FKARoko both use. I gather you use it here as synonymous with "nice," although in my own experience niceness often has more to do with how a statement is framed than what is actually being said.
Not just for some, for all of us. It is in everyone's narrow self-interest to sacrifice epistemology for signalling purposes. And in real life one has to do that. But here at least, I think that we should establish the opposite norm.
Look, what is the point of you trying to appear PC on LW? I for one am just not impressed. I already know that you're from a certain demographic that implies lots of good things about you. But it implies bad things about you if you can't turn off the signalling BS in a context where it is socially very harmful, i.e., a rationality website.
Throughout the Sequences it has been made clear that there usually is some local incentive for motivated cognition. Wanting to appear PC is no different: it's just another reason that people have for blowing their thought process up, with all the usual downsides, e.g., the downside that you often simply don't know what the cost will be, because you would only be able to compute the cost of the motivated cognition if you were not engaging in it. Suffice it to say that I think we should have very strong norms against motivated cognition here on LW.
They got the SIAI funded.
The genome of the Ebola virus is a true fact about organisms. Yet it is dumb to state it on a microbiology forum. Besides, "if you don't agree you are dumb" is a statement that has to be backed by exceptional amounts of evidence. People who already disagree can only be convinced by evidence if they are not intelligent enough to grasp the arguments.
There are cases where data or ideas can be really hazardous. I don't count "but it might hurt somebody's precious feelings" as one of those cases.
2+2=4; if you don't agree, you are dumb.
A belief is irrational if you use irrational methods of thinking to obtain it. I consider most irrational beliefs to be the result of ignorance of or incompetence in the methods of rationality, rather than selfishness or malice. (I guess we could argue about whether anti-epistemology is an example of incompetence or of willful going-astray.)
I can't speak for Roko, but I imagine that on Less Wrong, almost all failures of rationality are the result of incompetence.
If I'm not able to understand my failure, I still want to know if someone thinks I am incompetent. I won't be able to understand how the person arrived at this conclusion, if it is due to a lack of intelligence on my side, but I'll be able to allow for the possibility and take it into account if I ever get stuck trying to reach a goal. So if someone honestly believes that I am too dumb, he/she should say so, and I won't perceive it as an insult. I just want to stress this point because he claimed that some posts and comments, and the LW consensus about many things that real people deal with in the real world, are actually (factually) wrong. He has to tell me what he means, because I'm not sure, yet it is very important to know.
I understand what you're saying qualitatively; I was trying to get at your quantitative estimates. The numbers will constrain your optimal strategy for extracting value from the site.
For example, if for every "good" post there are N "table-thumping" ones and N=20, it's difficult-but-possible to find the "good" stuff. If N=200, it's effectively impossible. If N=2, it's pretty easy.
Conversely, at N=2 it is perhaps worth trying to convince the 2/3 majority to behave differently (the way you seem to be doing, sort of), but at N=20 you probably do better to figure out ways to flag the "good" 5%, concentrate your attention there, and allow the "table-thumpers" to play around on the less-valuable periphery in the hopes that maybe we'll be inspired by your good example. (At N=200 you probably do better to create a different site where the top .5% of LW-contributions can be hosted.)
You're right, of course, that this is a very imprecise way of talking about it. Given that I'm just asking about your off-the-cuff judgments rather than the results of your actual measurements, that seemed appropriate.
Qualitative or quantitative measures of the value of LW are an interesting thing to think about. AFAIK we don't have any at the moment.