Rationality is Not an Attractive Tribe

13 Post author: Alexandros 23 November 2010 02:08PM

Summary: I wonder how attractive rationality as a tribe and worldview is to the average person, when the competition is not constrained by verifiability or consistency and is therefore able to optimize around offering imaginary status superstimuli to its adherents.


Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'

— Isaac Asimov

I've long been puzzled by people's capacity to reject obvious conclusions and opt for convoluted arguments that boil down to logical fallacies when it comes to defending a belief they have a stake in. When someone resists doing the math, despite an obvious capability to do so in other similar cases, we are right to suspect external factors at play. A framework that seems congruent with the evolutionary history of our species is that of beliefs as signals of loyalty to a tribe. Such a framework would explain the rejection of evolution and other scientific theories by large swathes of the world's population, especially religious populations, despite access to a flood of evidence in support.

I will leave support of the tribal signalling framework to others, and examine the consequences for popular support of rationality and science if such a framework does successfully approximate reality. The best way I can do that is by examining one popular alternative: the Christian religion I am most familiar with, in particular its evangelical Protestant branch. I am fairly confident that this narrative can be ported to other branches of Christianity and the Abrahamic faiths fairly easily, and that equivalents for other large religions can be constructed with some extra effort.

"Blessed are the meek, for they will inherit the earth"

— The Bible (New International Version), Matthew 5:5

What is the narrative that an evangelical Christian buys into regarding their own status? They belong to the 'chosen people', worshipping a god that loves them personally, created them with special care, and has a plan for their individual lives. They are taking part in a battle with absolute evil, which represents everything disgusting and despicable and is manifested in the various difficulties they face in their lives. The end-game, however, is known. The believers, once their faith is tested in this world, are destined for an eternity of bliss with their brethren in the presence of their god, while the enemies will be thrown into the eternal fire for eternal torment. In this narrative, the disadvantaged in this life are very important. There exist propositions which can be held with absolute certainty. This presents a black-and-white divide in which moral judgements are easy, as long as corner cases can be swept under the rug. Each and every person, regardless of their social standing or capability, can be of utmost importance. Everyone can potentially save a soul for all eternity! In fact, the gospels place emphasis on the humble and the meek:

So those who are last now will be first then, and those who are first will be last.

— The Bible (New International Version), Matthew 20:16

What is the rational alternative to this battle-hardened, well-optimized worldview? That there is no grand narrative. If such a narrative exists (pursuit of truth, combating existential risk, <insert yours here>), the stars of this narrative are those blessed with intelligence and education such that they can digest the background material and push these pursuits at the cutting edge. It turns out your enemies are not innately evil, either. You may have just misunderstood each other. You have to constantly struggle against your own biases, to no certain outcome. In fact, we are to hold no proposition with 100% certainty. On the plus side, science and rationality offer, or at least aspire to offer, a consistent worldview free from cognitive dissonance for those who can detect the alternative's contradictions. On the status side, for those of high intelligence, it puts them at the top of the hierarchy, in the line of the great heroes of thought that have gone before, uncovering all the knowledge we have so far. But it is not hard to perceive this as elitism, especially since the barriers to entry are difficult, if not impossible, to overcome for the vast majority of humans. Rationality may have an edge if it can be shown to improve an individual's life prospects. I am not aware of such research, especially research that untangles rationality from intelligence. Perhaps the most successful example, pick-up artists, is off-limits for this community because their terminal values are deemed offensive. While we define rationality as the way to win, the win we focus on in this community is a collective one, and therefore unlikely to confer high status on an individual in the meantime if that individual does not belong to the intellectually gifted few.

So what does rationality have to offer the common man to gain their support? The role of hard-working donor, whose contribution is in a replaceable commodity, e.g. money? The role of passive consumer of scientific products and documentaries? It seems to me that in the marketplace of worldview-tribes, rationality and science do not present themselves as an attractive option to large swathes of the earth's population, and why would they? They were never developed as such. To make things worse, the alternatives have millennia of cultural evolution to better fit their targets, unconstrained by mundane burdens such as verifiability and consistency. I can easily see the attraction of the 'rational irrationality' point of view, where someone compartmentalises rationality into result-sensitive 'get things done' areas, while choosing to affirm unverifiable and/or incoherent propositions that nevertheless superstimulate their feel-good status receptors.

I see two routes here: The one is that we decide that popular support is not necessary. We focus our recruiting efforts on the upper strata of intelligence and influence. If it's a narrative that they need, we can't help them. We're in the business of providing raw truth. Humans are barely on the brink of general intelligence, anyway. A recent post claimed that an IQ of 130+ was practically a prerequisite for appreciating and comprehending the sequences. The truths are hard to grasp and inconvenient, but ultimately it doesn't matter if a narrative can be developed for the common person. They can keep believing in creationism, and we'll save humanity for them anyway.

On the other hand, just because the scientific/rational worldview has not been fitted to the common man, it doesn't mean it can't be done. (But there is no guarantee that it can.) The alternative is to explore the open avenues that may lead to a more palatable narrative, including popularising many of the rationality themes articulated in this community. People show interest when I speak to them about cognitive biases, but I have no accessible resources to give them that would start from there as a beachhead and progress into other, more esoteric topics. And I don't find it incredible that rationality could provably aid individual outcomes; we just need solid research around the proposition. (The effects of various valleys of bad rationality, or shifts in terminal values due to rationality exposure, may complicate that.)

I am not taking a position on which course of action is superior, nor claiming that these are the only alternatives. But it does seem to me that, if my reasoning and assumptions are correct, we have to make up our minds about what exactly it is we want to do as the Less Wrong community.

Edit/Note: I initially planned for this to be posted as a normal article, but seeing how the voting is... equal in both directions, but that there is a lively discussion developing, I think this article is just fine in the discussion section.

Comments (105)

Comment author: [deleted] 23 November 2010 03:01:34PM *  12 points [-]

I'm not sure I like the dichotomy between "the common man" and "us." (Surely some people on LW are "common" with reference to their income or their education level.)

The thing is, scientific knowledge, at least, used to be a lot more widespread. I once found my grandmother's 8th grade science workbook (she was a child in the 1930s) and it was shockingly advanced! And very practical: they were diagramming wells and septic tanks. And we're not talking about a child of privilege here. She was a small-town girl, the daughter of a seamstress. In those days the Army didn't have to contract so many engineering firms because ordinary soldiers, if they'd been to high school, knew a little engineering already.

Honestly I think science vs. religion, rationality as an ideology, is no way to communicate with people who aren't already attracted to that way of thinking. But you can educate people so that they grow up doing things where rationality is important. Math, science, engineering, and to some extent argumentative writing and speaking. If you know how to do that at a 1930s-era level, then in some sense it doesn't matter if your church teaches fundamentalist religion: you'll compartmentalize it, say the words dutifully, and recognize that you can't expect day-to-day life to look like that. (As opposed to making actual financial decisions on the assumption that the Rapture will come within ten years. Yes, people really do that. They should stop.)

There's no real barrier to entry into thinking and behaving sensibly. You can do that no matter what your IQ is. And science is a lot more accessible than we assume today (I think poor education quality and low work ethic are more of a barrier than IQ).

Comment author: Vaniver 23 November 2010 07:13:05PM *  7 points [-]

The problem, though, is that engineers are the most dangerous, from a rationality standpoint. (Edit- David_Gerard found it; I was thinking of Engineers and woo on rationalwiki.)

Essentially, engineers are good at compartmentalizing, but they're not good at figuring out truth. And so prominent creationists are almost all engineers. The 9/11 hijackers were almost all engineers. Engineers gone wrong are people who can execute a plan but can't prove a theorem- they use stuff because it works. And so when a fundamental, unquestioned belief that they've seen work is under attack, their impulse is not the scientist's impulse of "hm, let's see what's going on here."

That said, I do think increased science literacy would do a lot of people a lot of good.

Comment author: David_Gerard 23 November 2010 07:29:41PM *  5 points [-]

Reason as memetic immune disorder on LessWrong; Engineers and woo on RationalWiki. (The latter is mediocre and one day I'll get around to making it better.) Salem Hypothesis: "In any Evolution vs. Creation debate, A person who claims scientific credentials and sides with Creation will most likely have an Engineering degree."

The problem you describe is that engineers can get away with all manner of quite remarkable crankery as long as their engineering works.

One should keep in mind that the cranks are exceptional. Most engineers are perfectly normal geeks who respect science and mathematics as things that work independently of what humans think of them. However, engineer arrogance about fields not their own - and by extension, technologists in general - is stereotypical for a reason, and can easily slip into not understanding what the heck you're pontificating on. Biologist impatience with transhumanists' assertions is almost standard, for example.

Most don't take it as far as Andrew Schlafly of Conservapedia, who, despite having been an electrical engineer before he studied law, is deeply suspicious of the concept of complex numbers.

Comment author: JoshuaZ 23 November 2010 05:34:48PM 6 points [-]

How much of the focus on things like wells and septic tanks has been removed not due to dumbing down but due to increased specialization? I don't need to know how to make a septic tank or the like. And as technologies become more and more complicated it becomes more difficult for any person to know how all of them work in detail. Finally, having specialized individuals allows one to more efficiently use the law of comparative advantage.

Comment author: [deleted] 23 November 2010 05:43:31PM 6 points [-]

I think that may be part of an explanation; but while there are positive consequences to specialization, one of the negative consequences is that fewer people know basic science. That's probably bad for rationality.

Comment author: FormallyknownasRoko 24 November 2010 07:01:58PM *  2 points [-]

I'm not sure I like the dichotomy between "the common man" and "us."

Be careful to separate one's desire to signal how politically-correct one is from genuine epistemological issues, like how to carve reality at its natural joints. Even if some people here are of average IQ, it may still be natural to identify "LWer" with "smart", in the same sense one identifies "bird" with "fly".

When you are trying to have a discussion whose output is supposed to be a plan of action (or to contribute to such), it is silly to input signalling-motivated statements into the discussion; the result of planning using statements which were emitted for signalling purposes is that the plan is likely to fail.

Comment author: [deleted] 24 November 2010 08:12:59PM 4 points [-]

Ok, come to think of it, LWers have a higher IQ than the general population. That's kind of not in question.

But this article was basically saying that rationalism is unpopular because there's an IQ barrier. Most people with high IQs aren't rational either, though. You can't say "we have a rationalist worldview because we're smart" if most smart people don't.

On political correctness -- it's relevant in almost every discussion. I'd only cut it out entirely in cases when something serious was staked on my clear judgment and honest communication. If you're on the crew of a sinking ship, and you think you know how to save it ... then yeah, you'd better cut the crap. But almost every other conversation also has a social purpose. Signaling isn't something you just turn off except on the rare occasions you meet a mundane.

Comment author: FormallyknownasRoko 24 November 2010 08:16:53PM *  2 points [-]

If you're on the crew of a sinking ship, and you think you know how to save it ... then yeah, you'd better cut the crap. But almost every other conversation also has a social purpose. Signaling isn't something you just turn off except on the rare occasions you meet a mundane.

So by default, talk on LW should include some fraction of statements whose purpose is to signal political correctness? It seems to me that PC is a great way to make yourself irrational and horribly biased. Good for social signalling, useless for the martial art of rationality. Kind of like coming to the dojo wearing high heels.

Some people here would say we are on a sinking ship (spaceship earth) and that this discussion is explicitly about how to save the ship by making the crew more sane. ;-) But that may be stretching the metaphor.

Comment author: [deleted] 24 November 2010 08:27:57PM 3 points [-]

Well, put it this way: I think that saying "the reason most people don't agree with us is that they're just not smart enough" is a bit of a jerk thing to say. It doesn't represent us well and it's not nice. If you're absolutely convinced this is true (and I think there's not enough evidence for that), you should be much more circumspect in how you say it. Yes, I want to be pro-nice and anti-jerk.

LW does not behave like the crew of a sinking ship. As a matter of observation, it just doesn't function that way. It's partly a social or discussion forum.

Comment author: AnneC 06 December 2010 10:27:51PM 2 points [-]

Re. "the reason most people don't agree with us is that they're just not smart enough"...totally aside from the question of whether this sort of sentiment is liable to be offputting to a lot of people, I've very often wondered whether anyone who holds such a sentiment is at all worried about the consequences of an "Emperor's New Clothes" effect.

What I mean by "Emperor's New Clothes" effect is that, regardless of what a person's actual views are on a given subject (or set of subjects), there's really nothing stopping said person from just copying the favored vocabulary, language patterns, stated opinions, etc., of those they see as the cleverest/most prominent/most respectable members of a community they want to join and be accepted in.

E.g., in self-described "rationalist" communities, I've noted that lots of people involved (a) value intelligence (however they define it) highly, and (b) appear to enjoy being acknowledged as clever themselves. The easiest way to do this, of course, is to parrot others that the community of interest clearly thinks are the Smartest of the Smart. And in some situations I suspect the "parroting" can occur involuntarily, just as a result of reading a lot of the writing of someone you like, admire, or respect intellectually, even if you may not have any real, deep understanding of what you are saying.

So my question is...does anyone even care about this possibility? Or are "communities" largely in the business of collecting members and advocates who can talk the talk, regardless of what their brains are actually doing behind the scenes?

Comment author: TheOtherDave 07 December 2010 03:39:37AM 3 points [-]

I suspect answers vary.

For my own part: if hordes of people who aren't really rationalists start adopting, for purely signaling reasons, the trappings of epistemic hygiene... if they start providing arguments in defense of their positions and admitting when those arguments are shown to be wrong, for example, not because of a genuine desire for the truth but merely because of a desire to be seen that way... if they start reliably articulating their biases and identifying the operation of biases in others, merely because that's the social norm... if they start tagging their assertions with confidence indicators and using those indicators consistently without actually having a deep-rooted commitment to avoiding implicitly overstating or understating their confidence... and so on and so forth...

...well, actually, I'd pretty much call that an unadulterated win. Sign me up for that future, please.

OTOH, if hordes of people just start talking about how smart and rational they are and how that makes them better than ordinary people, well, that's not worth much to me.

Comment author: FormallyknownasRoko 24 November 2010 09:28:40PM *  2 points [-]

"the reason most people don't agree with us is that they're just not smart enough" is a bit of a jerk thing to say. .... If you're absolutely convinced this is true

Why do I have to be "absolutely convinced"? Can't I give it a 90% credence and still say it? Or are we playing anti-epistemology and applying higher standards of evidence to statements that we find emotionally uncomfortable? Geez I leave for a few months and come back to find TEXTBOOK EXAMPLES of irrationality passing for LW debate! ;-)

Comment author: [deleted] 24 November 2010 10:17:32PM 5 points [-]

Ok, at this point I say "oops."

I basically posted without thinking too carefully; I knew I didn't like something about Alexandros' post. (Now that he's rephrased it it's starting to sound more plausible.) And I'm sorry if I came on too strong or insulted him personally.

Straight-up rational epistemology, where you're only concerned with truth, is ... a little unnatural to me, I have to admit. Doing it all the time (as opposed to just on special occasions when you heroically overcome your bias) would be a very different life.

Comment author: FormallyknownasRoko 24 November 2010 10:40:44PM 1 point [-]

Upvoted for honesty (we need more of this kind of thing I think)

I think you have a good point about the image of LW -- we must be careful how we present ourselves. That is something I tend to forget: there is, in fact, a need for a good image as well as for good rationality.

Comment author: Alexandros 24 November 2010 08:45:45PM *  1 point [-]

I think at this point I should clarify that the article didn't (intend to) say "the reason most people don't agree with us is that they're just not smart enough", but rather that for people without the capability to contribute to cutting-edge maths, science, AI research and the like, our worldview is not that exciting, and they may therefore refuse to be convinced. Notice that it implies possible irrationality for many of the people who have joined thus far, as they commonly belong to the classes that our worldview values to a large extent. This includes myself.

Is it a nice thing to say? I personally do not feel comfortable bringing this stuff up and would prefer it if things were different. Perhaps the fact that I feel this way made this thought stand out as more urgent to discuss than many others that I have not bothered to post. In any case, this is the reality I perceive, and I've tried to be as inoffensive as possible while at the same time phrasing a coherent point. If anyone else is capable of expressing the core of this message in a less divisive way, they're welcome to do so.

Comment author: [deleted] 24 November 2010 08:47:30PM 1 point [-]

ok, clarified it makes more sense. I just extracted the wrong main point.

Comment author: TheOtherDave 25 November 2010 04:17:01AM 0 points [-]

It seems to me that whether LW should include signaling statements or not, it clearly does. Changing that will require some pretty heroic efforts.

But I've only been here a few weeks. You've been around and actively engaged for a while (albeit on what seems like it was an abrupt hiatus), so I'm interested in your judgment here.

To try and get a little more concrete: suppose you had to classify all the posts and comments on LW as +/- MAR (useful/useless for the martial art of rationality) and +/- SSA (useful/useless to signal social affiliations... including but not limited to "PC" in the mainstream sense).

What ratio of +MAR:-MAR do you think you'd end up with? What ratio of +SSA:-SSA? What ratio of +MAR:+SSA?

Are there easily identifiable subsets of posts for which you think you'd end up with ratios importantly different than those? (For example, posts by particular authors, top-level posts vs. comments, or whatever.)

What ratios would you expect from a community that was actively trying to "save a sinking ship"?

Comment author: FormallyknownasRoko 25 November 2010 01:00:00PM *  3 points [-]

To be more concrete about value for "Martial Art of Rationality" I think one would need a scalar measure, rather than a Yes/No. If you chose a Yes/No measure, your results would be very much dependent on where you drew the line. To answer the question you are asking in terms of ratios of +MAR:+SSA is really to talk about the distribution of posts in terms of rationality versus signalling.

I think that LW contains a few very very good posters, a cadre of very good posters, a few oddballs and a sea of people who sorta-kinda understand some stuff. And in many cases, it does contain posts and comments which shun rationality in order to thump the table in favour of some particular ideology or just general political correctness, which I have run foul of once or twice.

The problem, as I see it, is that if ideology gets to rule the roost on a whole host of important topics, then for those topics, LW becomes just another irrational, tribal internet echo chamber, where affirming the great chosen ideology becomes the most important task. However, I have been convinced that the benefit of having LW existing AT ALL outweighs this cost. To be plain, I think that the LW consensus is actually (factually) wrong about many things that real people deal with in the real world, but this is outweighed by the fact that LW is the only place in the world concerned with rationality.

Comment author: NancyLebovitz 25 November 2010 02:55:38PM *  3 points [-]

Voted up for citing posts rather than posters for lack of rationality.

I think I've seen some table-thumping for political incorrectness as well as for political correctness.

Comment author: XiXiDu 25 November 2010 02:17:32PM 1 point [-]

And in many cases, it does contain posts and comments which shun rationality in order to thump the table in favour of some particular ideology or just general political correctness, which I have run foul of once or twice.

So what you are saying here is that ideology is always irrational? Since this is a community blog devoted to refining rationality, why don't you address the particular points you believe do constitute the irrational consensus? Or are you saying that some individuals here know that their ideology is irrational yet shun rationality in favor of it? That could be better phrased as there are people here who follow selfish goals and argue based on matters of taste. But how do you know that those people are aware of it, that they do not honestly believe that their disguised ideology is actually rationality?

I would love to know which kind of posts and comments, and in particular what consensus, you are referring to. This is very important to me, so if you don't want to make it public I would like you to send me a short private message.

Comment author: FormallyknownasRoko 25 November 2010 06:28:05PM 4 points [-]

I'm not sure I like the dichotomy between "the common man" and "us."

AND

saying "the reason most people don't agree with us is that they're just not smart enough" is a bit of a jerk thing to say.

are examples of the kind of thing that I would regard as the problem. Other examples are even more inflammatory, but basically they all boil down to:

Person1: X is a true fact about the world

Person2: But saying X is mean to {political correctness brownie points group Y}, and besides, you can't be absolutely sure it's true/I won't believe it until you provide an impossibly high degree of evidence/we should stop talking about it or people will think we are mean!

The result is typically that LW can recite lots of rationalist principles, but when it comes to applying them to a significant number of real-world problems, LW is clueless.

Now one might reasonably argue that we don't need to be right about everything. Sure, LW is mired in PC BS about X, Y and Z, but topics A, B, C, D, E, ... which are also important are not subject to PC irrationality pressure. To an extent I buy this argument. However, reality is not a disconnected series of isolated topics: if you're wrong about X, Y and Z you might make incorrect inferences about all sorts of other things.

Comment author: [deleted] 25 November 2010 11:50:10PM 6 points [-]

For some of us, being perceived as nice is one of the most important ways we can help ourselves in real life. And the best way to be consistently perceived as nice is to be constantly concerned with the niceness of the statements one makes, and seriously try to avoid giving offense. If I became blunt and plain-spoken it would hurt me in real life. It would not be worth it to me. Except in very rare situations (such as if I'm personally responsible for saving lives, and I have to be deliberately rude to do it.)

In the interests of rationality, I'll refrain in future from criticizing un-PC statements because they're "not nice." I don't want to confuse anyone. But I can't make those statements myself -- that comes at a cost I won't pay.

Comment author: TheOtherDave 26 November 2010 09:00:29AM 2 points [-]

Do you ordinarily find that you have difficulty maintaining different registers(1) for different contexts?

If so, I sympathize and wish you luck in overcoming that difficulty. It is an enormously useful skill: as you say, being perceived as nice is valuable, and different communities perceive different kinds of behavior as nice, so you do best to learn to signal appropriately for different contexts (2).

If you don't have difficulty with this, though, then your comment puzzles me. If you agree with FKARoko that LW norms support "un-PC" (3) posts, or in any event ought to, then what cost are you concerned about... what's the cost? Conversely, if you don't agree with him, why refrain from criticizing un-PC statements... what confusion?

==

(1) I mean "register" in the linguistics sense.

(2) Unless, of course, you spend all your time in only one community.

(3) Caveat: I don't really understand what "PC" means; I'm using the term because it's the term you and FKARoko both use. I gather you use it here as synonymous with "nice," although in my own experience niceness often has more to do with how a statement is framed than what is actually being said.

Comment author: FormallyknownasRoko 26 November 2010 08:08:52AM *  2 points [-]

being perceived as nice is one of the most important ways we can help ourselves in real life

not just for some, for all of us. It is in everyone's narrow self interest to sacrifice epistemology for signalling purposes. And in real life one has to do that. But here at least, I think that we should establish the opposite norm.

Look, what is the point of you trying to appear PC on LW? I for one am just not impressed. I already know that you're from a certain demographic that implies lots of good things about you. But it implies bad things about you if you can't turn off the signalling BS in a context where it is socially very harmful, i.e. a rationality website.

Throughout the sequences it has been made clear that there usually is some local incentive for motivated cognition. Wanting to appear PC is no different: it's just another reason people have for blowing their thought process up, with all the usual downsides, e.g. the downside that you often simply don't know what the cost will be, because you would only be able to compute the cost of the motivated cognition if you were not engaging in it. Suffice it to say that I think we should have very strong norms against motivated cognition here on LW.

Comment author: XiXiDu 25 November 2010 07:30:26PM *  2 points [-]

The result is typically that LW can recite lots of rationalist principles, but when it comes to applying them to a significant number of real-world problems, LW is clueless.

They got the SIAI funded.

Person1: X is a true fact about the world

Person2: But saying X is mean

The genome of the Ebola virus is a true fact about organisms. Yet it is dumb to state it on a microbiology forum. Besides, "if you don't agree you are dumb" is a statement that has to be backed by exceptional amounts of evidence. People who already disagree can only be convinced by evidence if they are not intelligent enough to grasp the arguments.

Comment author: FormallyknownasRoko 25 November 2010 07:38:57PM 1 point [-]

The genome of the Ebola virus is a true fact about organisms. Yet it is dumb to state it on a microbiology forum.

There are cases where data or ideas can be really hazardous. I don't count "but it might hurt somebody's precious feelings" as one of those cases.

Comment author: FormallyknownasRoko 25 November 2010 07:37:04PM 0 points [-]

"if you don't agree you are dumb" is a statement that has to be backed by exceptional amounts of evidence.

2+2=4 if you don't agree you are dumb.

Comment author: Nisan 25 November 2010 03:54:01PM 0 points [-]

A belief is irrational if you use irrational methods of thinking to obtain it. I consider most irrational beliefs to be the result of ignorance of or incompetence in the methods of rationality, rather than selfishness or malice. (I guess we could argue about whether anti-epistemology is an example of incompetence or of willful going-astray.)

I can't speak for Roko, but I imagine that on Less Wrong, almost all failures of rationality are the result of incompetence.

Comment author: XiXiDu 25 November 2010 04:13:51PM *  0 points [-]

I imagine that on Less Wrong, almost all failures of rationality are the result of incompetence.

Even if I'm not able to understand my failure, I still want to know if someone thinks I am incompetent. I won't be able to understand how the person arrived at that conclusion if it is due to a lack of intelligence on my side, but I'll be able to allow for the possibility and take it into account if I ever get stuck trying to reach a goal. So if someone honestly believes that I am too dumb, he/she should say so, and I won't perceive it as an insult. I stress this point because he claimed that some posts, comments, and the LW consensus about many things that real people deal with in the real world are actually (factually) wrong. He has to tell me, because I'm not sure what he means, yet it is very important to know.

Comment author: TheOtherDave 26 November 2010 11:24:55PM 0 points [-]

I understand what you're saying qualitatively; I was trying to get at your quantitative estimates. The numbers will constrain your optimal strategy for extracting value from the site.

For example, if for every "good" post there are N "table-thumping" ones and N=20, it's difficult-but-possible to find the "good" stuff. If N=200, it's effectively impossible. If N=2, it's pretty easy.

Conversely, at N=2 it is perhaps worth trying to convince the 2/3 majority to behave differently (the way you seem to be doing, sort of), but at N=20 you probably do better to figure out ways to flag the "good" 5%, concentrate your attention there, and allow the "table-thumpers" to play around on the less-valuable periphery in the hopes that maybe we'll be inspired by your good example. (At N=200 you probably do better to create a different site where the top .5% of LW-contributions can be hosted.)

You're right, of course, that this is a very imprecise way of talking about it. Given that I'm just asking about your off-the-cuff judgments rather than the results of your actual measurements, that seemed appropriate.

Comment author: FormallyknownasRoko 26 November 2010 11:41:29PM *  1 point [-]

Qualitative or quantitative measures of the value of LW are an interesting thing to think about. AFAIK we don't have any at the moment.

Comment author: David_Gerard 23 November 2010 05:49:31PM *  0 points [-]

There was a post I can't find which addressed this point: that rationality has worked so well that society can now support a vast number of people acting irrationally. And so rationality or irrationality is no longer a matter of survival or even practicality, but of social signaling: you get ahead in many social environments more with irrationality than with rationality.

tl;dr: IT PAYS TO BE STUPID.

Anyone got the link to hand? It should go in this essay.

Comment author: Alexandros 24 November 2010 05:59:29PM 1 point [-]

That would be extremely interesting to read (though it sets my mind going in all sorts of depressing directions). If anyone can track this down, I'd be very interested in reading it.

Comment author: Alexandros 23 November 2010 03:27:20PM *  0 points [-]

Thanks for the feedback.

The division between 'us' and 'the common man' is along the lines of raw intelligence and education in the broad sense (not the narrow academic sense). I am not comfortable with it myself but there seems to be evidence that there is some barrier to entry, especially if you want to contribute to advancing the state of the art. (I cannot locate the article where Eliezer set out the requirements for someone helping to program the friendly AI)

Also, I am not speaking of rationality as a set of techniques, but as a worldview that is informed by atheism and the findings of science at the very least. I should perhaps make this more clear in the article or choose another term.

Comment author: [deleted] 23 November 2010 03:48:45PM 3 points [-]

Is this what you were looking for? If so, I don't think it should be included in your post. Eliezer was talking about being a Seed AI programmer, not a rational thinker. You certainly don't have to be a supergenius to try to improve your own rationality.

Comment author: Nick_Tarleton 24 November 2010 07:09:45AM 1 point [-]

Not to mention, that piece is years (not sure how many) out of date.

Comment author: Alexandros 24 November 2010 05:45:52PM 1 point [-]

I don't think Eliezer's requirements have been revised to anything significantly (relative to the general population) more inclusive. Not making a negative judgement on this, he may well be right to do so. But I'm fairly confident this is the case.

Comment author: Alexandros 23 November 2010 03:57:11PM 0 points [-]

Thanks. As I said, this is a barrier to contributing on the cutting edge. More appropriate, however, is this article by Louie, citing IQ 130+ and an "NT" (Rational) MBTI type as prerequisites for understanding the sequences.

Comment author: luminosity 23 November 2010 10:58:24PM 3 points [-]

Except that, really, there is no evidence presented there that you need either of these prerequisites to understand the sequences. They're just criteria that Louie arbitrarily decided were important.

Comment author: Alexandros 24 November 2010 05:53:39PM *  0 points [-]

I think he'd argue the LW reader surveys that show a concentration on the extreme upper region of the intelligence spectrum justify his claim.

Now, I think that the people willing and able to comprehend the sequences are fewer than the people willing and able to comprehend rationality, but the question is how much return on investment there is in working to reach more and more of the people in the second group who are not in the first.

Comment author: Perplexed 23 November 2010 05:32:43PM 2 points [-]

Also, I am not speaking of rationality as a set of techniques, but as a worldview that is informed by atheism and the findings of science at the very least.

Then perhaps the word you want is "skepticism" rather than "rationalism". Rationalists, as I understand it, do not define themselves by a worldview (and definitely not a worldview 'informed' by doctrines and findings). An atheism embraced so as to become a member of a community is as anathema to a rationalist as would be a theism embraced for the same reason.

Comment author: Alexandros 23 November 2010 05:38:34PM *  0 points [-]

Alas, skepticism fits even less, as it is merely an outlook. In this community however, atheism is treated as an open-and-shut issue and I suspect most would say that they expect a rational person after considering the evidence on both sides to come down on the side of atheism. After all, the latest survey showed that LWers willing to fill in a survey were 80% atheist. Perhaps I should clarify that I mean weak (no belief a god exists) atheism, not strong (belief no god exists) atheism.

Regardless, nothing should be 'embraced so as to become a member of a community', including vanilla rationality (scientific method? Bayes?). That is a fundamental conflict of interest that all communities face, and in many cases are destroyed by. This is exactly why things like the 'existential risk career network' scare me quite a bit, especially if they become known as ways to get a lucrative job.

Comment author: Jack 23 November 2010 06:34:28PM 5 points [-]

Perhaps I should clarify that I mean weak (no belief a god exists) atheism, not strong (belief no god exists) atheism.

Not if you're going to endorse Bayes in the next sentence you shouldn't :-)

Comment author: Perplexed 24 November 2010 02:45:03PM 2 points [-]

I'm not sure I understand this. Could you clarify? Are you saying that a true Bayesian doesn't think there is a distinction? That a wise Bayesian will be neither kind of atheist?

Comment author: Jack 24 November 2010 04:17:19PM *  2 points [-]

So Bayesian epistemology doesn't actually make use of the word 'belief', instead we just assign probabilities to hypotheses. You don't believe or not believe, you just estimate p. So the distinction isn't really intelligible. I guess one could interpret weak atheist as implying a higher probability of God's existence than a strong atheist... but it doesn't obviously translate that way and isn't something a Bayesian would say.

Comment author: Perplexed 24 November 2010 04:55:40PM 2 points [-]

Got it. Thx.

I suppose someone could claim that a strong atheist actually sets P(God) = 0, whereas a weak atheist sets P(God) = some small epsilon. But then a Bayesian shouldn't become a strong atheist.
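The arithmetic behind that last point can be made concrete. A minimal sketch (not from the thread; the function name and numbers are illustrative): Bayes' rule shows why a prior of exactly 0 can never be revised upward by any evidence, while an epsilon prior can still respond to it.

```python
# Illustrative sketch: a prior of 0 is immovable under Bayes' rule,
# while an epsilon prior updates on evidence -- the reason a Bayesian
# shouldn't set P(God) = 0.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(H|E) from P(H), P(E|H), and P(E|~H)."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# Evidence 1000x more likely if the hypothesis is true:
print(bayes_update(0.0, 0.999, 0.001))   # stays exactly 0.0
print(bayes_update(1e-9, 0.999, 0.001))  # rises to roughly 1e-6
```

However strong the evidence, the zero prior multiplies it away; the epsilon prior rises a thousandfold.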

Comment author: komponisto 24 November 2010 03:22:11PM 0 points [-]

See here.

Comment author: Perplexed 24 November 2010 03:41:01PM 0 points [-]

I'm not sure I understand this. Could you clarify?

I'm not looking to start an argument here. I don't need to hear reasons. I just want to know what Jack meant when he responded to "Perhaps I should clarify ..." with "Not if you are going to endorse Bayes."

Comment author: Perplexed 24 November 2010 02:42:14PM 0 points [-]

As to whether "skepticism" names a worldview, an outlook, or some pieces of a methodology - apparently there is some current controversy on that.

Comment author: NancyLebovitz 23 November 2010 03:22:13PM *  8 points [-]

Rationality is not the same as intelligence, and I'm hoping that one of the spin-offs from Less Wrong is finding less challenging ways to explain how to use the knowledge and the brains you've got.

Keeping an eye out for exceptions to what you think you know and considering what those exceptions might mean isn't a complicated idea.

Neither is the idea that everyone does things for reasons which make sense to them.

Internalizing such ideas may be emotionally challenging, but that's a different problem.

Comment author: Alexandros 23 November 2010 03:36:36PM 2 points [-]

In large part I'm dealing with this instinctive/emotional barrier of not only adopting counterintuitive beliefs, but also leaving the worldview you inhabit for another, which may be much less developed.

I do think it's possible to boil down the material to a simpler form. There was a time when the Pythagorean theorem was the pinnacle of human thought, no doubt beyond the reach of the average person. Same for Newton's work. Perhaps it takes a long time for cultural digestion of such concepts to find their accessible forms? Perhaps culture itself is hindering people from grasping what is ultimately simple? Or maybe the newest findings of QM etc. really are beyond the reach of certain people?

Comment author: NancyLebovitz 23 November 2010 04:46:50PM *  3 points [-]

There's a difference between understanding the latest findings of QM and rationality.

I wonder to what extent people give up on thinking because of an educational system which discourages them. As far as I can tell, thinking isn't really taught-- the ability to think (and memorize and comply) is rewarded, which is a very different matter.

I think you've got rationality and intelligence bundled together too tightly, though I agree that there are probably thresholds of intelligence needed for particular insights.

And I'm pretty sure that one of the reasons rationality has a bad rep is the unnecessary "but you aren't smart enough to play" attitude that sometimes comes with it.

In large part I'm dealing with this instinctive/emotional barrier of not only adopting counterintuitive beliefs, but also leaving the worldview you inhabit for another, which may be much less developed.

There was a recent post (sorry, no time to hunt it down) about how to evaluate new ideas that look cool so that you don't accidentally screw up your life. This should definitely be taught as part of rationality.

Comment author: Vaniver 23 November 2010 07:01:51PM 0 points [-]

Perhaps it takes a long time for cultural digestion of such concepts to find their accessible forms?

Intellectual development doesn't seem to be a matter of time so much as it is man-hours, if that division makes sense. I suspect that if Eliezer was a psychologist and/or educator instead of a computer scientist, we would be looking at a "rationality for the everyman" project instead of SIAI / LessWrong.

So, what we need is for someone to take the problem of "how do I explain rationality to actual people with limited time" and work at it. HP:MoR is a start (it's explaining rationality to a subset of Harry Potter fans, at least) but it's not set up to give the right feedback.

Comment author: Perplexed 23 November 2010 05:17:35PM 7 points [-]

Perhaps the most successful example, Pick-up artists, are out of limits for this community because their terminals are deemed offensive.

"out of limits for this community"? Huh? PUA fans are a significant and vocal part of this community. Sure, they receive some criticism, but so do cryonics advocates, utilitarians, believers in the 'scary idea', and one-boxers. There is no consensus on terminal values here, nor even, beyond a vague Bayesianism, on the algorithms and methods of rationality. Only an agreement that such things are important and a loose sense of community in helping each other to explore these questions.

So what does rationality have to offer to the common man to gain their support?

I think this is a good question. Most inspirational rationalist manifestos appeal more to intellectuals than to your 'common man'. I would like to see more attempts to answer this question. My own suggestion would be to focus on the Litany of Tarski. Our slogan should be "Don't let yourself be treated as a fool." The trouble is that the conspiracy theory salesmen have already established a strong foothold in this market.

Comment author: Alexandros 23 November 2010 05:29:54PM *  0 points [-]

I recall there was a PUA-related fuss early in the site's life and the outcome was for articles to be discouraged (though not outright banned). Can anyone confirm or deny this?

Comment author: Perplexed 23 November 2010 05:44:20PM 1 point [-]

I'm relatively new here, so I can neither confirm nor deny. But it seems that you are saying that practical articles like this one are discouraged. Why would anyone discourage that? :)

Comment author: Larks 23 November 2010 07:30:59PM 7 points [-]

There was, in the Summer of '09. This post is quite good, as Alicorn presents a good summary, and links to other prominent top-level posts.

Essentially, a lot of words were spake in anger, and PUA seemed to be the mind-killer. Eventually, everyone dropped the topic, but I'm not sure anyone really won. If you'll forgive the crass labels, the anti-PUA side semi-banished PUA from LW, but the pro-PUA side ensured that a lot of people still find it useful/etc., and simply refrain from discussing it out of concern for the peace.

Comment author: Perplexed 24 November 2010 01:14:42AM 3 points [-]

Thanks.

<tiptoes away quietly>

Comment author: nerzhin 23 November 2010 06:57:19PM 6 points [-]

We all enjoy beating up on the silly evangelical Christians, but that is dangerous. Let's try to be just a bit more charitable.

What is the narrative that an evangelical Christian buys into regarding their own status? [...] They are taking part in a battle with absolute evil, that represents everything disgusting and despicable, which is manifested in the various difficulties they face in their lives. [...] This presents a black-white divide in which moral judgements are easy.

Christians view their status as sinful. There isn't (usually) some battle between the perfect Christian and an external "absolute evil". Instead, the battle is primarily internal, and the Christian views their own motives and actions with suspicion. Because the motives are always suspect, there is no clear black-white divide and moral judgments are never easy.

This is not too terribly different from what you describe as the rationalist struggle:

You have to constantly struggle to fight your own biases, to no certain outcome.

Comment author: Vaniver 23 November 2010 07:27:24PM 5 points [-]

I think most people make a division between "Christianity done normally" and "Christianity done well", just like one can make that division for rationality. I agree with you that they should be explicit that they're talking about the stereotype of "standard" Christians instead of "correct" Christians.

Because when you look at "standard" Christians, the "we're all sinners" is generally used as an excuse, not a motivation. "Hey, you can't expect me to be perfect!" Instead of actually improving, you just have to want to improve.

Indeed, I might even separate Christianity done well into "Christian rationalism" or something similar, because the similarities are rather strong.

Comment author: Desrtopa 24 November 2010 03:49:49PM 1 point [-]

If we're talking about evangelical Christians, the prevailing view is that they are sinful, but forgiven. They believe that they're not perfect, but that their imperfections have already been excused.

This gives us two points on which rationality fails to be as appealing. First, evangelical Christians don't have to doubt their understanding: they believe they know what it would look like if they were perfect, although they lack the fortitude to achieve it. Second, they have a forgiving entity to appeal to when they get things wrong.

Comment author: FormallyknownasRoko 23 November 2010 07:42:50PM *  4 points [-]

I see two routes here: ... We focus our recruiting efforts on the upper strata of intelligence and influence ... the common person can keep believing in creationism, and we'll save humanity for them anyway ...

On the other hand ... explore the open avenues that may lead to a more palatable narrative, including popularising many of the rationality themes

For a long time I have been thinking that both of these options are so awful that we should think very hard about third alternatives before we analyse a fixed set of solutions.

Comment author: FormallyknownasRoko 23 November 2010 07:43:53PM *  4 points [-]

Just thinking out loud: it would be really nice if you could sort of "abandon the idiots to live in the hell that their irrationality creates", and have a country that was just high-IQ types (speaking of which, my IQ is less than 130, and I think most here agree that I did, in fact, understand the sequences). Rationality correlates with IQ (though weakly), but society seems to be very much a nonlinear aggregator of individual traits. I don't know how plausible it is to create a high-IQ high rationality country. You would want to start with a country that was already on that track, e.g. Japan, Israel, South Korea.

EDIT: this is probably an even more awful idea than the two presented in the post. I just wanted to open the floor to third alternatives.

Comment author: Carinthium 23 November 2010 11:10:57PM 2 points [-]

The most plausible route (to the extent one is possible) to that conclusion would be to buy a couple of islands for a select elite, then expand from there.

Comment author: FormallyknownasRoko 24 November 2010 09:40:49AM *  1 point [-]

I heard that the seasteading folk have looked into that already and apparently governments are loth to give up actual sovereignty, though of course they will happily take your $ in exchange for "ownership", where "ownership" means that the island is still subject to their laws.

My "suggestion" isn't really very good, I really just made it to make clear what kind of thing a third alternative would look like.

Comment author: Carinthium 24 November 2010 11:08:50AM 0 points [-]

I know that; I said it was the most plausible. A government in severe financial trouble and willing to sell off a minor island is more likely than a successful revolt or persuading people to accept a rationalist government.

Comment author: [deleted] 24 November 2010 12:38:24PM 1 point [-]

All of these options are assuming the need to be tied to land, where present governments and populations hold sway and pose difficulties. However, there is much open and unowned space at sea. Even if we ignore future projects for self-sustaining cities floating in the oceans and what-not, the basic idea of a community of people living on ships which spend most of their time in open water strikes me as a relatively plausible solution (i.e. on about the same level as the other proposals). It seems to me like The World is a useful proof of concept here. Now, if a bunch of ships could sail together with a population large enough that it might be considered a moving country...

Comment author: FormallyknownasRoko 24 November 2010 06:27:41PM *  3 points [-]

The problem I have with this, and my objection to seasteading in general, is that living at sea is really difficult. So difficult that I think the disadvantage levied by seasteading will swamp the advantage(s) of whatever else you are trying to do, e.g. rationality, libertarianism, etc etc.

And if, at some stage, seasteading becomes manageably difficult, then everyone will quickly get in on it, and you could be squeezed out of the game by existing nations that have big navies and want your sea-space.

If you think that existing nations wouldn't stamp on you and take all the sea-space, then why do you think they are so loth to sell and give up sovereignty over even the worst and most useless pieces of land that they own?

Comment author: XiXiDu 25 November 2010 12:25:45PM *  1 point [-]

...speaking of which, my IQ is less than 130, and I think most here agree that I did, in fact, understand the sequences...

Isn't the only person able to judge this the person who wrote the sequences? Or at least someone who was told that he/she understands the sequences by the one who wrote them.

"abandon the idiots to live in the hell that their irrationality creates"

Humans have a tendency for self-inflicted malaise. Humans value the freedom to do what they want more than doing what they would want to do from a retrospective alternative point of view that values their extrapolated volition. But why would it be rational to choose the future over the present? Hell is that which you don't want at present not what you might not want if you were a different person.

Comment author: FormallyknownasRoko 25 November 2010 01:04:26PM 0 points [-]

But why would it be rational to choose the future over the present?

It isn't rational to choose the future over the present. It isn't irrational either. Time discount is a free parameter to be chosen according to axiology. At the least, it would be nice if all the low-time-discount people could get together in one place, and leave all the high-discount people together somewhere else.

Comment author: Desrtopa 24 November 2010 03:34:45PM 1 point [-]

At the risk of causing this to devolve into a discussion of politics, I'm not clear on how Japan, Israel, or South Korea are supposed to look like good startup points in this regard.

Comment author: FormallyknownasRoko 24 November 2010 06:23:10PM 0 points [-]

The idea was just that they are more than averagely interested in tech, and Israel also has a higher-than-average IQ because of the Ashkenazi Jews.

Comment author: Desrtopa 24 November 2010 06:38:37PM *  1 point [-]

These qualities don't seem to have allowed any of them to run their countries exceptionally well. If you want to construct a high rationality country, I don't think that any of these three would be particularly good starting points.

Japan at least manages to rank highly on many societal health indices, but their educational system tends to emphasize rote memorization and prestigious affiliation over original thinking and personal achievement.

Between countries, differences in mean intelligence are almost certainly going to be trivial in comparison with differences in social values. You'd want to look, not at which countries tend to have smarter people, but which promote more rationalist-friendly values.

Comment author: FormallyknownasRoko 24 November 2010 06:53:15PM 0 points [-]

Any ideas for that?

Comment author: Desrtopa 24 November 2010 06:58:16PM 2 points [-]

My off the cuff answer is that you might want to try looking at various Scandinavian countries, but since I don't think the basic idea is particularly plausible, it's not something I've invested a great deal of consideration in.

Comment author: juliawise 24 August 2011 04:25:21PM 0 points [-]

Scandinavia has a problematic culture of humility: Jante Law

Comment author: Emile 24 November 2010 07:31:11AM 2 points [-]

Start a religion you don't really believe in but that pushes humanity in the right direction.

Comment author: Vaniver 23 November 2010 07:22:32PM 3 points [-]

I agree with you that the "you must be this smart to ride" signs in LW comments are problematic. I'm not sure they're wrong, though. I've never taken an IQ test, but from other tests am at least 98th percentile (which is ~130). Most of the stuff I've read on LW I either agree with naturally or have thought out reasons why I disagree with it- but a lot of that comes from my reflectivity and speed of thought / reading. So LW can be a hobby for me, whereas it would be a massive time investment for someone else.

And, honestly, when I think about "what class would I love to add to the curriculum," I wouldn't start with rationality or skepticism. Critical thinking is a useful skill, but it's dangerous: oftentimes, half-educated is worse than not. I would start with non-violence: teaching people methods they can use to get along better with others. That seems like the highest-impact skill you could teach people.

But I still would love to see a "rationality for the everyman" project. You seem to be talking about creating a rationality narrative- which may be problematic for the reasons you outline- but a rationality skillset should be doable.

Comment author: urbanespaceman 23 November 2010 03:20:52PM 0 points [-]

Minor proofreading correction: Second to last para: People show interest(ed)

I can't help thinking that route 1 dooms us, since the planet isn't run by the intellectual elite. Indeed, I'm fairly sure I read a study recently which showed that the intellectual elite shy away from politics - not because they don't want to be involved, but because it is now more about who can make the mud stick than about the issues. That would mean the world would be steered by the sub-90 IQs rather than the 130-plus IQs. So "popular" support is necessary, to ensure that rational solutions actually get to see the light of day!

Comment author: Emile 23 November 2010 04:05:11PM 4 points [-]

That would mean the world would be steered by the sub-90 IQs rather than the 130-plus IQs.

I would be very surprised if that was actually the case.

Comment author: orthonormal 23 November 2010 06:14:27PM 3 points [-]

I'd heard, rather, that your average national-level politician is smart but not too smart: between 120 and 140, typically. There's indirect selection for intelligence in campaigns. But yes, there does seem to be some kind of selection against politics for the very smart.

Comment author: Alexandros 23 November 2010 03:31:33PM 0 points [-]

Thanks, typo fixed.