Summary: I wonder how attractive rationality, as a tribe and worldview, is to the average person, when the competition is not constrained by verifiability or consistency and is therefore able to optimize around offering imaginary status superstimuli to its adherents.


Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'

— Isaac Asimov

I've long been puzzled by people's capacity to reject obvious conclusions and opt for convoluted arguments that boil down to logical fallacies when defending a belief they have a stake in. When someone resists doing the math, despite an obvious capability to do so in other similar cases, we are right to suspect external factors at play. A framework that seems congruent with the evolutionary history of our species is that of beliefs as signals of loyalty to a tribe. Such a framework would explain the rejection of evolution and other scientific theories by large swathes of the world's population, especially religious populations, despite access to a flood of supporting evidence.

I will leave support of the tribal signalling framework to others, and examine the consequences for popular support of rationality and science if such a framework does successfully approximate reality. The best way I can do that is by examining one popular alternative: the Christian religion, which I am most familiar with, in particular its evangelical Protestant branch. I am fairly confident that this narrative can be ported to other branches of Christianity and the Abrahamic faiths fairly easily, and that equivalents for other large religions can be constructed with some extra effort.

"Blessed are the meek, for they will inherit the earth"

— The Bible (New International Version), Matthew 5:5

What is the narrative that an evangelical Christian buys into regarding their own status? They belong to the 'chosen people', worshipping a god that loves them personally, created them with special care, and has a plan for their individual lives. They are taking part in a battle against absolute evil, which represents everything disgusting and despicable, and which is manifested in the various difficulties they face in their lives. The endgame, however, is known. The believers, once their faith is tested in this world, are destined for an eternity of bliss with their brethren in the presence of their god, while the enemies will be thrown into the eternal fire for eternal torment. In this narrative, the disadvantaged in this life are very important. There exist propositions which can be held with absolute certainty, which yields a black-and-white divide in which moral judgements are easy, as long as corner cases can be swept under the rug. Each and every person, regardless of their social standing or capability, can be of utmost importance. Everyone can potentially save a soul for all eternity! In fact, the gospels place emphasis on the humble and the meek:

So those who are last now will be first then, and those who are first will be last.

— The Bible (New Living Translation), Matthew 20:16

What is the rational alternative to this battle-hardened, well-optimized worldview? That there is no grand narrative. If such a narrative exists (pursuit of truth, combating existential risk, <insert yours here>), the stars of this narrative are those blessed with the intelligence and education to digest the background material and pursue these goals at the cutting edge. It turns out your enemies are not innately evil, either. You may have just misunderstood each other. You have to constantly struggle to fight your own biases, to no certain outcome. In fact, we are to hold no proposition with 100% certainty. On the plus side, science and rationality offer, or at least aspire to offer, a consistent worldview free from cognitive dissonance for those who can detect the alternative's contradictions. On the status side, it places those of high intelligence at the top of the hierarchy, in the line of the great heroes of thought who have gone before, uncovering all the knowledge we have so far. But this is not hard to perceive as elitism, especially since the barriers to entry are difficult, if not impossible, to overcome for the vast majority of humans. Rationality may have an edge if it can be shown to improve an individual's life prospects. I am not aware of such research, especially research that untangles rationality from intelligence. Perhaps the most successful example, pick-up artists, are off limits for this community because their terminal values are deemed offensive. While we define rationality as the way to win, the win we focus on in this community is a collective one, and is therefore unlikely to confer high status on an individual in the meantime unless that individual belongs to the intellectually gifted few.

So what does rationality have to offer the common man to gain their support? The role of hard-working donor, whose contribution is a replaceable commodity, e.g. money? The role of passive consumer of scientific products and documentaries? It seems to me that in the marketplace of worldview-tribes, rationality and science do not present an attractive option to large swathes of the earth's population, and why would they? They were never developed as such. To make things worse, the alternatives have millennia of cultural evolution behind them, fitting them ever better to their targets, unconstrained by mundane burdens such as verifiability and consistency. I can perfectly well see the attraction of the 'rational irrationality' point of view, where someone compartmentalises rationality into result-sensitive 'get things done' areas, while choosing to affirm unverifiable and/or incoherent propositions that nevertheless superstimulate one's feel-good status receptors.

I see two routes here. The first is that we decide that popular support is not necessary. We focus our recruiting efforts on the upper strata of intelligence and influence. If it's a narrative that they need, we can't help them; we're in the business of providing raw truth. Humans are barely on the brink of general intelligence, anyway. A recent post claimed that an IQ of 130+ was practically a prerequisite for appreciating and comprehending the sequences. The truths are hard to grasp and inconvenient, but ultimately it doesn't matter whether a narrative can be developed for the common person. They can keep believing in creationism, and we'll save humanity for them anyway.

On the other hand, just because the scientific/rational worldview has not been fitted to the common man, it doesn't mean it can't be done. (But there is no guarantee that it can.) The alternative is to explore the open avenues that may lead to a more palatable narrative, including popularising many of the rationality themes that are articulated in this community. People show interest when I speak to them about cognitive biases, but I have no accessible resources to give them that would start from there as a beachhead and progress into other, more esoteric topics. And I don't find it incredible that rationality could provably aid in better individual outcomes; we just need solid research around the proposition. (The effects of various valleys of bad rationality, or shifts in terminal values due to rationality exposure, may complicate that.)

I am not taking a position on which course of action is superior, or claiming that these are the only alternatives. But it does seem to me that, if my reasoning and assumptions are correct, we have to make up our minds about what exactly it is we want to do as the Less Wrong community.

Edit/Note: I initially planned for this to be posted as a normal article, but seeing that the voting is... equal in both directions, while a lively discussion is developing, I think this article is just fine in the discussion section.

107 comments
15[anonymous]13y

I'm not sure I like the dichotomy between "the common man" and "us." (Surely some people on LW are "common" with reference to their income or their education level.)

The thing is, scientific knowledge, at least, used to be a lot more widespread. I once found my grandmother's 8th grade science workbook (she was a child in the 1930s) and it was shockingly advanced! And very practical: they were diagramming wells and septic tanks. And we're not talking about a child of privilege here. She was a small-town girl, the daughter of a seamstress. In those days the Army didn't have to contract so many engineering firms, because ordinary soldiers, if they'd been to high school, knew a little engineering already.

Honestly, I think science vs. religion, rationality as an ideology, is no way to communicate with people who aren't already attracted to that way of thinking. But you can educate people so that they grow up doing things where rationality is important: math, science, engineering, and to some extent argumentative writing and speaking. If you know how to do that at a 1930s-era level, then in some sense it doesn't matter if your church teaches fundament...

8Vaniver13y
The problem, though, is that engineers are the most dangerous, from a rationality standpoint. (Edit: David_Gerard found it; I was thinking of Engineers and woo on RationalWiki.) Essentially, engineers are good at compartmentalizing, but they're not good at figuring out truth. And so prominent creationists are almost all engineers. The 9/11 hijackers were almost all engineers. Engineers gone wrong are people who can execute a plan but can't prove a theorem: they use stuff because it works. And so when a fundamental, unquestioned belief that they've seen work is under attack, their impulse is not the scientist's impulse of "hm, let's see what's going on here." That said, I do think increased science literacy would do a lot of people a lot of good.
8David_Gerard13y
Reason as memetic immune disorder on LessWrong; Engineers and woo on RationalWiki. (The latter is mediocre and one day I'll get around to making it better.) Salem Hypothesis: "In any Evolution vs. Creation debate, a person who claims scientific credentials and sides with Creation will most likely have an Engineering degree." The problem you describe is that engineers can get away with all manner of quite remarkable crankery as long as their engineering works. One should keep in mind that the cranks are exceptional. Most engineers are perfectly normal geeks who respect science and mathematics as things that work independently of what humans think of them. However, engineers' arrogance about fields not their own (and, by extension, technologists' in general) is stereotypical for a reason, and can easily slip into not understanding what the heck you're pontificating on. Biologists' impatience with transhumanists' assertions is almost standard, for example. Most don't take it as far as Andrew Schlafly of Conservapedia, who, despite having been an electrical engineer before he studied law, is deeply suspicious of the concept of complex numbers.
6JoshuaZ13y
How much of the focus on things like wells and septic tanks has been removed not due to dumbing down but due to increased specialization? I don't need to know how to make a septic tank or the like. And as technologies become more and more complicated it becomes more difficult for any person to know how all of them work in detail. Finally, having specialized individuals allows one to more efficiently use the law of comparative advantage.
7[anonymous]13y
I think that may be part of an explanation; but while there are positive consequences to specialization, one of the negative consequences is that fewer people know basic science. That's probably bad for rationality.
3Roko13y
Be careful to separate one's desire to signal how politically correct one is from genuine epistemological issues, like how to carve reality at its natural joints. Even if some people here are of average IQ, it may still be natural to identify "LWer" with "smart", in the same sense one identifies "bird" with "fly". When you are trying to have a discussion whose output is supposed to be a plan of action (or to contribute to one), it is silly to feed signalling-motivated statements into the discussion; a plan built on statements which were emitted for signalling purposes is likely to fail.
5[anonymous]13y
OK, come to think of it, LWers have a higher IQ than the general population. That's kind of not in question. But this article was basically saying that rationalism is unpopular because there's an IQ barrier. Most people with high IQs aren't rational either, though. You can't say "we have a rationalist worldview because we're smart" if most smart people don't. On political correctness: it's relevant in almost every discussion. I'd only cut it out entirely in cases where something serious was staked on my clear judgment and honest communication. If you're on the crew of a sinking ship, and you think you know how to save it... then yeah, you'd better cut the crap. But almost every other conversation also has a social purpose. Signaling isn't something you just turn off except on the rare occasions you meet a mundane.
4Roko13y
So by default, talk on LW should include some fraction of statements whose purpose is to signal political correctness? It seems to me that PC is a great way to make yourself irrational and horribly biased. Good for social signalling, useless for the martial art of rationality. Kind of like coming to the dojo wearing high heels. Some people here would say we are on a sinking ship (spaceship Earth) and that this discussion is explicitly about how to save the ship by making the crew more sane. ;-) But that may be stretching the metaphor.
3[anonymous]13y
Well, put it this way: I think that saying "the reason most people don't agree with us is that they're just not smart enough" is a bit of a jerk thing to say. It doesn't represent us well and it's not nice. If you're absolutely convinced this is true (and I think there's not enough evidence for that), you should be much more circumspect in how you say it. Yes, I want to be pro-nice and anti-jerk. LW does not behave like the crew of a sinking ship. As a matter of observation, it just doesn't function that way. It's partly a social or discussion forum.
4Roko13y
Why do I have to be "absolutely convinced"? Can't I give it a 90% credence and still say it? Or are we playing anti-epistemology and applying higher standards of evidence to statements that we find emotionally uncomfortable? Geez, I leave for a few months and come back to find TEXTBOOK EXAMPLES of irrationality passing for LW debate! ;-)
6[anonymous]13y
OK, at this point I say "oops." I basically posted without thinking too carefully; I knew I didn't like something about Alexandros' post. (Now that he's rephrased it, it's starting to sound more plausible.) And I'm sorry if I came on too strong or insulted him personally. Straight-up rational epistemology, where you're only concerned with truth, is... a little unnatural to me, I have to admit. Doing it all the time (as opposed to just on special occasions when you heroically overcome your bias) would be a very different life.
1Roko13y
Upvoted for honesty (we need more of this kind of thing, I think). You have a good point about the image of LW; we must be careful how we present ourselves. That is something I tend to forget: there is, in fact, a need for a good image as well as for good rationality.
3AnneC13y
Re. "the reason most people don't agree with us is that they're just not smart enough"...totally aside from the question of whether this sort of sentiment is liable to be offputting to a lot of people, I've very often wondered whether anyone who holds such a sentiment is at all worried about the consequences of an "Emperor's New Clothes" effect. What I mean by "Emperor's New Clothes" effect is that, regardless of what a person's actual views are on a given subject (or set of subjects), there's really nothing stopping said person from just copying the favored vocabulary, language patterns, stated opinions, etc., of those they see as the cleverest/most prominent/most respectable members of a community they want to join and be accepted in. E.g., in self-described "rationalist" communities, I've noted that lots of people involved (a) value intelligence (however they define it) highly, and (b) appear to enjoy being acknowledged as clever themselves. The easiest way to do this, of course, is to parrot others that the community of interest clearly thinks are the Smartest of the Smart. And in some situations I suspect the "parroting" can occur involuntarily, just as a result of reading a lot of the writing of someone you like, admire, or respect intellectually, even if you may not have any real, deep understanding of what you are saying. So my question is...does anyone even care about this possibility? Or are "communities" largely in the business of collecting members and advocates who can talk the talk, regardless of what their brains are actually doing behind the scenes?
4TheOtherDave13y
I suspect answers vary. For my own part: if hordes of people who aren't really rationalists start adopting, for purely signaling reasons, the trappings of epistemic hygiene... if they start providing arguments in defense of their positions and admitting when those arguments are shown to be wrong, for example, not because of a genuine desire for the truth but merely because of a desire to be seen that way... if they start reliably articulating their biases and identifying the operation of biases in others, merely because that's the social norm... if they start tagging their assertions with confidence indicators and using those indicators consistently without actually having a deep-rooted commitment to avoiding implicitly overstating or understating their confidence... and so on and so forth... ...well, actually, I'd pretty much call that an unadulterated win. Sign me up for that future, please. OTOH, if hordes of people just start talking about how smart and rational they are and how that makes them better than ordinary people, well, that's not worth much to me.
2Alexandros13y
I think at this point I should clarify that the article didn't (intend to) say "the reason most people don't agree with us is that they're just not smart enough", but rather that for people without the capability to contribute to cutting-edge maths, science, AI research and the like, our worldview is not that exciting, and they may therefore refuse to be convinced. Notice that this implies possible irrationality on the part of many of the people who have joined thus far, as they commonly belong to the classes that our worldview values to a large extent. This includes myself. Is it a nice thing to say? I personally do not feel comfortable bringing this stuff up and would prefer it if things were different. Perhaps the fact that I feel this way made this thought stand out as more urgent to discuss than many others that I have not bothered to post. In any case, this is the reality I perceive, and I've tried to be as inoffensive as possible while at the same time phrasing a coherent point. If anyone else is capable of expressing the core of this message in a less divisive way, they're welcome to do so.
1[anonymous]13y
OK, clarified, it makes more sense. I just extracted the wrong main point.
0TheOtherDave13y
It seems to me that whether LW should include signaling statements or not, it clearly does. Changing that will require some pretty heroic efforts. But I've only been here a few weeks. You've been around and actively engaged for a while (albeit with what seems like an abrupt hiatus), so I'm interested in your judgment here. To try and get a little more concrete: suppose you had to classify all the posts and comments on LW as +/- MAR (useful/useless for the martial art of rationality) and +/- SSA (useful/useless to signal social affiliations... including but not limited to "PC" in the mainstream sense). What ratio of +MAR:-MAR do you think you'd end up with? What ratio of +SSA:-SSA? What ratio of +MAR:+SSA? Are there easily identifiable subsets of posts for which you think you'd end up with ratios importantly different from those? (For example, posts by particular authors, top-level posts vs. comments, or whatever.) What ratios would you expect from a community that was actively trying to "save a sinking ship"?
5Roko13y
To be more concrete about value for "Martial Art of Rationality" I think one would need a scalar measure, rather than a Yes/No. If you chose a Yes/No measure, your results would be very much dependent on where you drew the line. To answer the question you are asking in terms of ratios of +MAR:+SSA is really to talk about the distribution of posts in terms of rationality versus signalling. I think that LW contains a few very very good posters, a cadre of very good posters, a few oddballs and a sea of people who sorta-kinda understand some stuff. And in many cases, it does contain posts and comments which shun rationality in order to thump the table in favour of some particular ideology or just general political correctness, which I have run foul of once or twice. The problem, as I see it, is that if ideology gets to rule the roost on a whole host of important topics, then for those topics, LW becomes just another irrational, tribal internet echo chamber, where affirming the great chosen ideology becomes the most important task. However, I have been convinced that the benefit of having LW existing AT ALL outweighs this cost. To be plain, I think that the LW consensus is actually (factually) wrong about many things that real people deal with in the real world, but this is outweighed by the fact that LW is the only place in the world concerned with rationality.
3NancyLebovitz13y
Voted up for citing posts rather than posters for lack of rationality. I think I've seen some table-thumping for political incorrectness as well as for political correctness.
1XiXiDu13y
So what you are saying here is that ideology is always irrational? Since this is a community blog devoted to refining rationality, why don't you address the particular points you believe do constitute the irrational consensus? Or are you saying that some individuals here know that their ideology is irrational yet shun rationality in favor of it? That could be better phrased as: there are people here who follow selfish goals and argue based on matters of taste. But how do you know that those people are aware of it, that they do not honestly believe that their disguised ideology is actually rationality? I would love to know which kind of posts and comments, and in particular what consensus, you are referring to. This is very important to me, so if you don't want to make it public I would like you to send me a short private message.
7Roko13y
... are examples of the kind of thing that I would regard as the problem. Other examples are even more inflammatory, but basically they all boil down to: Person 1: X is a true fact about the world. Person 2: But saying X is mean to {political-correctness brownie points group Y}, and besides, you can't be absolutely sure it's true / I won't believe it until you provide an impossibly high degree of evidence / we should stop talking about it or people will think we are mean! The result is typically that LW can recite lots of rationalist principles, but when it comes to applying them to a significant number of real-world problems, LW is clueless. Now one might reasonably argue that we don't need to be right about everything. Sure, LW is mired in PC BS about X, Y and Z, but topics A, B, C, D, E, ... which are also important are not subject to PC irrationality pressure. To an extent I buy this argument. However, reality is not a disconnected series of isolated topics: if you're wrong about X, Y and Z you might make incorrect inferences about all sorts of other things.
7[anonymous]13y
For some of us, being perceived as nice is one of the most important ways we can help ourselves in real life. And the best way to be consistently perceived as nice is to be constantly concerned with the niceness of the statements one makes, and to seriously try to avoid giving offense. If I became blunt and plain-spoken it would hurt me in real life. It would not be worth it to me, except in very rare situations (such as if I'm personally responsible for saving lives, and I have to be deliberately rude to do it). In the interests of rationality, I'll refrain in future from criticizing un-PC statements because they're "not nice." I don't want to confuse anyone. But I can't make those statements myself -- that comes at a cost I won't pay.
3TheOtherDave13y
Do you ordinarily find that you have difficulty maintaining different registers(1) for different contexts? If so, I sympathize and wish you luck in overcoming that difficulty. It is an enormously useful skill: as you say, being perceived as nice is valuable, and different communities perceive different kinds of behavior as nice, so you do best to learn to signal appropriately for different contexts (2). If you don't have difficulty with this, though, then your comment puzzles me. If you agree with FKARoko that LW norms support "un-PC" (3) posts, or in any event ought to, then what are you concerned about... what's the cost? Conversely, if you don't agree with him, why refrain from criticizing un-PC statements... what's the confusion? == (1) I mean "register" in the linguistics sense. (2) Unless, of course, you spend all your time in only one community. (3) Caveat: I don't really understand what "PC" means; I'm using the term because it's the term you and FKARoko both use. I gather you use it here as synonymous with "nice," although in my own experience niceness often has more to do with how a statement is framed than with what is actually being said.
1[anonymous]13y
For clarity's sake: PC means politically correct and usually refers to political inoffensiveness. The term isn't really apt for the current discussion because there was no talk of politics. I don't know if I'm great at code-switching. I can tell that LW is "not PC" or blunt-spoken. But the thing is, when some heuristic is good for you in most of your life, you may internalize it and simply make it a constant feature of your personality. For example, if it's usually a bad idea for you to use swear words, you may be better off just not swearing at all, even when you're in the saloon and swearing would be socially appropriate. You may want to personally identify as a non-curser. It makes double-sure that you'll never swear at the wrong time. If you don't trust yourself to be socially agile in switching from situation to situation, then I think "better safe than sorry" makes sense.
3TheOtherDave13y
Agreed as far as it goes. But that's a big "if." The social agility you're talking about is an important life skill. If I spend some time in contexts where a particular behavior has social benefits and some time in contexts where the same behavior has social costs, then I get the best results by staying aware of the context that I'm in and behaving appropriately. That said, I do appreciate that it's harder for some people than others. If I can't do that, the next-best thing is to construct a superposition of rulesets and always apply it. This is similar to what you're suggesting here... if the costs of cursing in the no-curse environments are much higher than the benefits of cursing in the yes-curse environments, adopting a "don't curse regardless of context" rule as you suggest can work OK. My point is, it's a second-best option. Paying attention to my environment as it changes and responding accordingly has better payoffs, if I can manage it.
2Roko13y
Not just for some; for all of us. It is in everyone's narrow self-interest to sacrifice epistemology for signalling purposes. And in real life one has to do that. But here, at least, I think that we should establish the opposite norm. Look, what is the point of trying to appear PC on LW? I for one am just not impressed. I already know that you're from a certain demographic that implies lots of good things about you. But it implies bad things about you if you can't turn off the signalling BS in a context where it is socially very harmful, i.e. a rationality website. Throughout the sequences it has been made clear that there usually is some local incentive for motivated cognition. Wanting to appear PC is no different: it's just another reason that people have for blowing their thought process up, with all the usual downsides, e.g. the downside that you often simply don't know what the cost will be, because you would only be able to compute the cost of the motivated cognition if you were not engaging in it. Suffice it to say that I think we should have very strong norms against motivated cognition here on LW.

As I see it, the problem there is that saying "we shouldn't be affected by this stuff" does not mean that we aren't affected by this stuff. Knowing your cognitive biases allows for workarounds; it doesn't cause them not to exist.

In particular, saying to others "you're smart people, you should not be affected by such nuances" and then not bothering to put them into place oneself is almost a clichéd way to come across as an arsehole on the Internet and have people not want to bother listening to the speaker, no matter how right they may be. The message communicated is not "you should be affected less", but "I am inept." This reduces one's effectiveness.

Postel's law: "Be conservative in what you send; be liberal in what you accept."

If someone posts like a raging arsehole, they can be as right as they like, but people still won't welcome them or want to listen to them. It's not as effective a communication strategy as thinking before typing: your aim is to get the effect you want, not to win the conversation.

I speak here as a (hopefully) recovering arsehole. I have no plans to compromise the accuracy of what I'm saying, but it is useful to say it in a way that doesn't repel people from even reading.

1Roko13y
Sorry, I don't understand you. Who said that someone should not be affected by cognitive biases?
9[anonymous]13y
I know you're not impressed. I know folks around here don't like it much. I'm glad there are such folks who say what they think without signaling. I respect that attitude, and it's partly because I respect it that I'm here. I do want to know what people think when they're solely concerned with accuracy. But I don't really want to imitate them -- maybe a little, but not thoroughly. Truth is, I used to be socially awkward. These days, I'm not, but it's not because I'm any cleverer at dealing with people, it's because I've adopted a persona that's all about being, let's say, harmless. Positive and gentle. Trying to please. It's kind of a good all-purpose heuristic -- if I make some kind of faux pas, people will think "oh, she's clueless, but she's nice." I'm good with nice-but-clueless. And if you really want to be 100% nice-but-clueless, you have to be that way all the time. It's not just political PC -- I make a deliberate point of, as much as possible, never thinking or speaking badly of anyone. Not even in private. Not even in forums where the opposite norm holds. Once you start down that path, there's a chance that you might be bitchy in public. And you can't really afford that if you have other flaws and weaknesses, I think; I need people to forgive me my mistakes. Would it be worth it to change? As you point out, I can't know, because I'm within the world of motivated cognition. That said, I can think of circumstances where I probably ought to change -- if I worked in the private sector, for example, or if I chose an advisor who really values frankness (both live possibilities.) There may come a point where "nice but clueless" stops working for me. And then I'll really have to take this stuff seriously. But I have no idea who I'll be, once I'm not nice-but-clueless.
8NancyLebovitz13y
A couple of times here, I've run into guys who find "nice" really annoying. It was useful for me to be a good bit blunter than usual with them, and I'm inclined to believe that the experiment in flexibility was good for me. The problem, I think, is that "nice" involves such a light touch that for some people, it fails to make contact. I like nice. I prefer nice. And I think it's got some very definite limits.
5Roko13y
Then I think that you are in grave danger of getting pretty badly screwed by someone. There are genuinely bad people in this world, and there are lots of kind-of-bad people who will screw you over and rationalize it somehow. You have to have a healthy skepticism (not paranoia) about people's motives, it's the only way to prevent someone taking your money or your job/house etc. Seriously, forget the darned debate: if what you say is true, you are probably in serious danger of being taken advantage of in some way. If I were you, I would seriously consider trying to improve your social skills the hard way, i.e. by learning social skills, and not engaging in potentially massively self-harming motivated cognition.
4[anonymous]13y
You may be right there.
0TheOtherDave13y
Ah... I should have read this before replying to what you said elsewhere. So, you're aware that presenting as "nice but clueless" works against you in communities where cluelessness isn't a point in your favor, but you prefer to optimize for the communities where it is. OK, fair enough: that's your choice to make.
1[anonymous]13y
I'm not sure, really. I'm open to changing my mind. I may have to, after all.
3TheOtherDave13y
I doubt you'll ever "have to," in the sense of being forced to by circumstances. That's what I meant by it being your choice to make. Plenty of people live their entire lives optimizing for minimizing social friction at the cost of expressing their thoughts clearly and unambiguously... presenting as "nice but clueless," in other words. "Going along to get along" is another way to say it. I suspect that as long as you make the choice to do so, you will be able to find situations that allow you to, just like they do. That's what value judgments are for, after all: they let you construct a preference order among possible states of the world, and therefore drive the choices you make. The decision to present as "nice but clueless" will affect the sorts of acquaintances you make, the sorts of communities you join, the sorts of organizations you work for, and so forth. To put it differently: like it or not, you actually have a lot of power over your own future. So the question is, how confident are you in the preference order you're defending? If you're confident in it, then great... you're choosing the world you want, which is as it should be, and I wish you joy of it. OTOH, if you are uncertain, then I suggest that you might do better to explore the roots of that uncertainty yourself, rather than wait for events to somehow force you to change your mind.
-1[anonymous]13y
I like that attitude. It is also not irrational, because you are aware of it and deliberately choose to be that way. I believe that Less Wrong features way too much ought. I don't disagree with the consensus on cryonics at all, yet I'm not getting a contract because I'm too lazy and I like to be lazy. My usual credo is: I can't lose as long as I don't leave my way. That doesn't mean I am stubborn; I allow myself to alter my way situationally. Rationality is about winning, and what constitutes winning is purely subjective. If you don't care whether the universe is tiled with paperclips rather than being filled with apes having sex under the stars, that is completely rational, as long as you are aware of what exactly you care or don't care about.
2andreas13y
Do you think that your beliefs regarding what you care about could be mistaken? That you might tell yourself that you care more about being lazy than about getting cryonics done, but that in fact, under reflection, you would prefer to get the contract?
-2[anonymous]13y
I can't solve that problem right now. It implies that part of my volition is not, in fact, part of what I want, or should not be part of my goals. Why would I only listen to the part of my inner self favoring long-term decisions? I could take the car to drive to that Christmas party to visit my family and friends, or I could stay home because of black ice. After all, there will be many more Christmas parties without black ice in the future, and even more in the far future where there will be backups. But where does this thinking lead? I want both, of course. On reflection, not dying is more important than a party. But on further reflection I do not have enough data to conclude that any long-term payoff could outweigh extensive restraint at present. There are also some practical considerations about cryonics. I am in Germany and don't know of any cryonics companies here. I don't know the likelihood of being frozen quickly enough in case of an accident. If I know in advance that I'm going to die, I can still get a contract then. So is the money really worth it, given that most pathways to death result in no expected benefit from a cryonics contract?
7XiXiDu13y
Because this is not a private mailing list? Imagine some scientist or politician came here to get a dose of rationality, only to come across a discussion where someone argues that he knows more and then tells everyone to keep their idiot mouths shut. This happened on Less Wrong, and the person who said so might even have been factually correct. Besides causing some uproar and damaging Less Wrong, it is also a bad way of communicating truth and rationality. Stating conclusions like that is not a way to refine rationality. People do not come here to learn facts, e.g. that they are dumb, but how to arrive at such factual conclusions.
3Roko13y
IMO if a top politician or scientist came here and found politically correct BS as the standard ideology on this so-called "rational" website, they would probably sigh and close the page, never to return. Why should they? They have better things to do with their time than listen to BS. On the other hand, I don't think they would be impressed if we didn't have the skill to frame potentially inflammatory facts in a delicate way. I am not arguing against careful, delicate framing. I am arguing against MOTIVATED COGNITION.
5XiXiDu13y
I'm not suggesting that Less Wrong should conceal the truth to schmooze certain ideologies. What I am suggesting is that Less Wrong is NOT about teaching people how to score karma points on Less Wrong, but in the real world.

* Less Wrong has to be able to apply rationality in tolerable doses.
* Less Wrong has to take care that it does not shut itself up in its own ivory tower.
* Less Wrong has to be focused on teaching utilizable rationality skills.

Motivated cognition can be a double-edged sword. If you overcompensate against political correctness you can easily end up pursuing an introversive self-image that leads to ingroup bias. Less Wrong has to be in an equilibrium of internal affairs and public relations.
2Roko13y
Political correctness bias is not the cure to ingroup bias. If you have an ingroup bias problem, you solve the ingroup bias problem with the usual rationality tactics -- like being honest about the weaknesses of the ingroup. As far as I can tell, the best path is to vigorously fight PC bias and ingroup bias. You can have both. Really.
2XiXiDu13y
They got the SIAI funded. The genome of the Ebola virus is a true fact about organisms. Yet it is dumb to state it on a microbiology forum. Besides, "if you don't agree you are dumb" is a statement that has to be backed by exceptional amounts of evidence. People who already disagree can only be convinced by evidence if they are not intelligent enough to grasp the arguments.
1Roko13y
There are cases where data or ideas can be really hazardous. I don't count "but it might hurt somebody's precious feelings" as one of those cases.
1XiXiDu13y
I just came across this:
0Roko13y
This seems to be neither here nor there as regards the present debate. I assign some probability to security through obscurity working for bio, and some to it not working.
0Roko13y
2+2=4; if you don't agree, you are dumb.
2XiXiDu13y
Let me clarify my last comment. It is really all about what we want. We just have to accept that Less Wrong is not only about refining rationality. Less Wrong also won't be able to refine rationality if it allows the discussion of some topics in great detail, as they risk the future of this platform. So every statement here has to be taken with a grain of salt and understood in a larger context. Proclaiming the truth might be rational if you value rationality in and of itself. But since rationality is about winning, you have to ask what constitutes winning. The answer to this question is ultimately ideological and a matter of taste.
3Roko13y
Again, you are being logically rude. I refuted (I think) the idea that "'if you don't agree you are dumb' is a statement that has to be backed by exceptional amounts of evidence." Don't switch the goalposts mid-debate. Admit that, in fact, there are some statements such that if you disagree with them, you are dumb, no massive dossier of evidence required.
2XiXiDu13y
So what is it that you are trying to argue which I evade? I don't think that you can generalize from the example of avoiding signalling LW's intellectual superiority to the general issue of political correctness. Some factual statements are simply bad arguments to use in a debate. I'm not being logically rude; I'm just trying to argue that political correctness and epistemological issues are not necessarily mutually exclusive. Further, if you want to output a plan for action, you'd better tweak it for real-world use, which naturally must include some signaling. Only afterwards can one tackle the more fundamental issues of the general rationality of political correctness, e.g. overcoming human nature. I do not think that you have refuted it. I also believed that part of your argument was to assert that we sometimes shouldn't keep quiet about the truth, whatever the consequences. I do not agree with that either. Telling people they are dumb means that you are sufficiently sure that 1) you are right, 2) they are wrong and not just more demanding (more evidence, different kinds of evidence, etc.), and 3) the reason they disagree is that they are intellectually inferior. Further, even if you are sure someone is dumb, it is still a really bad argument, as it is not persuasive. If someone is dumb you have to be even smarter to convince that person. If you just proclaim someone is dumb, maybe you are not as smart as you thought either. Some people don't know that they are alive. Does that mean that they are dumb? Eliezer Yudkowsky might be able to rationalize such a disorder because of all his background knowledge. But would he be able to do so if he grew up without being able to acquire his current set of skills? A lot of one's potential intelligence is unleashed due to certain environmental circumstances, e.g. an advanced education. There are indeed people who possess less potential. Yet if we want to make them aware of their shortcomings it is not rati...
2XiXiDu13y
Yes, but if we are talking about real-world problems then we have to deal with people who are dumb, and sometimes we also have to convince them to get what we want. It is rational to limit the truth output of a forum of truth-seekers. An analogy would be the intolerance of intolerance: to maximize tolerance you have to be intolerant of intolerance. This is also the case with rationality, as you won't be able to make the world a more rational place by telling the irrational folks the truth, namely that they are irrational; that would just result in more irrational behavior.
1Roko13y
You are being logically rude. Please don't!
0Nisan13y
A belief is irrational if you use irrational methods of thinking to obtain it. I consider most irrational beliefs to be the result of ignorance of or incompetence in the methods of rationality, rather than selfishness or malice. (I guess we could argue about whether anti-epistemology is an example of incompetence or of willful going-astray.) I can't speak for Roko, but I imagine that on Less Wrong, almost all failures of rationality are the result of incompetence.
0XiXiDu13y
If I'm not able to understand my failure, I still want to know if someone thinks I am incompetent. I won't be able to understand how the person arrived at this conclusion, if it is due to a lack of intelligence on my side, but I'll be able to allow for the possibility and take it into account if I ever get stuck trying to reach a goal. So if someone honestly believes that I am too dumb, he/she should say so, and I won't perceive it as an insult. I just want to stress this point because he claimed that some posts, comments, and the LW consensus about many things that real people deal with in the real world are actually (factually) wrong. He has to tell me, because I'm not sure what he means, yet it is very important to know.
0TheOtherDave13y
I understand what you're saying qualitatively; I was trying to get at your quantitative estimates. The numbers will constrain your optimal strategy for extracting value from the site. For example, if for every "good" post there are N "table-thumping" ones and N=20, it's difficult-but-possible to find the "good" stuff. If N=200, it's effectively impossible. If N=2, it's pretty easy. Conversely, at N=2 it is perhaps worth trying to convince the 2/3 majority to behave differently (the way you seem to be doing, sort of), but at N=20 you probably do better to figure out ways to flag the "good" 5%, concentrate your attention there, and allow the "table-thumpers" to play around on the less-valuable periphery in the hopes that maybe we'll be inspired by your good example. (At N=200 you probably do better to create a different site where the top .5% of LW contributions can be hosted.) You're right, of course, that this is a very imprecise way of talking about it. Given that I'm just asking about your off-the-cuff judgments rather than the results of your actual measurements, that seemed appropriate.
2Roko13y
Qualitative or quantitative measures of the value of LW are an interesting thing to think about. AFAIK we don't have any at the moment.
1David_Gerard13y
There was a post I can't find which addressed this point: that rationality has worked so well that society can now support a vast number of people acting irrationally. And so rationality or irrationality is no longer a matter of survival or even practicality, but of social signaling: you get ahead in many social environments more with irrationality than with rationality. tl;dr: IT PAYS TO BE STUPID. Anyone got the link to hand? It should go in this essay.
1Alexandros13y
That would be extremely interesting to read (though it sets my mind going in all sorts of depressing directions). If anyone can track this down, I'd be very interested in reading it.
0Alexandros13y
Thanks for the feedback. The division between 'us' and 'the common man' is along the lines of raw intelligence and education in the broad sense (not the narrow academic sense). I am not comfortable with it myself, but there seems to be evidence that there is some barrier to entry, especially if you want to contribute to advancing the state of the art. (I cannot locate the article where Eliezer set out the requirements for someone helping to program the friendly AI.) Also, I am not speaking of rationality as a set of techniques, but as a worldview that is informed by atheism and the findings of science at the very least. I should perhaps make this clearer in the article or choose another term.
5[anonymous]13y
Is this what you were looking for? If so, I don't think it should be included in your post. Eliezer was talking about being a Seed AI programmer, not a rational thinker. You certainly don't have to be a supergenius to try to improve your own rationality.
2Nick_Tarleton13y
Not to mention, that piece is years (not sure how many) out of date.
2Alexandros13y
I don't think Eliezer's requirements have been revised to anything significantly more inclusive (relative to the general population). I'm not making a negative judgement on this; he may well be right to do so. But I'm fairly confident this is the case.
0Alexandros13y
Thanks. As I said, this is a barrier to contributing on the cutting edge. More relevant, however, is this article by Louie, citing an IQ of 130+ and an "NT" (Rational) MBTI type as prerequisites for understanding the sequences.
3luminosity13y
Except that, really, there is no evidence presented there that you need either of these prerequisites to understand the sequences. They're just criteria that Louie arbitrarily decided were important.
0Alexandros13y
I think he'd argue that the LW reader surveys, which show a concentration in the extreme upper region of the intelligence spectrum, justify his claim. Now, I think that the people willing and able to comprehend the sequences are fewer than the people willing and able to comprehend rationality, but the question is how much return on investment there is to be had in working to reach more and more of the people in the second group who are not in the first.
2Perplexed13y
Then perhaps the word you want is "skepticism" rather than "rationalism". Rationalists, as I understand it, do not define themselves by a worldview (and definitely not a worldview 'informed' by doctrines and findings). An atheism embraced so as to become a member of a community is as anathema to a rationalist as would be a theism embraced for the same reason.
-1Alexandros13y
Alas, skepticism fits even less, as it is merely an outlook. In this community, however, atheism is treated as an open-and-shut issue, and I suspect most would say that they expect a rational person, after considering the evidence on both sides, to come down on the side of atheism. After all, the latest survey showed that LWers willing to fill in a survey were 80% atheist. Perhaps I should clarify that I mean weak (no belief that a god exists) atheism, not strong (belief that no god exists) atheism. Regardless, nothing should be 'embraced so as to become a member of a community', including vanilla rationality (scientific method? Bayes?). That is a fundamental conflict of interest that all communities face and are in many cases destroyed by. This is exactly why things like the 'existential risk career network' scare me quite a bit, especially if they become known as ways to get a lucrative job.
7Jack13y
Not if you're going to endorse Bayes in the next sentence you shouldn't :-)
2Perplexed13y
I'm not sure I understand this. Could you clarify? Are you saying that a true Bayesian doesn't think there is a distinction? That a wise Bayesian will be neither kind of atheist?
2Jack13y
So Bayesian epistemology doesn't actually make use of the word 'belief'; instead we just assign probabilities to hypotheses. You don't believe or not believe, you just estimate p. So the distinction isn't really intelligible. I guess one could interpret a weak atheist as assigning a higher probability to God's existence than a strong atheist... but it doesn't obviously translate that way and isn't something a Bayesian would say.
2Perplexed13y
Got it. Thx. I suppose someone could claim that a strong atheist actually sets P(God) = 0. Whereas a weak atheist sets P(God) = some small epsilon. But then a Bayesian shouldn't become a strong atheist.
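To spell out the arithmetic behind that last point (this is the standard Bayesian argument, not a claim made anywhere in the thread itself): by Bayes' theorem,

$$P(G \mid E) = \frac{P(E \mid G)\,P(G)}{P(E \mid G)\,P(G) + P(E \mid \neg G)\,P(\neg G)}$$

If the prior P(G) is exactly 0, the numerator is 0 for every possible piece of evidence E, so the posterior stays pinned at 0 no matter what is observed; probability 0 (like probability 1) is immune to updating. An epsilon prior, by contrast, can still be revised in either direction as evidence arrives, which is why a Bayesian shouldn't become a strong atheist in the P(G) = 0 sense.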
0komponisto13y
See here.
0Perplexed13y
I'm not sure I understand this. Could you clarify? I'm not looking to start an argument here. I don't need to hear reasons. I just want to know what Jack meant when he responded to "Perhaps I should clarify ..." with "Not if you are going to endorse Bayes."
0Perplexed13y
As to whether "skepticism" names a worldview, an outlook, or some pieces of a methodology: apparently there is some current controversy on that.

Rationality is not the same as intelligence, and I'm hoping that one of the spin-offs from Less Wrong is finding less challenging ways to explain how to use the knowledge and the brains you've got.

Keeping an eye out for exceptions to what you think you know and considering what those exceptions might mean isn't a complicated idea.

Neither is the idea that everyone does things for reasons which make sense to them.

Internalizing such ideas may be emotionally challenging, but that's a different problem.

3Alexandros13y
In large part I'm dealing with this instinctive/emotional barrier of not only adopting counterintuitive beliefs, but also leaving the worldview you inhabit for another, which may be much less developed. I do think it's possible to boil down the material to a simpler form. There was a time when the Pythagorean theorem was the pinnacle of human thought, no doubt beyond the reach of the average person. Same for Newton's work. Perhaps it takes a long time for cultural digestion of such concepts to find their accessible forms? Perhaps culture itself is hindering people from grasping what is ultimately simple? Or maybe the newest findings of QM etc. really are beyond the reach of certain people?
4NancyLebovitz13y
There's a difference between understanding the latest findings of QM and rationality. I wonder to what extent people give up on thinking because of an educational system which discourages them. As far as I can tell, thinking isn't really taught: the ability to think (and memorize and comply) is rewarded, which is a very different matter. I think you've got rationality and intelligence bundled together too tightly, though I agree that there are probably thresholds of intelligence needed for particular insights. And I'm pretty sure that one of the reasons rationality has a bad rep is the unnecessary "but you aren't smart enough to play" attitude that sometimes comes with it. There was a recent post (sorry, no time to hunt it down) about how to evaluate new ideas that look cool so that you don't accidentally screw up your life. This should definitely be taught as part of rationality.
0Vaniver13y
Intellectual development doesn't seem to be a matter of time so much as man-hours, if that division makes sense. I suspect that if Eliezer were a psychologist and/or educator instead of a computer scientist, we would be looking at a "rationality for the everyman" project instead of SIAI / LessWrong. So, what we need is for someone to take the problem of "how do I explain rationality to actual people with limited time" and work at it. HP:MoR is a start (it's explaining rationality to a subset of Harry Potter fans, at least) but it's not set up to give the right feedback.

Perhaps the most successful example, pick-up artists, are off limits for this community because their terminal values are deemed offensive.

"out of limits for this community"? Huh? PUA fans are a significant and vocal part of this community. Sure, they receive some criticism, but so do cryonics advocates, utilitarians, believers in the 'scary idea', and one-boxers. There is no consensus on terminal values here, nor even, beyond a vague Bayesianism, on the algorithms and methods of rationality. Only an agreement that such things are important ... (read more)

0Alexandros13y
I recall there was a PUA-related fuss early in the site's life and the outcome was for articles to be discouraged (though not outright banned). Can anyone confirm or deny this?
1Perplexed13y
I'm relatively new here, so I can neither confirm nor deny. But it seems that you are saying that practical articles like this one are discouraged. Why would anyone discourage that? :)
9Larks13y
There was, in the Summer of '09. This post is quite good, as Alicorn presents a good summary and links to other prominent top-level posts. Essentially, a lot of words were spake in anger, and PUA seemed to be the mind-killer. Eventually, everyone dropped the topic, but I'm not sure anyone really won. If you'll forgive the crass labels, the anti-PUA side semi-banished PUA from LW, but the pro-PUA side ensured that a lot of people still find it useful, and simply decline to discuss it out of concern for the peace.
4Perplexed13y
Thanks.

We all enjoy beating up on the silly evangelical Christians, but that is dangerous. Let's try to be just a bit more charitable.

What is the narrative that an evangelical Christian buys into regarding their own status? [...] They are taking part in a battle against absolute evil, which represents everything disgusting and despicable, and which is manifested in the various difficulties they face in their lives. [...] a black-and-white divide in which moral judgements are easy.

Christians view their status as sinful. There isn't (usually) some batt...

5Vaniver13y
I think most people make a division between "Christianity done normally" and "Christianity done well", just like one can make that division for rationality. I agree with you that they should be explicit that they're talking about the stereotype of "standard" Christians instead of "correct" Christians, because when you look at "standard" Christians, the "we're all sinners" line is generally used as an excuse, not a motivation: "Hey, you can't expect me to be perfect!" Instead of actually improving, you just have to want to improve. Indeed, I might even split off Christianity done well into "Christian rationalism" or something similar, because the similarities are rather strong.
1Desrtopa13y
If we're talking about evangelical Christians, the prevailing view is that they are sinful, but forgiven. They believe that they're not perfect, but that their imperfections have already been excused. This gives us two points on which rationality fails to be as appealing. First, evangelical Christians don't have to doubt their understanding, they believe they know what it would look like if they were perfect, although they lack the fortitude to achieve it. Second, they have a forgiving entity to appeal to when they get things wrong.

I see two routes here: ... We focus our recruiting efforts on the upper strata of intelligence and influence ... the common person can keep believing in creationism, and we'll save humanity for them anyway ...

On the other hand ... explore the open avenues that may lead to a more palatable narrative, including popularising many of the rationality themes

For a long time I have been thinking that both of these options are so awful that we should think very hard about third alternatives before we analyse a fixed set of solutions.

5Roko13y
Just thinking out loud: it would be really nice if you could sort of "abandon the idiots to live in the hell that their irrationality creates", and have a country of just high-IQ types (speaking of which, my IQ is less than 130, and I think most here agree that I did, in fact, understand the sequences). Rationality correlates with IQ (though weakly), but society seems to be very much a nonlinear aggregator of individual traits. I don't know how plausible it is to create a high-IQ, high-rationality country. You would want to start with a country that was already on that track, e.g. Japan, Israel, or South Korea. EDIT: this is probably an even more awful idea than the two presented in the post. I just wanted to open the floor to third alternatives.
3Carinthium13y
The most plausible route to that outcome (to the extent one is possible) would be to buy a couple of islands for a select elite, then expand from there.
2Roko13y
I heard that the seasteading folk have looked into that already, and apparently governments are loath to give up actual sovereignty, though of course they will happily take your $ in exchange for "ownership", where "ownership" means that the island is still subject to their laws. My "suggestion" isn't really very good; I really just made it to make clear what kind of thing a third alternative would look like.
0Carinthium13y
I know that -- I said it was the most plausible. A government in severe financial trouble and willing to sell off a minor island is more likely than a successful revolt or than persuading people to accept a rationalist government.
2jaimeastorga200013y
All of these options assume the need to be tied to land, where present governments and populations hold sway and pose difficulties. However, there is much open and unowned space at sea. Even if we ignore future projects for self-sustaining cities floating in the oceans and whatnot, the basic idea of a community of people living on ships which spend most of their time in open water strikes me as a relatively plausible solution (i.e. on about the same level as the other proposals). It seems to me that The World is a useful proof of concept here. Now, if a bunch of ships could sail together with a population large enough that it might be considered a moving country...
4Roko13y
The problem I have with this, and my objection to seasteading in general, is that living at sea is really difficult. So difficult that I think the disadvantage levied by seasteading will swamp the advantage(s) of whatever else you are trying to do, e.g. rationality, libertarianism, etc. And if, at some stage, seasteading becomes manageably difficult, then everyone will quickly get in on it, and you could be squeezed out of the game by existing nations who have big navies and want your sea-space. If you think that existing nations wouldn't stamp on you and take all the sea-space, then why do you think they are so loath to sell and give up sovereignty over even the worst and most useless pieces of land that they own?
1XiXiDu13y
Isn't the only person able to judge this the person who wrote the sequences? Or at least someone who was told by the author that he/she understands the sequences? Humans have a tendency toward self-inflicted malaise. Humans value the freedom to do what they want more than doing what they would want to do from a retrospective point of view that values their extrapolated volition. But why would it be rational to choose the future over the present? Hell is that which you don't want at present, not what you might not want if you were a different person.
0Roko13y
It isn't rational to choose the future over the present. It isn't irrational either. Time discount is a free parameter to be chosen according to axiology. At least it would be nice if all the low-time-discount people could get together in one place, leaving the high-time-discount people together somewhere else.
1Desrtopa13y
At the risk of causing this to devolve into a discussion of politics, I'm not clear on how Japan, Israel, or South Korea are supposed to be good starting points in this regard.
0Roko13y
The idea was just that they are more than averagely interested in tech, and Israel also has a higher-than-average IQ because of the Ashkenazi Jews.
1Desrtopa13y
These qualities don't seem to have allowed any of them to run their countries exceptionally well. If you want to construct a high-rationality country, I don't think that any of these three would be particularly good starting points. Japan at least manages to rank highly on many societal health indices, but its educational system tends to emphasize rote memorization and prestigious affiliation over original thinking and personal achievement. Between countries, differences in mean intelligence are almost certainly going to be trivial in comparison with differences in social values. You'd want to look not at which countries tend to have smarter people, but at which promote more rationalist-friendly values.
0Roko13y
Any ideas for that?
2Desrtopa13y
My off the cuff answer is that you might want to try looking at various Scandinavian countries, but since I don't think the basic idea is particularly plausible, it's not something I've invested a great deal of consideration in.
0juliawise13y
Scandinavia has a problematic culture of humility: Jante Law
3Emile13y
Start a religion you don't really believe in but that pushes humanity in the right direction.

I agree with you that the "you must be this smart to ride" signs in LW comments are problematic. I'm not sure they're wrong, though. I've never taken an IQ test, but from other tests I am at least 98th percentile (which is ~130). Most of the stuff I've read on LW I either agree with naturally or have thought out reasons why I disagree with it -- but a lot of that comes from my reflectivity and speed of thought/reading. So LW can be a hobby for me, whereas it would be a massive time investment for someone else.

And, honestly, when I think about "...

Minor proofreading correction: second-to-last para, "People show interest(ed)".

I can't help thinking that route 1 dooms us, since the planet isn't run by the intellectual elite. Indeed, I'm fairly sure I read a study recently which showed that the intellectual elite shy away from politics -- not because they don't want to get involved, but because politics is now more about who can make the mud stick than about the issues. That would mean the world would be steered by the sub-90 IQs rather than the 130-plus IQs. So "popular" support is necessary, to ensure that rational solutions actually get to see the light of day!

6Emile13y
I would be very surprised if that was actually the case.
3orthonormal13y
I'd heard, rather, that your average national-level politician is smart but not too smart -- between 120 and 140, typically. There's indirect selection for intelligence in campaigns. But yes, there does seem to be some kind of selection against politics for the very smart.
0Alexandros13y
Thanks, typo fixed.