In this community, agreeing with a poster such as yourself signals me as sycophantic and weak-minded; disagreement signals my independence and courage. There's also a sense that "there are leaders and followers in this world, and obviously just getting behind the program is no task for so great a mind as mine".
However, that's not the only reason I might hesitate to post my agreement; I might prefer only to post when I have something to add, which would more usually be disagreement. Since I don't only vote up things I agree with, perhaps I should start hacking on the feature that allows you to say "6 members marked their broad agreement with this point (click for list of members)".
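(For concreteness, a minimal sketch of what such an agreement-marker feature might look like, kept separate from ordinary upvotes. All names and storage choices here are hypothetical, not the actual Less Wrong codebase.)

```python
# Hypothetical sketch: agreement markers as a channel distinct from upvotes,
# since upvotes mean "worth reading" and may include disagreement.

class Comment:
    def __init__(self, comment_id: int, text: str):
        self.comment_id = comment_id
        self.text = text
        self.upvoters: set[str] = set()  # "this was worth reading"
        self.agreers: set[str] = set()   # "I broadly agree with this"

    def mark_agreement(self, member: str) -> None:
        """Record broad agreement without a noisy 'me too' reply."""
        self.agreers.add(member)

    def agreement_summary(self) -> str:
        n = len(self.agreers)
        plural = "s" if n != 1 else ""
        return f"{n} member{plural} marked their broad agreement with this point"

# Usage:
comment = Comment(42, "Voting and agreeing measure different things.")
for member in ("alice", "bob", "carol"):
    comment.mark_agreement(member)
print(comment.agreement_summary())  # 3 members marked their broad agreement ...
print(sorted(comment.agreers))      # the "click for list of members" view
```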
That would be great.
This is a good point, but I think there's a ready solution to that. Agreement and disagreement, by themselves, are rather superficial. Arguments, on the other hand, are something rationalists have more respect for. When you agree with someone, it seems you don't have the burden of formulating an argument because, implicitly, you're referring to the first person's argument. But when you disagree with someone, you do have the burden of formulating a counterargument. So I think this is why rationalists tend to have more respect for disagreement than agreement: disagreement requires an argument, whereas agreement doesn't.
But on reflection, this arrangement is fallacious. Why shouldn't agreement also require an argument? I think it may seem to add to the strength of an argument if multiple people agree that it is sound, but I don't think it does in reality. If multiple people develop the same argument independently, then the argument might be somewhat stronger; but clearly this isn't the kind of agreement we're talking about here. If I make an argument, you read my argument, and then you agree that my argument is sound, you haven't developed the same argument independently...
If they post just an "Amazing post, as usual Eliezer" without further informative contribution, then I too get this mild sense of "sucking up" going on.
Actually, this whole blog (as well as Overcoming Bias) does have this subtle aura of "Eliezer is the rationality God that we should all worship". I don't blame EY for this; more probably, people are just naturally (evolutionarily?) inclined to religious behaviour, and if you hang around LW and OB, then you might project that role onto the person who acts like the alpha male of the pack. In fact, it might not even need to have any religious undertones to it. It could just be alpha-male mammalian social-hierarchy stuff.
Eliezer is a very smart person. Certainly much smarter than me. But so is Robin Hanson. (I won't get into which one is "smarter", as they are both at least two levels above me.) And I feel he is often -- "under-appreciated" perhaps is the closest word? -- perhaps because he doesn't post as often, but perhaps also because people tend to "me too" Eliezer a lot more often than they "me too" Robin (though again this might be because EY posts much more frequently than RH).
It's simpler than that: 1) Eliezer expresses certainty more often than Robin, and 2) he self-discloses to a greater degree. The combination of the two induces a tendency toward identification and aspiration. (The evolutionary reasons for this are left as an exercise for the reader.)
Please note that this isn't a denigration -- I do exactly the same things in my own writing, and I also identify with and admire Eliezer. Just knowing what causes it doesn't make the effect go away.
(To a certain extent, it's just audience-selection -- expressing your opinions and personality clearly will make people who agree/like what they hear become followers, those who disagree/dislike become trolls, and those who don't care one way or the other just go away altogether. NOT expressing these things clearly, on the other hand, produces less emotion either way. I love the information I get from Robin's posts, but they don't cause me to feel the same degree of personal connection to their author.)
The nice thing about karma/voting sites like this one is that they provide an efficient and socially acceptable mechanism for signaling agreement: just hit the upmod button. Nobody wants to read or listen to page after page of "me too"; forcing people to tolerate this would be bad enough to negate the advantage of making agreement visible. Voting accomplishes the same visibility without the irritating side-effects.
There's a bit of noise, as I sometimes vote up someone I disagree with if they raise an interesting point, and I very, very rarely vote someone down just because I disagree with them.
This "bit of noise" becomes significant on sites with a small number of subscribers, as a +/-2 vote is a "big deal".
I must admit, I think I do find myself going into Vulcan mode when posting on LW. I find myself censoring very simple social cues -- expressions of gratitude, agreement, emotion -- because I imagine them being taken for noise. I think I'm going to make an effort to snap myself out of this.
Same here. It's very natural for me to thank people when they say or do something awesome, to encourage promising newbies, and to express my agreement when I do agree, but I got the impression that such things are generally frowned upon here, so I found myself suppressing them.
Actually, I didn't mind that much -- the power of the ideas discussed here far outweighs these social inconveniences, and I can easily live with that. But personally, I would prefer to be able to express my agreement and gratitude without spending too many calories on worrying about my tribal status.
(Of course we'll need to keep the signal/noise ratio in check, but I'll post my ideas on that in a separate comment).
Two thoughts.
For example, in a community where I have influence, I expect demonstrating explicit support to push community norms towards explicit support, and demonstrating criticism to push norms towards criticism.
This creates the admittedly frustrating situation where, if a community is too critical and insufficiently supportive, it is counterproductive for me to criticize that. That just models criticism, which gets me more criticism; the more compelling and powerful my criticism, the more criticism I'll get in return.
If a community is too critical and insufficiently supportive, I do better to model agreement as visibly and as consistently as I can, and to avoid modeling criticism. For example, to criticize people privately and support them publicly.
If a community is too critical and insufficiently supportive, I do well to be actively on the lookout for others' supportive contributions and to reward them (for example: by praising them, by calling other people's attention to them, and/or by paying attention to them myself). I similarly do well to withhold those rewards from critical contributions.
Heh, it seems like this post has primed me for agreement, and I upvoted a lot more comments than I usually do. And it looks like many others did this as well -- look at the upvote counts! I was reading and voting with Kibitzer on, and was surprised to see the numbers.
(Have I just lowered my status by signaling that I'm susceptible to priming?)
Nah, you've raised it, by signaling that you're honest. At least, that's how it would work among true rationalists (as opposed to anti-irrationalists). ;-)
This article seems to model rational discourse as a cybernetic system made of two opposite actions, agreement and disagreement, that need to be balanced.
Agreement and disagreement are not basic elements of a statement about base reality, they're contextual facts about the relation of your belief to others' beliefs. Is "the sky is blue" agreement or dissent? Depends on what other people are saying. If they're saying it's blue, it's agreement. If they're saying it's green, it's dissent. Someone might disagree with someone by supporting an action, or agree with a criticism of what was previously a shared story. When you have a specific belief about the world, that belief is not made of disagreement or agreement with others, it's made of constrained conditional anticipations about your observations.
This error seems likely related to using a synagogue fundraiser as the central case of a shared commitment of resources, rather than something like an assurance contract! There's a very obvious antirational motive for synagogue fundraisers not to welcome criticism - God is made up, and a community organized around the things its members would genuinely like to...
Many points that are both new and good. Like prase, and like a selection of other fine LW-ers with whom I hope to be agreeing soon, I think your post is awesome :)
One root of the agreement/disagreement asymmetry is perhaps that many of us aspiring rationalists are intellectual show-offs, and we want our points to show everyone how smart we are. Status feels zero-sum, as though one gains smart-points from poking holes in others' claims and loses smart-points from affirming others' good ideas. Maybe we should brainstorm some schemas for expressing agreement while adding intellectual content and showing our own smarts, like "I think your point on slide 14 is awesome. And I bet it can be extended to new context __", or "I love the analogy you made on page 5; now that I read it, I see how to take my own research farther..."
Related: maybe we feel self-conscious about speaking if we don't have anything "new" to add to the conversation, and we don't notice "I, too, agree" as something new. One approach here would be to voice, not just agreement, but the analysis that's going into each individual's agreement, e.g. "I agree; that sounds just ...
“If I agree, why should I bother saying it? Doesn’t my silence signal agreement enough?”
That’s been my non-verbal reasoning for years now! Not just here: everywhere. People have been telling me, with various degrees of success, that I never even speak except to argue. To those who have been successful in getting through to me, I would respond with, “Maybe it sounds like I’m arguing, but you’re WRONG. I’m not arguing!”
Until I read this post, I wasn’t even aware that I was doing it. Yikes!
BRAVO, Eliezer! Huzzah! It's about time!
I don't know if you have succeeded in becoming a full rationalist, but I know I haven't! I keep being surprised / appalled / amused at my own behavior. Intelligence is way overrated! Rationalism is my goal, but I'm built on evolved wetware that is often in control. Sometimes my conscious, chooses-to-be-rationalist mind finds itself in the kiddie seat with the toy steering wheel.
I haven't been publicly talking about my contributions to the Singularity Institute and others fighting to save us from ourselves. Part of that originates in my father's attitude that it is improper to brag.
I now publicly announce that I have donated at least $11,000 to the Singularity Institute and its projects over the last year. I spend ~25 hours per week on saving humanity from Homo Sapiens.
I say that to invite others to JOIN IN. Give humanity a BIG term in your utility function. Extinction is Forever. Extinction is for ... us?
Thank you, Eliezer! Once again, you've shown me a blind spot, a bias, an area where I can now be less wrong than I was.
With respect and high regard,
Rick Schwall, Ph.D.
Saving Humanity from Homo Sapiens™ :-|
What do you recommend I do about my preachy style?
I suggest trying to determine your true confidence on each statement you write, and use the appropriate language to convey the amount of uncertainty you have about its truth.
If you receive feedback that indicates that your confidence (or apparent confidence) is calibrated too high or too low, then adjust your calibration. Don't just issue a blanket disclaimer like "All of that is IN MY OPINION."
Two observations:
In American culture, when you give money to a charity, you aren't supposed to tell people. Christian doctrine frowns heavily on that, and we are all partly indoctrinated with that doctrine. That's why no one sent their "yes" response to the list.
You just wrote a post with 22 web links, and 19 of them were to your own writings. I think that says more about why we can't cooperate than anything else in the post.
Far from being a negative aspect of the post, the self-linking is a key element of Eliezer's effort to build a common vocabulary for rationalists. I've personally found the links extremely helpful for reminding myself of the context of the words when I've forgotten. They're basically footnotes.
How can we cooperate if we don't even speak the same language?
First let me say that I do not think that attacks are by their very nature impermissible, and if you do, how dare you put "witty" in scare quotes? That's just flat-out unkind.
Anyway, it's a little hard for me to defend my comments of two years ago against attack, because I no longer remember what prompted me to make them. I will do my best to reconstruct my mental state leading up to the comment I made.
I don't think I was necessarily on PhilGoetz's side when I read his comment. I think I agreed, and still agree, with Technologos. But when I read the Wise Master's response to it, it didn't sit right with me. It read like an attempt to fight back against attack with anything that came to hand, rather than an attempt to seek truth. Surely, I must have felt, if the Wise Master were thinking clearly, he would see that unfamiliarity with the works of others is not an excuse, but in fact the entire problem. I feel that I wanted to communicate this insight. I chose the form that I did probably because it was the first one that came to mind. I hang out on some pretty rough and tumble internet forums, described by one disgruntled former poster as "geek bevis[sic] and ...
In hindsight, the problem with your fundraiser was obvious. There were two communications channels: one private channel for people who contributed, and one channel for everyone else. Very few people will post a second message after they've already posted one, so the existence of the private channel prevented contributors from posting on the mailing list. Removing all the contributors from the public channel left only nay-sayers and an environment that favored further nay-saying. The fix would be to merge the two channels: publish the messages received from contributors, unless they request otherwise.
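(A sketch of that fix in code, assuming a simple message router rather than any particular mailing-list software; all names here are hypothetical.)

```python
# Hypothetical sketch of the "merge the two channels" fix: donor messages are
# published to the shared list by default, withheld only on explicit request.

from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    body: str
    is_donor_note: bool = False  # arrived via the (formerly private) donor channel
    keep_private: bool = False   # sender explicitly asked not to be published

def route(message: Message, public_list: list[Message]) -> None:
    """Post to the shared list unless the sender opted out."""
    if message.is_donor_note and message.keep_private:
        return                   # honor the opt-out; nothing else is withheld
    public_list.append(message)  # support and criticism share one channel
```

With that routing, contributors' notes appear alongside the objections, so no donor looks like the only one.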
I agree with everything you said in your talk, and I think you're brilliant.
I've noticed that I am often hesitant to publicly agree with comments and posts here on LessWrong, because agreement will often be seen as spam. While upvotes do count for something, it is far easier to post a disagreement than to invent an excuse to post something that mostly agrees. This can be habit-forming.
Comparing, say, Less Wrong with a Mensa online discussion group, I've noticed that my probability of disagreement is far lower with the self-identified rationalists than with the self- and test-identified generic smart people. The levels of Dark Side Argument are almost incomparable. I have begun disengaging from Dark debates wherever convenient, purely to form better habits of agreement.
In fact, agreement is a sort of spam - it consumes space and usually doesn't bring new thoughts. When I imagine a typical conference where the participants are constantly running out of time, visualising the 5-minute question interval consumed by praise for the speaker helps me a lot in rationalising why the disagreement culture is necessary. Not that this is the real reason I would flee screaming out of the room; I would probably do so even if time weren't a problem.
When I read the debates at e.g. daylightatheism.org I am often disgusted by how much agreement there is (and it is definitely not a Dark Side blog). So I think I am strongly immersed in the disagreement culture. But, all cultural prejudices aside, I will probably always find a discussion consisting of "you are brilliant" type statements extraordinarily boring.
It doesn't have to bring new thoughts to serve a purpose. A chorus of agreement is an emotional amplifier.
I'm going to agree with the people saying that agreement often has little to no useful information content (the irony is acknowledged). Note, for instance, that content-free "Me too!" posts have been socially contraindicated on the internet since time immemorial, and content-free disagreement is also generally verboten. This also explains the conference example, I expect. Significantly, if this is actually the root of the issue, we don't want to fight it. Informational content is a good thing. However, we may need to find ways to counteract the negative effects.
Personally, having been somewhat aware of this phenomenon, when I've agreed with what someone said I sometimes try to contribute something positive; a possible elaboration on one of their points, a clarification of an off-hand example if it's something I know well, an attempt to extend their argument to other territory, &c.
In cases like the fundraising one, where the problem is more individual misperception of group trends, we probably want something like an anonymous poll--i.e., "Eliezer needs your help to fund his new organization to encourage artistic expression from rationalists. Would you donate money to this cause?", with a poll and a link to a donation page. I would expect you'd actually get a slightly higher percentage voting "yes" than actually donating, though I don't know if that would be a problem. You'd still get the same 90% negative responses, but people would also see that maybe 60% said they would donate.
"A slightly higher percentage"? More like: no correlation.
I recall that McDonald's was badly burned by "would you X" questions. Would people buy salads? Oh god yes, they'd love an opportunity to eat out and stick to their diets. Did they buy salads, once McDonald's had added them? Nope.
Similarly, I recall that in the last US election the Ron Paul Blimp campaign was able to get far more charitable pledges than real-world money, and pretty quickly died from underfunding.
I've worked for a number of nonprofits, and in analyzing our direct mailings, we would get a better response from a mailing that included one of two things:
This is one of the reasons that some types of nonprofits choose to create levels of giving; my guess is that this games common expectations about giving levels by creating artificial norms of participation. Note: you can base your levels on actual evidence and not just round numbers! (plus inflation, right?)
We also generally found that people respond well to the idea of a matching donation (which is rational since your gift is now worth more).
I do believe that anonymous fundraising removes information about community participation that is very valuable to potential donors. Part of making a donation is responding to the signal that you are not the only one sending a check to a hopeless office somewhere.
Anonymous polls might be a good idea, but especially among rational types, you might want the individual testimony: you get to see some of the reasoning!
I think the synagogue in the story picked up on these ideas and used them effectively. But the nice thing about raising money through direct mailing and the internet is that you can run experiments!
To be honest, I suspect a lot of those folks, and I include myself here, were anti-collectivists first.
In my own mind, the emotive rule "I might follow, but I must never obey" is built over a long childhood war and an eventual hard-fought and somewhat Pyrrhic victory. I know it's reversed stupidity, but it's hard to let go.
What good rationalist techniques are there for changing such things?
Ask "what's bad about obeying?" Imagine a specific concrete instance of obeying, and then carefully observe your automatic, unconscious response. What bad thing do you expect is going to happen?
Most likely, you will get a response that says something about who you are as a person: your social image, like, "then I'll be weak". You can then ask how you learned that obeying makes someone weak... which may be an experience like your peers teasing you (or someone else) for obeying. You can then rationally examine that experience and determine whether you still think you have valid evidence for reaching that conclusion about obedience.
Please note, however, that you cannot kill an emotional decision like this without actually examining your own evidence for the proposition, as well as against it. The mere knowledge that your rule is irrational is not sufficient to modify it. You need to access (and re-assess) the actual memor(ies) the rule is based on.
Recognizing that "I might follow, but I must never obey" is an emotional rule is already a good first step, much better than trying to rationalize it.
I've recognized that same pattern in myself - a bad feeling in response to the idea of following / obeying even when it's an objectively good idea to do so. I imagined an "asshole with a time machine" who would follow me around, observe what I did (buy a ham sandwich for lunch, enter a book store...), go back in time a few seconds before my decision, and order me to do it.
Once I realized I was much angrier at this hypothetical asshole than was reasonable, I tried getting rid of that anger. I guess I succeeded (the idea doesn't bug me as much), but I don't know whether that means I won't have any more psychological resistance to obeying. I am probably still pretty biased towards individualism / giving more value to my opinion just because it's my own, but I'd like to find ways to get rid of that...
Wait a second, now we're using Jews trying to run a synagogue as an example of a group who cooperate and don't always disagree with each other for the sake of disagreeing? Your synagogue must have been very different from mine. You never heard the old "Ten Jews, ten opinions - or twenty if they're Reform" joke? Or the desert island joke?
I also agree with everyone. In particular, I agree with Cameron and Prase that it's tough to just say "I agree". I agree with ciphergoth that I worry that I'm sucking up to you too much. I agree with Anna Salamon that we tend to be intellectual show-offs. I agree with Julian that many of us probably started off with a contrarian streak and then became rationalists. I agree with Jacob Lyles that there's a strong game theory element here - I lose big if rationalists don't cooperate, I win a little if we all cooperate under Eliezer's benevolent leadership, but to a certain way of thinking I win even more if we all cooperate under my benevolent leadership and there's no universally convincing proof that cooperating under someone else is always the highest utility option. And I agree with practically everything in the main post.
One thing I don't agree with: being ashamed of strong feelings isn't a specifically rationalist problem. It's a broader problem with upper/middle class society. Possibly more on this later.
I personally see public disagreements as a way to refine the intent of the person under the spotlight rather than a social display of individualism. When I disagree with someone it is not for the sake of disagreeing but rather to refine what I may think is a good idea that has a few weak points. I do this to those I respect and agree with because I hope that others will do this to me.
I think the broader question here is not whether we should encourage widespread agreement in order to create cohesion - but rather if we can ensure that the tenets we collecti...
On 'What Do We Mean By "Rationality"?' when you said "If that seems like a perfectly good definition, you can stop reading here; otherwise continue." - I took your word for it and stopped reading. But apparently comments aren't enabled there.
You have significantly altered my views on morality (Views which I put a GREAT deal of mental and emotional effort into.) I suspect I am not alone in this.
I think there's a fine line between tolerating the appearance of a fanboy culture, and becoming a fanboy culture. The next rationalist pop star ...
"[A] survey of 186 societies found, belief in a moralising God is indeed correlated with measures of group cohesion and size." - God as Cosmic CCTV, Dan Jones
I'm not sure if this was at work in your fundraiser, but I know I tend to see exhortations from others that I give to charitable causes/nonprofits as attempts at guilt tripping. (I react the same way when I'm instructed to vote, or brush my teeth twice a day, or anything else that sounds less like new information and more like a self-righteous command.) For this reason, I try to keep quiet when I'm tempted to encourage others to give to my pet charity/donate blood/whatever, for fear that I'll inspire the opposite reaction and hurt my goal. I don't always succeed, but that's an explanation other than a culture of disagreement for why some people might not have contributed to the discussion from a pro-giving position.
Good points.
This may be why very smart folks often find themselves unable to commit to an actual view on disputed topics, despite being better informed than most of those who do take sides. When attending to informed debates, we hear a chorus of disagreement, but very little overt agreement. And we are wired to conduct a head count of proponents and opponents before deciding whether an idea is credible. Someone who can see the flaws in the popular arguments, and who sees lots of unpopular expert ideas but few ideas that informed people agree on, may giv...
There is no guarantee of a benevolent world, Eliezer. There is no guarantee that what is true is also beneficial. There is no guarantee that what is beneficial for an individual is also beneficial for a group.
You conflate many things here. You conflate what is true with what is right and what is beneficial. You assume that these sets are identical, or at least largely overlapping. However, unless a galactic overlord designed the universe to please Homo sapiens rationalists, I don't see any compelling rational reason to believe this to be the case.
Irration...
I one-box on Newcomb's Problem, cooperate in the Prisoner's Dilemma against a similar decision system, and even if neither of these were the case: life is iterated and it is not hard to think of enforcement mechanisms, and human utility functions have terms in them for other humans. You conflate rationality with selfishness, assume rationalists cannot build group coordination mechanisms, and toss in a bit of group selection to boot. These and the referenced links complete my disagreement.
I completely agree with this post. It's heartwarmingly and mindnumbingly agreeable; I would like to praise it and applaud it forever and ever. On a more serious note, it personally feels like I'm not contributing anything to the conversation if I'm just agreeing. For example, if I read 100 posts here, I don't feel compelled to add a comment saying just "I agree" to each of them, because it doesn't add to the substance of the issue. - So I'm doing exactly what the post predicts.
I have really read a hundred or so post...
"On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge. I recall a lovely experiment which showed that politically opinionated students with more knowledge of the issues reacted less to incongruent evidence, because they had more ammunition with which to counter-argue only incongruent evidence."
What exactly is the problem with this? The more knowledge I have, the smaller a weighting I place on any new piece of data.
You're awesome, Eli. I love the mix of rationality and emotion here. Emotion is a powerful tool for motivating people. We of the Light Side are rightfully uncomfortable with its power to manipulate, but that doesn't mean we have to abandon it completely.
I recently suggested a rationality "cult" where the group affirmation and belonging exercise is to circle up and have each person in turn say something they disagree with about the tenets of the group. Then everyone cheers and applauds, giving positive feedback. But now I see that this is goi...
I think there's an interesting moral of the anecdote, but I'm not sure it's the one you expressed.
My conclusion is: rationalists who desire to discard the burdensome yoke of their cultural traditions, linked inextricably as they are to religion, will have to relearn an entirely new set of cultural traditions from scratch. For example, they will need to learn a new mechanism design that allows them to cooperate in donating money to a cause that is accepted as being worthwhile (I think the "ask for money and then wait for people to call out contributions" scheme is damned brilliant).
As the old joke says: What do you mean 'we', white man?
The real reason ostensibly smart people can't seem to cooperate is that most of them have no experience with reaching actual conclusions. We train people to make whatever position they espouse look good, not to choose positions well.
Perhaps a way to write comments of agreement that also signal your own smarts would be to say that you agree, and that the best part/most persuasive part/most useful part is X, while providing reasons why.
Isn't the secret power of Rationality that it can stand up to review? Religious cults are able to demand extreme loyalty because their people are not presented with alternatives and are not able to question the view they are handed. One of our strengths seems to be discernment and argumentation, which naturally leads to fractious in-fighting. What would we call "withholding criticism for the Greater Good"?
"But if you tolerate only disagreement - if you tolerate disagreement but not agreement - then you also are not rational. You're only willing to hear some honest thoughts, but not others. You are a dangerous half-a-rationalist."
"To point in the rough direction of an empirical cluster in personspace. If you understood the phrase 'empirical cluster in personspace' then you know who I'm talking about."
If someone understands the phrase "empirical cluster in personspace," they probably are who you're talking about. =)
This is very interesting; I have usually refrained from replying because I could not think of anything to say that wasn't trivial. I will take care to voice agreement in the future where applicable.
"But none of those donors posted their agreement to the mailing list. Not one."
Couldn't you just ask contributors for the right to make their donations public?
Then clearly your fund-raising drive would have benefited from a mechanism for publicizing and externalizing support.
Charitable organizations commonly use a variety of such methods. The example you gave is just one. If correctly designed the mechanisms do not cause support to be swamped by criticism, and they can operate without suppressing any free thought or speech.
E.g. publishing (with their agreement) the names of donors, the amounts, and endorsements; using that information to solicit from other donors; getting endorsements from respected peo...
To some extent, this was discussed in "The Starfish and the Spider", which is about "leaderless groups". The book praises the power of decentralized, individualistic cultures (that you describe as "Light Side"). However, it admits that they're slower and less-well coordinated than hierarchical organizations (like the military, or some corporations).
You've outlined some of the benefits (recruitment, coordinated action) of encouraging public agreement and identifying with the group. You've also outlined some of the dangers (plur...
I have been thinking about this subject for a while, because I saw the same type of culture of disagreement prevent a group I was a member of from doing anything worthwhile. The problem is very interesting to me because I come from the opposite side of the spectrum, being heavily collectivist. I take pleasure in conforming to a group opinion and being a follower, but I have also nurtured a growing rationalist position over the last few years. So despite my love of being a follower, I often find myself aspiring to a leadership position in order to wield my favore...
"Those who had nothing to give, stayed silent; those who had objections, chose some later or earlier time to voice them. That's probably about the way things should be in a sane human community"
Personally, I think that you were speaking to the wrong crowd when trying to fundraise. Or perhaps I should say too wide a crowd. Like trying to fundraise for tokamak fusion on a mailing list where people are interested in fusion in general. People who don't believe that tokamaks will ever be stable/usable are duty-bound to try and convince the ot...
There's an easy and obvious coordination mechanism for rationalists, which is just to say they're building X from science fiction book Y, and then people will back them to the hilt, as long as their reputation and track record for building things without hurting people is solid. Celebrated Book Y is trusted to explain the upsides and downsides of thing X, and people are trusted to have read the book and have the Right Opinions about all the tradeoffs and choices that come with thing X.
So really, it all comes down to the thing that actually powers the...
I have a modest amount of pair programming/swarming experience, and there are some lessons I have learned from studying those techniques that seem relevant here:
I agree. For efficiency's sake, I don't often say so. You've made the point more eloquently than I could, and my few sentences in support would probably strengthen your point socially, but they wouldn't improve the argument in any logical sense.
I love signaling agreement when I can do it and be just as eloquent as the writing I'm agreeing with. Famous authors put a lot of work into the blurbs they write recommending their friends' books. And that work shows. "X is a great summertime romp, full of adventure!" sure is a glowing recommendation, but it's not...
"Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus."
There's a lot more of this in anime, I feel. A lot of characters end up trusting someone from the bottom of their hearts, agreeing to follow their vision to the end, and you see whole groups of good guys wholeheartedly committed and united behind the same idea. Even main characters often show this trait toward others.
"Yes, a group which can't tolerate disagreement is not rational. But if you tolerate only disagreement—if you tolerate disagreement but not agreement—then you also are not rational". Well, agreement may just be perceived default. If I sit at a talk and find nothing to say about (and, mind you, that happens R. A. R. E. L. Y) it means either that I totally agree or that it is so wrong I don't know where to begin.
Also, your attitude of "we are not to win arguments, we are to win", your explicit rejection of rhetoric (up to the ...
Wow. I don't identify as a cynic or spock, but of the many articles I have read on Less Wrong since I discovered it yesterday, this one is perhaps the most perspective changing.
It makes me happy that the traits you list as what rationalists are usually thought to be (disagreeable, unemotional, cynical, loners) are unfamiliar. The rationalists I have grown up with over the past few years reading this site are both optimistic and caring, along with many other qualities.
Eliezer, I applaud your post. Bravo. I agree.
I'm new to this site and I was compelled to sign up immediately.
There's not much to add here, but that I hope people appreciate the significance of not shutting off all emotions, much like you argue in this post.
Those who suspect me of advocating my unconventional moral position to signal my edgy innovativeness or my nonconformity should consider that I have held the position since 1992, but only since 2007 have I posted about it or discussed it with anyone but a handful of friends.
People are also unwilling to express agreement because they know, and fear, group consensus and the pressure to fit in. Those usually lead to groupspeak and groupthink.
One of the primary messages of the local Powers That Be is that other people's evaluations should be a factor in your own - that other people's conclusions should be considered as evidence when you try to conclude - and that's incompatible with effective rationality, as well as with the techniques needed to prevent a self-reinforcing mob consensus.
The culture of disagreement is not the only thing at work here. When I see "+1", I wonder what mental process produced it: does the commenter need some attention but have nothing to say? And so when I want to post "+1", I don't, lest someone think the same about me. Usually I try to add some complement to the original post, or a little correction to it with clear approval of the rest. Something not important and, at the same time, not just "+1".
There is a way to solve this problem, but it is dangerous. A rationalist can watch discussion clos...
I wonder if one person can have a big effect on this sort of thing.
For example, I've known charity organizers to publish the number of donors and the total money donated every few days. Even without identifying donors, that does a lot to make people feel less alone.
An alternate explanation: I've noticed a trend where rationalists seem more likely to criticize ideas in general. Perhaps a key experience that needs to happen before some people choose to undergo the rigors of becoming a rationalist is a "waking up" after some trauma that makes them err on the side of being paranoid. I have observed that most people without a "wake up" trauma prefer to simply retain optimism bias and tend to conserve thinking resources for other uses. Someone who thinks as much as you do probably does not feel a ne...
"Organizing atheists has been compared to herding cats, because they tend to think independently and will not conform to authority." - The God Delusion
Maybe - but they seem to work together well enough - if you pay them.
Rather than ourselves making the drastic cultural changes that Eli talks about, perhaps it would be more efficient to piggyback on to another movement which is further down that path of culture change, so long as that movement isn't irrational. See this URL:
http://www.thankgodforevolution.com/node/1711
Check out the rest of the web site if you have time, or better yet, buy and read the book the web site is promoting. As you can see from the URL above, cooperation is an important value in the group.
I have been observing the spiritual practices promoted by ...
Hrm, overall this makes sense. But HOW do you suggest that something here, an online forum, actually do that sort of thing in the general case without it translating into a whole bunch of people effectively going "me too"?
I do remember when, for a certain unnamed organization, you started the "donate today and tomorrow" drive (or whatever you called it, something to that effect). I did post to a certain mailing list both the thoughts that led me to donate and what I was thinking in response to that sort of appeal, etc etc.
In the pursuit of truth it is rational to argue and, at first glance, irrational to agree. The culling of truth proceeds by "leaving be" the material that is correct and modifying (arguing with) the part that is not. (While slightly tangential, it is good to recall that the scientific method can only argue with a hypothesis, never confirm it.)
At a conference where there is a dialogue it is a waste of time to agree, as a lack of argument is already implicit agreement. After the conference, however, the culling of truth further progresses by assi...
I'm a beginner who thinks meta-discussions are fun...
Eliezer is asking about whether we should tolerate tolerance. Let's suppose -- for the sake of argument -- that we do not tolerate tolerance. If X is intolerable, then the tolerance of X is intolerable.
So if Y tolerates X, then Y is intolerable. And so on.
Thus, if we accept that we cannot tolerate toleration, then also we cannot tolerate toleration of tolerance, and also we cannot tolerate toleration of toleration of tolerance.
I would think of tolerance as a relationship between X and Y in which Y acquires the intolerability of X.
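(For what it's worth, the regress described above is a simple induction; a sketch, writing I(x) for "x is intolerable" and T(x) for "tolerating x":)

```latex
% Premise (the rule under discussion): intolerability propagates to tolerance.
\[ \forall x :\; I(x) \rightarrow I(T(x)) \]
% By induction on n, it then climbs every level of toleration:
\[ I(X) \rightarrow I(T^{n}(X)) \qquad \text{for all } n \ge 0 \]
```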
"I think that there are parts of life where we should learn to applaud strong emotional language, eloquence, and poetry. When there's something that needs doing, poetic appeals help get it done, and, therefore, are themselves to be applauded."
That may be, but I generally find YOUR poetic appeals to make me throw up in my mouth. I read my mother your bit about how amazing it was that love was born out of the cruelty of natural selection, and even she thought it was sappy.
I don't see how individualism can beat out collectivism as long as groups = more power. For individualism to work, each person would have to wield power equal to any group's.
From when I was still forced to attend, I remember our synagogue's annual fundraising appeal. It was a simple enough format, if I recall correctly. The rabbi and the treasurer talked about the shul's expenses and how vital this annual fundraiser was, and then the synagogue's members called out their pledges from their seats.
Straightforward, yes?
Let me tell you about a different annual fundraising appeal. One that I ran, in fact; during the early years of a nonprofit organization that may not be named. One difference was that the appeal was conducted over the Internet. And another difference was that the audience was largely drawn from the atheist/libertarian/technophile/sf-fan/early-adopter/programmer/etc crowd. (To point in the rough direction of an empirical cluster in personspace. If you understood the phrase "empirical cluster in personspace" then you know who I'm talking about.)
I crafted the fundraising appeal with care. By my nature I'm too proud to ask other people for help; but I've gotten over around 60% of that reluctance over the years. The nonprofit needed money and was growing too slowly, so I put some force and poetry into that year's annual appeal. I sent it out to several mailing lists that covered most of our potential support base.
And almost immediately, people started posting to the mailing lists about why they weren't going to donate. Some of them raised basic questions about the nonprofit's philosophy and mission. Others talked about their brilliant ideas for all the other sources that the nonprofit could get funding from, instead of them. (They didn't volunteer to contact any of those sources themselves, they just had ideas for how we could do it.)
Now you might say, "Well, maybe your mission and philosophy did have basic problems—you wouldn't want to censor that discussion, would you?"
Hold on to that thought.
Because people were donating. We started getting donations right away, via Paypal. We even got congratulatory notes saying how the appeal had finally gotten them to start moving. A donation of $111.11 was accompanied by a message saying, "I decided to give **** a little bit more. One more hundred, one more ten, one more single, one more dime, and one more penny. All may not be for one, but this one is trying to be for all."
But none of those donors posted their agreement to the mailing list. Not one.
So far as any of those donors knew, they were alone. And when they tuned in the next day, they discovered not thanks, but arguments for why they shouldn't have donated. The criticisms, the justifications for not donating—only those were displayed proudly in the open.
As though the treasurer had finished his annual appeal, and everyone not making a pledge had proudly stood up to call out justifications for refusing; while those making pledges whispered them quietly, so that no one could hear.
I know someone with a rationalist cause who goes around plaintively asking, "How come the Raelian flying-saucer cult can get tens of thousands of members [probably around 40,000] interested in complete nonsense, but we can't even get a thousand people working on this?"
The obvious wrong way to finish this thought is to say, "Let's do what the Raelians do! Let's add some nonsense to this meme!" For the benefit of those not immediately stopped by their ethical inhibitions, I will observe that there may be a hundred failed flying-saucer cults for every one that becomes famous. And the Dark Side may require non-obvious skills, which you, yes you, do not have: Not everyone can be a Sith Lord. In particular, if you talk about your planned lies on the public Internet, you fail. I'm no master criminal, but even I can tell certain people are not cut out to be crooks.
So it's probably not a good idea to cultivate a sense of violated entitlement at the thought that some other group, who you think ought to be inferior to you, has more money and followers. That path leads to—pardon the expression—the Dark Side.
But it probably does make sense to start asking ourselves some pointed questions, if supposed "rationalists" can't manage to coordinate as well as a flying-saucer cult.
How do things work on the Dark Side?
The respected leader speaks, and there comes a chorus of pure agreement: if there are any who harbor inward doubts, they keep them to themselves. So all the individual members of the audience see this atmosphere of pure agreement, and they feel more confident in the ideas presented—even if they, personally, harbored inward doubts, why, everyone else seems to agree with it.
("Pluralistic ignorance" is the standard label for this.)
If anyone is still unpersuaded after that, they leave the group (or in some places, are executed)—and the remainder are more in agreement, and reinforce each other with less interference.
(I call that "evaporative cooling of groups".)
The ideas themselves, not just the leader, generate unbounded enthusiasm and praise. The halo effect is that perceptions of all positive qualities correlate—e.g. telling subjects about the benefits of a food preservative made them judge it as lower-risk, even though the quantities were logically uncorrelated. This can create a positive feedback effect that makes an idea seem better and better and better, especially if criticism is perceived as traitorous or sinful.
(Which I term the "affective death spiral".)
So these are all examples of strong Dark Side forces that can bind groups together.
And presumably we would not go so far as to dirty our hands with such...
Therefore, as a group, the Light Side will always be divided and weak. Atheists, libertarians, technophiles, nerds, science-fiction fans, scientists, or even non-fundamentalist religions, will never be capable of acting with the fanatic unity that animates radical Islam. Technological advantage can only go so far; your tools can be copied or stolen, and used against you. In the end the Light Side will always lose in any group conflict, and the future inevitably belongs to the Dark.
I think that one's reaction to this prospect says a lot about one's attitude towards "rationality".
Some "Clash of Civilizations" writers seem to accept that the Enlightenment is destined to lose out in the long run to radical Islam, and sigh, and shake their heads sadly. I suppose they're trying to signal their cynical sophistication or something.
For myself, I always thought—call me loony—that a true rationalist ought to be effective in the real world.
So I have a problem with the idea that the Dark Side, thanks to their pluralistic ignorance and affective death spirals, will always win because they are better coordinated than us.
You would think, perhaps, that real rationalists ought to be more coordinated? Surely all that unreason must have its disadvantages? That mode can't be optimal, can it?
And if current "rationalist" groups cannot coordinate—if they can't support group projects so well as a single synagogue draws donations from its members—well, I leave it to you to finish that syllogism.
There's a saying I sometimes use: "It is dangerous to be half a rationalist."
For example, I can think of ways to sabotage someone's intelligence by selectively teaching them certain methods of rationality. Suppose you taught someone a long list of logical fallacies and cognitive biases, and trained them to spot those fallacies and biases in other people's arguments. But you are careful to pick those fallacies and biases that are easiest to accuse others of, the most general ones that can easily be misapplied. And you do not warn them to scrutinize arguments they agree with just as hard as they scrutinize incongruent arguments for flaws. So they have acquired a great repertoire of flaws of which to accuse only arguments and arguers they don't like. This, I suspect, is one of the primary ways that smart people end up stupid. (And note, by the way, that I have just given you another Fully General Counterargument against smart people whose arguments you don't like.)
Similarly, if you wanted to ensure that a group of "rationalists" never accomplished any task requiring more than one person, you could teach them only techniques of individual rationality, without mentioning anything about techniques of coordinated group rationality.
I'll write more later (tomorrow?) on how I think rationalists might be able to coordinate better. But today I want to focus on what you might call the culture of disagreement, or even, the culture of objections, which is one of the two major forces preventing the atheist/libertarian/technophile crowd from coordinating.
Imagine that you're at a conference, and the speaker gives a 30-minute talk. Afterward, people line up at the microphones for questions. The first questioner objects that the graph in slide 14 uses a logarithmic scale; he quotes Tufte on The Visual Display of Quantitative Information. The second questioner disputes a claim made in slide 3. The third questioner suggests an alternative hypothesis that seems to explain the same data...
Perfectly normal, right? Now imagine that you're at a conference, and the speaker gives a 30-minute talk. People line up at the microphone.
The first person says, "I agree with everything you said in your talk, and I think you're brilliant." Then steps aside.
The second person says, "Slide 14 was beautiful, I learned a lot from it. You're awesome." Steps aside.
The third person—
Well, you'll never know what the third person at the microphone had to say, because by this time, you've fled screaming out of the room, propelled by a bone-deep terror as if Cthulhu had erupted from the podium, the fear of the impossibly unnatural phenomenon that has invaded your conference.
Yes, a group which can't tolerate disagreement is not rational. But if you tolerate only disagreement—if you tolerate disagreement but not agreement—then you also are not rational. You're only willing to hear some honest thoughts, but not others. You are a dangerous half-a-rationalist.
We are as uncomfortable together as flying-saucer cult members are uncomfortable apart. That can't be right either. Reversed stupidity is not intelligence.
Let's say we have two groups of soldiers. In group 1, the privates are ignorant of tactics and strategy; only the sergeants know anything about tactics and only the officers know anything about strategy. In group 2, everyone at all levels knows all about tactics and strategy.
Should we expect group 1 to defeat group 2, because group 1 will follow orders, while everyone in group 2 comes up with better ideas than whatever orders they were given?
In this case I have to question how much group 2 really understands about military theory, because it is an elementary proposition that an uncoordinated mob gets slaughtered.
Doing worse with more knowledge means you are doing something very wrong. You should always be able to at least implement the same strategy you would use if you are ignorant, and preferably do better. You definitely should not do worse. If you find yourself regretting your "rationality" then you should reconsider what is rational.
On the other hand, if you are only half-a-rationalist, you can easily do worse with more knowledge. I recall a lovely experiment which showed that politically opinionated students with more knowledge of the issues reacted less to incongruent evidence, because they had more ammunition with which to counter-argue only incongruent evidence.
We would seem to be stuck in an awful valley of partial rationality where we end up more poorly coordinated than religious fundamentalists, able to put forth less effort than flying-saucer cultists. True, what little effort we do manage to put forth may be better-targeted at helping people rather than the reverse—but that is not an acceptable excuse.
If I were setting forth to systematically train rationalists, there would be lessons on how to disagree and lessons on how to agree, lessons intended to make the trainee more comfortable with dissent, and lessons intended to make them more comfortable with conformity. One day everyone shows up dressed differently, another day they all show up in uniform. You've got to cover both sides, or you're only half a rationalist.
Can you imagine training prospective rationalists to wear a uniform and march in lockstep, and practice sessions where they agree with each other and applaud everything a speaker on a podium says? It sounds like unspeakable horror, doesn't it, like the whole thing has admitted outright to being an evil cult? But why is it not okay to practice that, while it is okay to practice disagreeing with everyone else in the crowd? Are you never going to have to agree with the majority?
Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus. We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others' arguments. Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society. No, we're not losing because we're so superior, we're losing because our exclusively individualist traditions sabotage our ability to cooperate.
The other major component that I think sabotages group efforts in the atheist/libertarian/technophile/etcetera community, is being ashamed of strong feelings. We still have the Spock archetype of rationality stuck in our heads, rationality as dispassion. Or perhaps a related mistake, rationality as cynicism—trying to signal your superior world-weary sophistication by showing that you care less than others. Being careful to ostentatiously, publicly look down on those so naive as to show they care strongly about anything.
Wouldn't it make you feel uncomfortable if the speaker at the podium said that he cared so strongly about, say, fighting aging, that he would willingly die for the cause?
But it is nowhere written in either probability theory or decision theory that a rationalist should not care. I've looked over those equations and, really, it's not in there.
The best informal definition I've ever heard of rationality is "That which can be destroyed by the truth should be." We should aspire to feel the emotions that fit the facts, not aspire to feel no emotion. If an emotion can be destroyed by truth, we should relinquish it. But if a cause is worth striving for, then let us by all means feel fully its importance.
Some things are worth dying for. Yes, really! And if we can't get comfortable with admitting it and hearing others say it, then we're going to have trouble caring enough—as well as coordinating enough—to put some effort into group projects. You've got to teach both sides of it, "That which can be destroyed by the truth should be," and "That which the truth nourishes should thrive."
I've heard it argued that the taboo against emotional language in, say, science papers, is an important part of letting the facts fight it out without distraction. That doesn't mean the taboo should apply everywhere. I think that there are parts of life where we should learn to applaud strong emotional language, eloquence, and poetry. When there's something that needs doing, poetic appeals help get it done, and, therefore, are themselves to be applauded.
We need to keep our efforts to expose counterproductive causes and unjustified appeals, from stomping on tasks that genuinely need doing. You need both sides of it—the willingness to turn away from counterproductive causes, and the willingness to praise productive ones; the strength to be unswayed by ungrounded appeals, and the strength to be swayed by grounded ones.
I think the synagogue at their annual appeal had it right, really. They weren't going down row by row and putting individuals on the spot, staring at them and saying, "How much will you donate, Mr. Schwartz?" People simply announced their pledges—not with grand drama and pride, just simple announcements—and that encouraged others to do the same. Those who had nothing to give, stayed silent; those who had objections, chose some later or earlier time to voice them. That's probably about the way things should be in a sane human community—taking into account that people often have trouble getting as motivated as they wish they were, and can be helped by social encouragement to overcome this weakness of will.
But even if you disagree with that part, then let us say that both supporting and countersupporting opinions should have been publicly voiced. Supporters being faced by an apparently solid wall of objections and disagreements—even if it resulted from their own uncomfortable self-censorship—is not group rationality. It is the mere mirror image of what Dark Side groups do to keep their followers. Reversed stupidity is not intelligence.