The examples listed are not rational. They are examples of 'altruism' for the sake of a 'warm feeling' and signalling. Writing a letter, ringing a politician or giving blood are not actions that maximise your altruistic preferences!
You have responded to this 'Potential Objection' with the "better than nothing" argument, but even with that in mind this is not about being rational. It is just a bunch of do-gooders exhorting each other to be more sacrificial. When we used to do this at church we would say it was about God... and, premised on some of the accepted beliefs, that may have been rational. But it definitely isn't here.
I call for a different response. I encourage people to resist the influence, suppress the irrational urge to take actions that are neither optimal signals nor optimal instruments for satisfying their altruistic values.
This isn't a religious community and 'rational' is not or should not be just the local jargon for 'anything asserted to be morally good'.
If my preferences were such that I valued eating babies, then it would be rational for me to eat babies. Rational is not nice, good, altruistic or self-sacrificial. It just is.
They are examples of 'altruism' for the sake of a 'warm feeling' and signalling.
Look, scoffing at less-than-optimal philanthropy is ultimately just another form of counterproductive negativity. If you're really serious about efficacy, you should be adding to the list of causes, not subtracting from it. That is, instead of responding to a post like this by encouraging people to
resist the influence, suppress the... urge to take actions
(!)
how about answering with "hey, you know what would be really, really helpful?" and proceeding to list some awesome utility-maximizing charity.
Warm feelings are good. Someone who donates a few spare frequent-flyer miles to help Curt Knox and Edda Mellas visit their daughter imprisoned 6,000 miles away doesn't need to feel ashamed of themselves for not being "rational" -- except in the extremely unlikely event that that action actually prevented them from doing something better. Does anyone honestly, seriously believe that discouraging people from doing things like this is a way of making the world a better place?
Speaking of challenges for LW, I propose a new rule: anybody who comes across an ostensi...
That is, instead of responding to a post like this by encouraging people to ...
how about answering with "hey, you know what would be really, really helpful?" and proceeding to list some awesome utility-maximizing charity.
No, no, NO! I desire to correct a fundamental mistake that is counter to whatever good 'rationality' may happen to provide. Raising the sanity waterline is an important goal in itself and particularly applicable in rare communities that have some hope of directing their actions in a way that is actually effective. Not only that, but seeing the very concept of rationality abused to manipulate people into bad decision making is something that makes me feel bad inside. Yes, it is the opposite of a warm fuzzy.
Look, scoffing at less-than-optimal philanthropy is ultimately just another form of counterproductive negativity. If you're really serious about efficacy, you should be adding to the list of causes, not subtracting from it.
You are fundamentally wrong, and labeling things that disagree with you as 'negative' is a non-rational influence technique that works in most places but should be discouraged here. It is not counterproductive to not do...
An inefficient small good deed is a negated greater good deed requiring the same effort.
False. Time isn't fungible, and humans demonstrably don't make decisions that way.
Among other things, when humans are faced with too many alternatives, we usually choose "none of the above"... which means that the moment you complicate the question by even considering what those "greater good deeds" might be, you dramatically reduce the probability that anything whatsoever will be accomplished.
Okay, I think I see what happened. Your original point was really this:
This isn't a religious community and 'rational' is not or should not be just the local jargon for 'anything asserted to be morally good'
-- with which I agree. However, the following statements distracted from that point and confused me:
The examples listed are not rational. They are examples of 'altruism' for the sake of a 'warm feeling' and signalling
I call for a different response. I encourage people to resist the influence, suppress the irrational urge to take actions that are neither optimal signals nor optimal instruments for satisfying their altruistic values.
These made it sound like you were saying "No! Don't contribute to those causes! Doing so would be irrational, since they're not philanthropically optimal!" (I unfortunately have a high prior on that type of argument being made here.) My natural response, which I automatically fired off when I saw that your comment had 17 upvotes, is that there's nothing irrational about liking to do small good deeds (warm fuzzies) separately from saving the planet.
However, as I understand you now, you don't necessarily see anything wrong with those causes; it's just that you disapprove of the label "rationality" being used to describe their goodness -- rather than, say, just plain "goodness".
Is this right?
I also (perhaps unfairly) assumed my audience would follow along easily enough in my slight equivocation between "ethical" and "rational."
Not unfair, just more wrong. This is human bias. We identify with the in-group identity and associate all morality and even epistemic beliefs with it. It doesn't matter whether it is godly, spiritual, professional, scientific, enlightened, democratic or economic. We'll take the concept and associate it with whatever we happen to think is good or approved of by our peers. People will call things 'godly' even when they violate explicit instructions in their 'Word of God', because 'godly' really means 'what the tribe's morality says right now'. People make the same error in thought when they use 'rational' to mean 'be nice' or even 'believe what I say'. This would be ironic enough to be amusing if not for the prevalence of the error.
Rationalists should win
I hate to see this clever statement taken out of context and reinterpreted as a moralizing slogan.
If you trace it back, this was originally a desideratum on rational thinking, not some general moral imperative.
It plainly says that if your supposedly "rational" strategy leads to a demonstrably inferior solution or gets beaten by some stupider-looking agent, then the strategy must be reevaluated, as you have no right to call it rational anymore.
Great post. I too am in strong disagreement with those who are skeptical of rationality's instrumental value. I find my life greatly enhanced by my (I believe) higher than average level of rationality.
These are some examples of ways rationality improved my life:
I don't go to religious services. Many people in my family attend services every week, and are bored to tears by it, but they feel guilty if they skip. My relative rationality leads to a major increase in happiness, since my Sunday mornings are totally free. (I had to attend church twice a week as a child and I hated it). I also have fewer ethical hangups generally speaking, leading to less experience of guilt and shame. I did experience these when I was younger and still woefully irrational, so it probably isn't just a matter of disposition.
I invest in an index fund (since they significantly outperform most managed funds) to take advantage of compound returns; a rough sketch of why compounding matters follows this list. Stunningly few people actually do this. I also bought in when the market was at a low (other non-professionals irrationally sell low and buy high).
I address health issues like diet and exercise using scientific evidence, rather than expensive hokum like televised "fat burners".
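To make the compounding point concrete, here is a rough sketch with invented numbers (a one-off $10,000 investment, 7% average annual return for the index fund versus 5% after fees for a managed fund, over 30 years); the real figures will of course vary:

```python
# Rough illustration of compound growth; the 7% / 5% returns, the $10,000
# principal, and the 30-year horizon are invented for this example.

def compound(principal, annual_return, years):
    """Value of a lump sum after compounding at a fixed annual return."""
    return principal * (1 + annual_return) ** years

index_fund = compound(10_000, 0.07, 30)    # ~76,000
managed_fund = compound(10_000, 0.05, 30)  # ~43,000

print(round(index_fund), round(managed_fund))
# A two-point difference in annual return nearly doubles the final amount over 30 years.
```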
There are many more examples, but I think that illustrates my point. I'm sure that most other people on LW behave similarly on these issues. But often, people don't credit their rationality (good software) with their success, but rather their hardware (high intelligence).
How about spreading rationality?
This site, I suspect, mostly attracts high IQ analytical types who would have significantly higher levels of rationality than most people, even if they had never stumbled upon LessWrong.
It would be great if the community could come up with a plan (and implement it) to reach a wider audience. When I've sent LW/OB links to people who don't seem to think much about these topics, they often react with one of several criticisms: the post was too hard to read (written at too high a level); the author was too arrogant (which I think women particularly dislike); or the topic was too obscure.
Some have tried to reach a wider audience. Richard Dawkins seems to want to spread the good word. Yet, I think sometimes he's too condescending. Bill Maher took on religion in his movie Religulous, but again, I think he turned a lot of people off with his approach.
A lot has been written here about why people think what they think and what prevents people from changing their minds. Why not use that knowledge to come up with a plan to reach a wider audience? I think the marginal payoff could be large.
"Solutions looking for problems."
Less Wrongers of a more self-interest-based frame of mind might consider banding together in groups of 2-4 and founding startups.
Or they could be nonprofits, if you prefer.
Upvoted because I like the topic; however, this is a very confused post. Most examples given are not rational means to any end I think I have.
I'd love to see a post explaining why you think any of these recommendations are rational.
Doing nothing (or rather, doing what I'm already doing) conserves the resources I control, allows me to make better decisions later, when I have more information, and has intrinsic value (pleasure/leisure) to me.
Thanks for getting me up off my ass, simplicio. I just donated (a long over-due) $30 to SIAI and set up a monthly reminder to keep donating (if I have disposable income that month). And I'll be keeping an eye out here for other pro-rationality suggestions.
A small criticism:
Make an appointment to give blood.
The United States and the European Union currently have unscientific and discriminatory policies that forbid gay and bisexual men from donating. Giving blood is definitely an ethical Good Thing, but it may not be something we would want to push as specifically associated with rationality.
I donated blood just yesterday. Unfortunately, I'm AB+, which means my blood is only suitable for other AB+ people. About 5% of the population, according to Wikipedia. On the plus side, I can receive blood from anyone :). I have to admit that I'm having trouble keeping a regular schedule of blood donation.
On the topic of diet, LessWrong helped me lose about 17 pounds through implementing some short-term motivation methods. Counted in terms of the probability of more years of life, that's a huge win.
Can't speak for others, but I find rationality EXTREMELY useful instrumentally. Level 1 rationality is when you manage to stop your emotions from interfering with your cognitive process, which is hugely useful for improved decision-making. Level 2 rationality is when you pick up the trick of influencing emotions with your rational mind. I can't make myself feel an arbitrary emotion via rational thought, but over several months (or maybe more) of practice I've gotten very good at eliminating what I consider incorrect emotions whenever my rational mind id...
"At all" is the enemy of "highest expected value".
You shouldn't worry so much about finding the best charity that you don't end up donating to one, but that doesn't mean it isn't worth taking the risk that you won't end up donating.
Suppose you have two choices: donate to charity A, or try to find the best charity you can and donate to that. You don't have much willpower, and if you go for the second choice, there's a 50:50 chance you won't end up donating at all. But if you do find a better charity, it could easily be an order of magnitude better. Over all, yo...
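As a minimal sketch of the expected-value comparison being made here, treating the 50% follow-through chance and the order-of-magnitude (10x) improvement as purely illustrative numbers:

```python
# Illustrative expected-value comparison; the 0.5 follow-through probability
# and the 10x impact multiplier come from the hypothetical above.

def expected_impact(p_donate, relative_impact):
    """Expected impact, measured in multiples of charity A's impact."""
    return p_donate * relative_impact

donate_to_A = expected_impact(1.0, 1)       # certain donation: 1.0
search_for_best = expected_impact(0.5, 10)  # risky search:     5.0

print(donate_to_A, search_for_best)
# In expectation the search wins, even though half the time nothing gets donated.
```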
Briefly: If you want to use instrumental rationality to guide your actions, you need first to make explicit what you value. If we as a community want to use instrumental rationality to guide collective action, we need first to make explicit the values that we share. I think this is doable but not easy.
I like what I think is the motivating idea behind the original post: we want examples of people using the instrumental rationality they've learned here. If nothing else, this gives us more feedback on what seems to work and what doesn't; what is easy to imp...
A rational altruism story with a disappointing ending:
I recently moved across the country. I was originally planning to road-trip, but when I took my car to a mechanic in preparation for this, I discovered that it had a (possibly) cracked cylinder and could break down on such a trip.
I realized that this was a good opportunity for altruism, so with a week before my newly scheduled flight I tried to find a highly rated GiveWell charity to which I could donate my car, netting them the money they could obtain at auction without too much effort from me. I woul...
Finding a way to make money is probably a good idea. As far as I know, LW is on average quite poor, mostly because young=poor, but also because nerdy types don't actually tend to be good at making $.
If you're a "rationalist", but not rich and not on your way to being rich, you're probably just deluding yourself.
Vote this comment down if your net worth is >= $100k, vote up otherwise. (thanks to mathewnewport)
This epitomizes the problem with Less Wrongers' instrumental rationality:
I do find them convincing. Unfortunately, I don't find them motivating.
My lifestyle tends to be rather minimalistic, so that even an average-to-low income is more than enough to sustain it. I also find it a lot easier to just forgo some comfort or gadget instead of working more to pay for it.
We're a bunch of Linux-hacking post-hippies who don't care enough about the socially accepted measures of influence ($ and status) to motivate ourselves to make a difference. Hence, we are sidelined by dumb cavemen who at least have enough fire in their bellies to win.
they may as well get that question right by paying sufficient attention to positional vs nonpositional goods.
True except that my intuition is that whpearson has somewhat got it wrong on that too, because he is trying to other-optimize non-nerds, who find keeping up with the latest fashion items highly enjoyable. They're like a hound that enjoys the thrill of the chase, separate from the meat at the end of it.
The user divia, in her most excellent post on spaced repetition software, quotes Paul Buchheit as saying
This is an important truth which bears repetition, and to which I shall return.
"Rationalists should win"
Many hands have been wrung hereabouts on the subject of rationality's instrumental value (or lack thereof) in the everyday lives of LWers. Are we winning? Some consider this doubtful.[1]
Now, I have a couple of issues with the question being framed in such a way.
Nonrandom acts of rationality
The LessWrong community finds itself in the fairly privileged position of being (1) mostly financially well-off; (2) well-educated and articulate; (3) connected; (4) of non-trivial size. Therefore, I would like to suggest a project for any & all users who might be interested.
Let us become a solution in search of problems.
Perform one or more manageable & modest, rationally & ethically motivated actions between now and July 31, 2010 (indicate intent to participate, and brainstorm, below). These actions must have a reasonable chance of being an unequivocal net positive for the shared values of this community. Finally, post what you have done in this thread's comments, in as much detail as possible.
Some examples:
What about LessWrong acting as a group?
I would love to see a group-level action on our part occur; however, after some time spent brainstorming, I haven't hit upon any really salient ones that are not amenable to individual action. Perhaps a concerted letter-writing campaign? I suspect that is a weak idea, and that there are much better ones out there. Who's up for world-optimization?
Potential objection