Today I overheard a man in the supermarket telling his wife that maybe they should buy some lottery tickets, and I was reminded of Eliezer's "opening doors for little old ladies" line (which he repeated in his recent video answers).
Isn't buying lottery tickets also a form of purchasing warm fuzzies? I'm not sure that opening doors for little old ladies is any more defensible for a utilitarian than buying lottery tickets is for a rationalist.
To expand on this comparison a bit more, one important difference between the two is that once a person understands the concept of expected value, and knows that lottery tickets have expected value below purchase price, the warm-fuzzy effect largely goes away. But for some reason, at least for Eliezer, the warm-fuzzy effect of opening a door for an old lady doesn't go away, even though he knows that doing so creates negative expected utilons.
Perhaps the warm-fuzzy effect remains because Eliezer rationalizes it thus: if I can restore my willpower through the warm-fuzzy effect of opening doors for little old ladies, I can be more productive in producing utilons through my work, so it's really a good thing after all, and I deserve the warm-fuzzy effect. But perhaps a rationalist can use a similar line of thought to keep the warm-fuzzy effect of buying lottery tickets. Should one do so?
ETA: Apparently Eliezer already addressed the issue of lottery tickets, with the following conclusion:
Biases are lemons, not lemonade, and we shouldn't try to make lemonade out of them - just burn those lemons down.
Which seems completely inconsistent with the position he takes here...
Humans are social animals.
Buying lottery tickets seems less likely to trigger ancestral-environment reward circuitry than having a positive interaction with another person. A windfall from the capricious environment seems a worse bet than good will towards you in a small tribe where word gets around. And this is completely ignoring the plausible root of most altruism in kin selection.
I know this is an old comment, but...
I think holding open doors for old ladies is not only defensible, but entirely practical for utilitarians.
First, there's plenty of research suggesting that little actions like this can have significant spillover effects on our attitudes for some time afterward. Second, how exactly are you going to convert that handful of seconds into a higher utility payoff? Are you going to stay at work for an extra hour so that, if you run into some people in need of assistance after you leave, you can not help them? Are you going to stand there on the other side of the door and think about important AI problems while the old lady struggles to open it?
Time is a great deal less fungible than money.
Are you going to stand there on the other side of the door and think about important AI problems while the old lady struggles to open it?
I visualized this scenario and laughed out loud.
Kiva.org has the distinct honor of being the only charity that has ensured me maximum utilons for my money with an unexpected bonus of most fuzzies experienced ever. Seeing my money being repaid and knowing that it was possible only because my charity dollars worked, that the recipient of my funds actually put the dollars to effective use enough to thrive and pay back my money, well, goddamn it felt good.
Kiva feels suspiciously well-optimized on three counts -- there's the utilons (which, given that you're incentivizing industry and entrepreneurship, are pretty darn good), the warm fuzzies you mentioned, and the fact that it seems it could also help me overcome some akrasia with regards to savings. If I loan money out of my paycheck to Kiva each month, and reinvest all money repaid, then (assuming a decent repayment rate) the money cycling through should tend to increase, meaning that if I need to, say, put a down payment on a house one day, I can take some out, knowing it's already done good.
I feel very suspicious of my mind for being convinced this plan is optimal along one dimension, and extremely strong along two others. It doesn't seem as though it should be so easy. If I'm missing something (along any dimension), please feel free to tell me.
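To make the cycling idea above concrete, here is a minimal sketch of how a re-lent loan pool might behave. The monthly contribution, repayment rate, and loan term are made-up illustration values, not figures Kiva publishes or guarantees.

```python
def loan_pool(monthly_contribution=25.0, repayment_rate=0.95,
              avg_loan_term_months=12, months=60):
    """Toy model: each month roughly 1/term of the outstanding loans come due;
    repayment_rate of that amount is repaid and immediately re-lent, the rest
    is lost to default. The new monthly contribution is also lent out."""
    outstanding = 0.0
    for _ in range(months):
        maturing = outstanding / avg_loan_term_months
        defaulted = maturing * (1.0 - repayment_rate)
        outstanding = outstanding - defaulted + monthly_contribution
    return outstanding

print(f"Money cycling through loans after 5 years: ${loan_pool():,.2f}")
```

Under those assumptions the pool keeps most of what was put in, with each dollar getting lent out repeatedly, which is the sense in which it could double as a (slightly leaky) savings pot.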
if X is such a great option, why is it not more popular?
I begin to suspect that rationalists should simply delete this question from their mental vocabularies. Most popular things are optimized to be popular with an audience that doesn't know how to resist manipulation (but thinks itself invincible, in accordance with the bias blind spot); this gives rise to a case of "the majority is always wrong".
This is probably too late to be read by anyone, but here is a column by Steven Landsburg, my favorite economist, which says essentially the same thing; he simply phrases it as "don't diversify your charity 'portfolio': shut up, multiply, and give your whole charity budget to the most deserving one".
"Utilons" isn't quite the right word: utilons are all I purchase. My utility function is a sum of components: I can decompose it into a local part to do with my happiness and the happiness of those close to me (and thus status, warm fuzzies and the like) and a global part to do with things like the lives of strangers and the future of humanity. I try to strongly mark the boundary between those two, so I don't for example value the lives of people in the same country as me more than those in different countries.
You're saying I can more optimally spend resources on efforts that clearly serve one or the other than on efforts that try to do both and do neither well, and I agree, I'd just phrase it differently: purchase big-picture utility and small-picture utility separately.
Perhaps Eliezer doesn't directly value fuzzies or status, so when he is purchasing them he isn't purchasing utilons directly. Rather, he is purchasing motivation to continue doing things which directly purchase utilons. In other words, he doesn't really want fuzzies, but if he doesn't buy any, he'll lose his motivation to be altruistic altogether. So he buys fuzzies to keep his motivation up which allows him to keep directly purchasing utilons - the things he does actually value. That's at least how I read it.
edit: clarity
By coincidence, I already do this - donating two hours a week to a local charity for children with learning difficulties, and donating cash to the Gates Foundation (they seem much better qualified than me to calculate the expected return on charity investment), and a portion of the charity donations from my upcoming wedding is earmarked for the "establishment at which Eliezer works". I actually did it following the logic of this post, so it wasn't a coincidence after all.
This may be the right reaction for rationalists, but how do you feel about it, ...
Man doesn't live by warmth, status, and altruism alone, however.
For a more comprehensive list of things that may contribute to efficacy I would suggest Aristotle's Ethics or Tim Ferriss's blog and The 4-Hour Workweek.
Worth pointing out that one probably has to mix up the warmth- and status-generating activities quite a bit to avoid diminishing returns too. The first large check you give away will surely be very effective; after the fifth, maybe not. Tipping generously when it's appropriate will always buy status and warmth, with Prospect Theory's predicted effective multiplier for small numbers around a reference point.
Do you view this more as "Your true utility isoclines are concave in the plane of utilons vs. fuzzies", or, "You are not a rational utility-maximizer"?
I would have made this into a longer post, but it works much better appended to this one:
It's clear that you can't just make willpower appear with a snap of your fingers, so I consider fuzzies to be utilons for many human utility functions. However, utilitarians have it even better -- if they get fuzzies by giving fuzzies to someone else, they get to count all of the fuzzies generated as utilons. I urge people focused on being effective utilitarians to keep this in mind if they feel like they're running low on fuzzies.
Thanks, Eliezer!
This one was actually news to me. Separately is more efficient, eh? Hmmm... now I get to rethink my actions.
I had deliberately terminated my donations to charities that seemed closer to "rescuing lost puppies". I had also given up personal volunteering (I figured out {work - earn - donate} before I heard it here). And now I'm really struggling with akrasia / procrastination / laziness / rebellion / escapism.
"You could, of course, reply that you don't trust selfish acts that are supposed to be other-benefiting as an "...
But if that's the defense, then my act can't be defended as a good deed, can it? For these are self-directed benefits that I list.
I'm glad I'm an egoist and don't have to worry about stuff like this.
Maybe relevant to this post: the googolplex dust specks issue seems to be settled by nonlinearity/proximity.
Other people's suffering is non-additive because we value different people differently. The pain of a relative matters more to me than the pain of a stranger. A googolplex people can't all be important to me because I don't have enough neural circuitry for that. (Monkeysphere is about 150 people.) This means each subsequent person-with-dust-speck means less to me than the previous one, because they're further from me. The infinite sum may converge to...
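As a purely illustrative way of cashing out that convergence intuition, suppose one speck costs $d$ disutility for the person I weight most, and each additional person is weighted by a constant factor $r < 1$ relative to the previous one (a geometric discount assumed here only for illustration). Then the total stays bounded no matter how many people are added:

$$\sum_{i=0}^{\infty} d \, r^{i} = \frac{d}{1-r}, \qquad 0 < r < 1,$$

whereas weighting every person equally would make the total grow without limit.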
I may be straying from your main point here, but...
Could you really utilize these 60 seconds in a better, more specialized way? Not any block of 60 seconds - these specific 60 seconds, that happened during your walk.
Had you not encountered that open trunk, would you have opened your laptop in the middle of that walk and started working on a world-changing idea or an important charity plan? Unlikely - if that were the case, you would already be sitting somewhere working on it. You went out for a walk, not for work.
Would you, had you not encountered that open trunk, fi...
I follow the virtue-ethics approach: I do the actions that make me more like the person I want to be. The acquisition of any virtue requires practice, and holding open the door for old ladies is practice for being altruistic. If I weren't altruistic, then I wouldn't be making myself into the person I want to be.
It's a very different framework from util maximization, but I find it's much more satisfying and useful.
My plan to get myself to purchase utilons is to find the most efficient fuzzy cause I can, then say "This is how important it is for me to save my money." Then when the time comes to purchase utilons, I can say "You know that all-important fuzzy cause? Turns out there are even better causes out there, although they might not make you feel quite so warm inside."
Try to wean yourself off the need for warm fuzzies instead.
EDIT: No, don't try to wean yourself off the warm fuzzies, but get the warm fuzzies from friends and family, not from people in distress in need of charity. Feel good about yourself because you are achieving your goals, including altruistic ones. (end of edit)
Carl Rogers, founder of person centred counselling, theorised that there is an "organismic self", with all the attributes and abilities of the human organism within its own skin, and a "self-concept" built up from what the ...
I'm posting here because I just saw a Facebook campaign to raise awareness of child abuse via profile pictures and chain letters. It takes little effort, and the marginal utility seemed extremely low, but I understand now that it is actually very good at generating fuzzies. As a side effect, the fuzzies provide an incentive, and it does generate a little bit of utility, so it's not entirely a bad thing unless it causes people to fall into the "I've done my bit for the world" mindset.
Just a note: The established term for "a hypothetical unit of utility" is "util" or "utile" (typically pronounced "yootle").
Why is it acceptable overhead to cater to primate impulses to the tune of $110,000?
I get that part of rationality is making the most of the faulty brain you have, but I'm not clear on the right way to decide which instincts to fight against, and which to placate.
And why does having more money justify spending SO much more on fuzzies or status? Wouldn't it be cheaper to have a social circle of non-billionaires, and have top-dog status by being the first with a MacBook Air? (which you would have bought anyway)
No. MUCH more expensive. It would hurt your business interests. Really, $110,000 is an order of magnitude too little for a billionaire.
Does anyone really track the marginal utility of their possible investment this way? Utilons - sure. But ROI on status? ROI on "warm fuzzies"?
Also, this assumes we have good estimates of the ROI on all our options. Where do these estimates come from? In the real world, we often seem to spread our bets - constantly playing a game of multi-armed bandit with concept drift.
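Since the comment above leans on the multi-armed-bandit picture, here is a minimal epsilon-greedy sketch of what "spreading bets while ROI estimates are uncertain" looks like. The option payoffs are invented for illustration, and the sketch ignores concept drift entirely.

```python
import random

def epsilon_greedy(true_payoffs, rounds=1000, epsilon=0.1):
    """Pull the arm with the best estimated payoff most of the time,
    but keep exploring the others with probability epsilon."""
    estimates = [0.0] * len(true_payoffs)
    counts = [0] * len(true_payoffs)
    for _ in range(rounds):
        if random.random() < epsilon:
            arm = random.randrange(len(true_payoffs))                        # explore
        else:
            arm = max(range(len(true_payoffs)), key=estimates.__getitem__)   # exploit
        reward = random.gauss(true_payoffs[arm], 1.0)                        # noisy observed ROI
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]            # running mean
    return estimates

print([round(e, 2) for e in epsilon_greedy([1.0, 2.5, 0.5])])
```

The spread bets are the exploration pulls: they cost a little expected value now in exchange for better estimates later, which is roughly the intuition behind not going all-in on one estimated-best option.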
There are more basic desires as well. The warm fuzzy feeling may go towards attracting the opposite sex. I had my testosterone reduced, first through drugs, and then through castration. Did I overcome biases and become "less wrong"? Yes, I did. Any questions are welcome.
But in my own case, since I already work in the nonprofit sector, the further question arises as to whether I could have better employed the same sixty seconds in a more specialized way, to bring greater benefit to others.
You're assuming that time, rather than stamina, is the limiting factor to the amount of work you can get done in a given day, but if that's so then sleeping is the hugest time waste ever. (Or that telling someone that their trunk is open costs as much stamina as the same amount of time spent on your day job.)
(And then there are acausal...
I wouldn't state the motivation for a "diverse charity portfolio" as positively desiring warm fuzzies-- rather, I think the aversion to a mixed set (note that I doubt we would usually want an only-hands-on set of charities-- too much work, and it would feel like pushing a boulder up a hill) is about potential exhaustion at repeatedly doing the one "most efficient" thing, to the point that you're not taking 60 seconds of mental refreshment. Psychological viability is the missing element here, causing us to intuitively sense that the proposal isn't actually best, whatever the naive utility calculation says (a calculation that included this factor would not have such problems).
the concentrated euphoria of being present in person when you turn a single human's life around
Should read "when you try to turn a single human's life around".
I have no idea what the actual percentage is, but I know there are people that are damned to suffering no matter how many helping hands they get. For reference, look at lottery winners who blow it all & end up back in poverty, or even in debt.
All this is quite right, but misses an essential point:
How can we align those three goals more effectively?
How can we use the human desire for status and warm fuzzies to maximize utilons?
And on the topic, I saved a sheep Friday. I felt all warm and fuzzy, and not just from the wool, but as I rode away I noted that most of what I know about such animals involves eating them!
Is the value of an action determined from the recipient or the giver? Using the example of telling someone their trunk is open, your cost in the action is 60 seconds and the benefit to them was... what? I suppose that would depend on the rest of the context. (Was it about to rain? Valuables in the car?) The example with the lawyer has more numbers available but the starting point of "worth" needs to be determined before the correct action can be determined.
This is only slightly relevant to this post, however. If warm fuzzies are the desired benefit, the cost/benefit ratio can completely ignore the recipient. (Technically, you are the recipient?) The same goes for status.
Yesterday:
I hold open doors for little old ladies. I can't actually remember the last time this happened literally (though I'm sure it has, sometime in the last year or so). But within the last month, say, I was out on a walk and discovered a station wagon parked in a driveway with its trunk completely open, giving full access to the car's interior. I looked in to see if there were packages being taken out, but this was not so. I looked around to see if anyone was doing anything with the car. And finally I went up to the house and knocked, then rang the bell. And yes, the trunk had been accidentally left open.
Under other circumstances, this would be a simple act of altruism, which might signify true concern for another's welfare, or fear of guilt for inaction, or a desire to signal trustworthiness to oneself or others, or finding altruism pleasurable. I think that these are all perfectly legitimate motives, by the way; I might give bonus points for the first, but I wouldn't deduct any penalty points for the others. Just so long as people get helped.
But in my own case, since I already work in the nonprofit sector, the further question arises as to whether I could have better employed the same sixty seconds in a more specialized way, to bring greater benefit to others. That is: can I really defend this as the best use of my time, given the other things I claim to believe?
The obvious defense—or perhaps, obvious rationalization—is that an act of altruism like this one acts as a willpower restorer, much more efficiently than, say, listening to music. I also mistrust my ability to be an altruist only in theory; I suspect that if I walk past problems, my altruism will start to fade. I've never pushed that far enough to test it; it doesn't seem worth the risk.
But if that's the defense, then my act can't be defended as a good deed, can it? For these are self-directed benefits that I list.
Well—who said that I was defending the act as a selfless good deed? It's a selfish good deed. If it restores my willpower, or if it keeps me altruistic, then there are indirect other-directed benefits from that (or so I believe). You could, of course, reply that you don't trust selfish acts that are supposed to be other-benefiting as an "ulterior motive"; but then I could just as easily respond that, by the same principle, you should just look directly at the original good deed rather than its supposed ulterior motive.
Can I get away with that? That is, can I really get away with calling it a "selfish good deed", and still derive willpower restoration therefrom, rather than feeling guilt about it being selfish? Apparently I can. I'm surprised it works out that way, but it does. So long as I knock to tell them about the open trunk, and so long as the one says "Thank you!", my brain feels like it's done its wonderful good deed for the day.
Your mileage may vary, of course. The problem with trying to work out an art of willpower restoration is that different things seem to work for different people. (That is: We're probing around on the level of surface phenomena without understanding the deeper rules that would also predict the variations.)
But if you find that you are like me in this aspect—that selfish good deeds still work—then I recommend that you purchase warm fuzzies and utilons separately. Not at the same time. Trying to do both at the same time just means that neither ends up done well. If status matters to you, purchase status separately too!
If I had to give advice to some new-minted billionaire entering the realm of charity, my advice would go something like this:
I would furthermore advise the billionaire that what they spend on utilons should be at least, say, 20 times what they spend on warm fuzzies—5% overhead on keeping yourself altruistic seems reasonable, and I, your dispassionate judge, would have no trouble validating the warm fuzzies against a multiplier that large. Save that the original, fuzzy act really should be helpful rather than actively harmful.
(Purchasing status seems to me essentially unrelated to altruism. If giving money to the X-Prize gets you more awe from your friends than an equivalently priced speedboat, then there's really no reason to buy the speedboat. Just put the money under the "impressing friends" column, and be aware that this is not the "altruism" column.)
But the main lesson is that all three of these things—warm fuzzies, status, and expected utilons—can be bought far more efficiently when you buy separately, optimizing for only one thing at a time. Writing a check for $10,000,000 to a breast-cancer charity—while far more laudable than spending the same $10,000,000 on, I don't know, parties or something—won't give you the concentrated euphoria of being present in person when you turn a single human's life around, probably not anywhere close. It won't give you as much to talk about at parties as donating to something sexy like an X-Prize—maybe a short nod from the other rich. And if you threw away all concern for warm fuzzies and status, there are probably at least a thousand underserved existing charities that could produce orders of magnitude more utilons with ten million dollars. Trying to optimize for all three criteria in one go only ensures that none of them end up optimized very well—just vague pushes along all three dimensions.
Of course, if you're not a millionaire or even a billionaire—then you can't be quite as efficient about things, can't so easily purchase in bulk. But I would still say—for warm fuzzies, find a relatively cheap charity with bright, vivid, ideally in-person and direct beneficiaries. Volunteer at a soup kitchen. Or just get your warm fuzzies from holding open doors for little old ladies. Let that be validated by your other efforts to purchase utilons, but don't confuse it with purchasing utilons. Status is probably cheaper to purchase by buying nice clothes.
And when it comes to purchasing expected utilons—then, of course, shut up and multiply.