Anatoly_Vorobey comments on You're Calling *Who* A Cult Leader? - Less Wrong

45 points | Post author: Eliezer_Yudkowsky 22 March 2009 06:57AM

Comment author: Anatoly_Vorobey 22 March 2009 01:42:56PM 8 points [-]

how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity.

If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Similarly with knowledge and following reading lists, ideologies and the like.

Comment author: RobinHanson 22 March 2009 02:25:46PM 17 points [-]

Yes, even with great uncertainty, you should still put all your eggs into your best basket.

Comment author: thomblake 02 April 2009 02:27:31PM 2 points [-]

Did you mean this as a general rule, or specifically about this topic?

The literal example of eggs seems to indeed work well with multiple baskets, especially if they're all equally good.

Comment author: ciphergoth 02 April 2009 03:53:21PM 12 points [-]

Specifically on this topic.

The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
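A minimal Monte Carlo sketch of this point, with hypothetical numbers (12 eggs, each basket independently dropped, losing all its eggs, with probability 0.1):

```python
import random

random.seed(0)

def avg_utility(allocations, p_loss, utility, trials=100_000):
    """Mean utility of surviving eggs when each basket is dropped
    (losing all its eggs) independently with probability p_loss."""
    total = 0.0
    for _ in range(trials):
        survived = sum(eggs for eggs in allocations
                       if random.random() > p_loss)
        total += utility(survived)
    return total / trials

p = 0.1
linear = lambda x: x            # locally flat utility curve
concave = lambda x: x ** 0.5    # diminishing returns in eggs

# Expected eggs kept is identical either way (10.8 of 12)...
one_basket_linear = avg_utility([12], p, linear)
two_baskets_linear = avg_utility([6, 6], p, linear)

# ...but under diminishing returns, splitting reduces variance and wins.
one_basket = avg_utility([12], p, concave)
two_baskets = avg_utility([6, 6], p, concave)
```

With the concave utility the split allocation comes out ahead (roughly 3.25 versus 3.12 in expectation); with linear utility, as with a small charitable donation on a locally flat utility curve, diversification buys nothing, so you simply optimise for maximum expected gain.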

Comment author: ciphergoth 22 March 2009 03:10:28PM 1 point [-]

This follows from the expected utility of the sum being the sum of the expected utility?

Comment author: Eliezer_Yudkowsky 22 March 2009 04:41:15PM 14 points [-]

It follows from the assumption that you're not Bill Gates and don't have enough money to actually shift the marginal expected utilities of the charitable investment, and from the assumption that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have their marginal expected utilities in perfect balance.
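The argument can be made concrete: if your donation is too small to move any charity's marginal utility per dollar, the objective is linear in your allocation, and a linear objective over a budget constraint is maximized at a corner. A toy sketch (the charity names and utilon figures are hypothetical):

```python
def allocate(budget, marginal_utilons_per_dollar):
    """Maximize sum_i m_i * x_i subject to sum_i x_i == budget, x_i >= 0.
    With constant marginal utilities the optimum is a corner solution:
    the entire budget goes to the single best charity."""
    best = max(marginal_utilons_per_dollar,
               key=marginal_utilons_per_dollar.get)
    return {name: (budget if name == best else 0.0)
            for name in marginal_utilons_per_dollar}

# Hypothetical utilons-per-dollar at current funding levels.
estimates = {"clean_water": 3.2, "malaria_nets": 4.1, "deworming": 3.9}
plan = allocate(1000.0, estimates)
# plan == {"clean_water": 0.0, "malaria_nets": 1000.0, "deworming": 0.0}
```

If you were large enough to exhaust the best charity's room for more funding, the marginal utilities would shift as you gave, and the corner solution would no longer hold, which is exactly the Bill Gates caveat.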

Comment author: CarlShulman 22 March 2009 04:44:02PM 6 points [-]

And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.

Comment author: ciphergoth 24 March 2009 01:19:37PM 3 points [-]

I don't see how either of these affects this result - unless you're saying it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?

Comment author: Eliezer_Yudkowsky 09 August 2009 05:57:15AM 3 points [-]

it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?

The sum of the affect raised is greater.

Comment author: ciphergoth 10 August 2009 08:47:41AM 0 points [-]

I'm afraid I don't understand; can you unpack that a bit, please? Thanks.

Comment author: SoullessAutomaton 10 August 2009 12:03:15PM 5 points [-]

Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X individuals with a given problem does not scale even remotely linearly with X. Different actions that help with distinct problems, however, sum in a much more nearly linear fashion (at least up to a point).

Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".

Comment author: orthonormal 10 August 2009 08:57:22PM 1 point [-]

Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".

Well, not when you compare them against each other, but only when each is considered on its own: it's like this phenomenon.

Comment author: arundelo 10 August 2009 12:30:43PM 1 point [-]

I think it means: the sum of the feel-good points of giving one person clean water and another a malaria net will, for most people, be higher than the feel-good points of giving two people clean water.

Comment author: ciphergoth 02 April 2009 03:49:32PM *  2 points [-]

I'd like to get right whatever it is I'm doing wrong here, so if anyone would like to comment on any problems they see with this or the parent comment (which are both scored 0) I'd be grateful for your input.

EDIT: since this was voted down but no explanation was offered, I'm assuming it's just an attack, and so I don't need to modify what I do - thanks!

Comment author: Anatoly_Vorobey 22 March 2009 05:14:45PM 1 point [-]

I suspect that the ability to visualize someone benefited by your action is often a proxy for being certain that your action actually helped someone, and that people often place additional value on that certainty. They might not be acting as perfectly rational economic agents in such cases, but I'm not sure I'd call such behavior irrational.

Comment author: steven0461 22 March 2009 02:27:41PM *  9 points [-]

But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?

Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)
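One way to see why noisy estimates don't rescue diversification: expected utilons are linear in the allocation, so E[sum of x_i * m_i] equals sum of x_i * E[m_i] no matter how wide the error bars on each m_i are. A sketch with made-up estimate samples:

```python
import statistics

def expected_utilons(allocation, estimate_samples):
    """E[sum_i x_i * m_i] = sum_i x_i * E[m_i]: linear in the
    allocation regardless of how noisy each estimate is."""
    return sum(dollars * statistics.mean(estimate_samples[name])
               for name, dollars in allocation.items())

# Hypothetical noisy utilons-per-dollar samples: charity A is
# high-variance but has the higher mean (4.0 vs 3.5).
samples = {"A": [1.0, 9.0, 2.0], "B": [3.5, 3.6, 3.4]}
budget = 100.0
concentrated = {"A": budget, "B": 0.0}
hedged = {"A": budget / 2, "B": budget / 2}
# Concentrating on the higher-mean charity maximizes expected
# utilons; A's extra variance is irrelevant to a linear objective.
```

Risk aversion would only re-enter if utility were concave in the thing being allocated, as with the reading-time example above, not when the objective is utility itself.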