Anatoly_Vorobey comments on You're Calling *Who* A Cult Leader? - Less Wrong
If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.
But if I don't - if I'm inclined to treat data on overhead and estimates of utility as very noisy sources of data, out of skepticism or experience - is it irrational to prefer several baskets?
Similarly with knowledge and following reading lists, ideologies and the like.
Yes, even with great uncertainty, you should still put all your eggs into your best basket.
Did you mean this as a general rule, or specifically about this topic?
The literal example of eggs seems to indeed work well with multiple baskets, especially if they're all equally good.
Specifically on this topic.
The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
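The argument above can be sketched numerically. This is a toy model, not data about real charities: the probabilities and payoffs below are illustrative assumptions, chosen only to show that going all-in maximizes the expected payoff while splitting merely reduces the variance.

```python
# Two hypothetical "baskets" (charities), each paying off with some
# probability per unit donated. All numbers are made up for illustration.
p_a, gain_a = 0.8, 100  # best guess: 80% chance of 100 utilons
p_b, gain_b = 0.8, 90   # runner-up: same risk, smaller payoff

def expected_payoff(split):
    """Expected payoff when fraction `split` of the donation goes to A."""
    return split * p_a * gain_a + (1 - split) * p_b * gain_b

def payoff_variance(split):
    """Variance of the payoff, assuming the two outcomes are independent."""
    var_a = (split * gain_a) ** 2 * p_a * (1 - p_a)
    var_b = ((1 - split) * gain_b) ** 2 * p_b * (1 - p_b)
    return var_a + var_b

# All-in maximizes expected payoff (roughly 80 vs 76)...
print(expected_payoff(1.0), expected_payoff(0.5))
# ...while a 50/50 split cuts the variance by more than half
# (roughly 1600 vs 724):
print(payoff_variance(1.0), payoff_variance(0.5))
```

If your utility is locally linear in payoff (the "drop in the ocean" case), only the first pair of numbers matters, so you go all-in; only if utility is concave over the stakes involved does the variance reduction buy you anything.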
This follows from the expected utility of the sum being the sum of the expected utilities?
It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.
And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.
I don't see how either of these affects this result - unless you're saying it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?
The sum of the affect raised is greater.
I don't understand, I'm afraid - can you unpack that a bit, please? Thanks.
Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X numbers of individuals with a given problem does not scale even remotely linearly with X. Different actions to help with distinct problems, however, sum in a much closer to linear fashion (at least up to some point).
Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".
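A small sketch of this scope-insensitivity point. The logarithmic shape of the warm-fuzzies curve is an illustrative assumption (the real curve is merely sublinear), but it shows the claimed effect: two people helped with distinct problems feel like more than two people helped with the same problem.

```python
import math

# Toy model: warm fuzzies from helping x people with ONE problem grow
# sublinearly (log-shaped, by assumption), while fuzzies from distinct
# problems simply add.
def fuzzies_one_cause(x):
    return math.log1p(x)  # diminishing returns within a single cause

two_people_clean_water = fuzzies_one_cause(2)
one_water_one_net = fuzzies_one_cause(1) + fuzzies_one_cause(1)

print(round(two_people_clean_water, 2))  # 1.1
print(round(one_water_one_net, 2))       # 1.39
```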
Well, not when you compare them against each other, but only when each is considered on its own: it's like this phenomenon.
I think it means: the sum of the feel-good points of giving one person clean water and another a malaria net will, for most people, be higher than the feel-good points of giving two people clean water.
I'd like to get right whatever it is I'm doing wrong here, so if anyone would like to comment on any problems they see with this or the parent comment (which are both scored 0) I'd be grateful for your input.
EDIT: since this was voted down, but I didn't receive an explanation, I'm assuming it's just an attack, and so I don't need to modify what I do - thanks!
I suspect that the ability to visualize someone benefited by your action is often a proxy for being certain that your action actually helped someone, and that people often place additional value on that certainty. They might not be acting as perfectly rational economic agents in such cases, but I'm not sure I'd call such behavior irrational.
Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)
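The distinction in this last comment can be made concrete. Below, the square-root utility function is an illustrative assumption standing in for any good with diminishing returns: a concave agent prefers a sure thing to a gamble with the same expected payoff, but since utility itself has no diminishing returns in utility, the same comparison in utility terms comes out equal.

```python
import math

# Illustrative concave utility: each extra unit of the good adds less.
def utility(wealth):
    return math.sqrt(wealth)

# A gamble with the same expected *payoff* (100) as a sure thing:
sure_thing = utility(100)                        # 10.0
gamble = 0.5 * utility(50) + 0.5 * utility(150)  # about 9.66

# Diminishing returns make the certain outcome preferable - that is
# risk aversion with respect to the underlying good:
print(sure_thing > gamble)  # True

# But measured directly in utility, there is nothing to be averse to:
# the sure thing and the gamble have identical expected utility.
sure_utils = 100
gamble_utils = 0.5 * 50 + 0.5 * 150
print(sure_utils == gamble_utils)  # True
```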