Anatoly_Vorobey comments on You're Calling *Who* A Cult Leader? - Less Wrong
I recently read an article on charitable giving which mentioned how people split up their money among many different charities to, as they put it, "maximize the effect", even though someone with this goal should donate everything to the single highest-utility charity. And this seems a bit like the example you cited where, if blue cards came up randomly 75% of the time and red cards came up 25% of the time, people would bet on blue 75% of the time even though the optimal strategy is to bet blue 100% of the time. All this seems to come from concepts like "Don't put all your eggs in one basket", which is a good general rule for things like investing but can easily break down.
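(A quick numeric check of that card example, just to make the gap concrete; the 75/25 split comes from the example above, and the rest is my own illustration, assuming bets are placed independently of the draws.)

```python
# Probability matching vs. maximizing, assuming blue comes up 75% of the time
# and bets are made independently of the draw.
p_blue = 0.75

# Always bet blue: you win exactly when the card is blue.
p_win_always_blue = p_blue                              # 0.75

# "Match" the frequencies: bet blue 75% of the time, red 25% of the time.
p_win_matching = p_blue * 0.75 + (1 - p_blue) * 0.25    # 0.625

print(p_win_always_blue, p_win_matching)
```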
I find myself having to fight this rule for a lot of things, and one of them is beliefs. If all of my opinions are Eliezer-ish, I feel like I'm "putting all my eggs in one basket", and I need to "diversify". You use book recommendations as a reductio, but I remember reading about half the books on your recommended reading list, thinking "Does reading everything off of one guy's reading list make me a follower?" and then thinking "Eh, as soon as he stops recommending such good books, I'll stop reading them."
The other thing is the Outside View summed up by the proverb "If two people think alike, one of them isn't thinking." In the majority of cases I observe where a person conforms to all of the beliefs held by a charismatic leader of a cohesive in-group, and keeps praising that leader's incredible insight, that person is a sheeple and that leader has a cult (see: religion, Objectivism, various political movements). I respect the Outside View enough that I have trouble replacing it with the Inside View that although I agree with Eliezer about nearly everything and am willing to say arbitrarily good things about him, I'm certainly not a cultist because I'm coming to my opinions based on Independent Logic and Reason. I don't know any way of solving this problem except the hard way.
I tried to start a Hofstadter cult once. The first commandment was "Thou shalt follow the first commandment." The second commandment was "Thou shalt follow only those even-numbered commandments that do not exhort thee to follow themselves." I forget the other eight. Needless to say it didn't catch on.
If I have complete or near-complete trust in the information available to me about the charity's utility, as well as its short-term sustainability, that seems like the right decision to make.
But if I don't - if, out of skepticism or experience, I'm inclined to treat overhead figures and utility estimates as very noisy data - is it irrational to prefer several baskets?
Similarly with knowledge and following reading lists, ideologies and the like.
Yes, even with great uncertainty, you should still put all your eggs into your best basket.
Did you mean this as a general rule, or specifically about this topic?
The literal example of eggs seems to indeed work well with multiple baskets, especially if they're all equally good.
Specifically on this topic.
The expected number of eggs lost is least if you choose the best basket and put all your eggs in it, but because of diminishing returns, you're better off sacrificing a few eggs to reduce the variance. However, your charitable donations are such a drop in the ocean that the utility curve is locally pretty much flat, so you just optimise for maximum expected gain.
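(To make that concrete, here is a toy version with numbers I made up: two baskets, one slightly safer, and a square-root utility standing in for diminishing returns. The all-in allocation maximizes expected eggs, but the split maximizes expected utility; when the utility curve is locally flat, as with a small charitable donation, maximizing expected gain is all that's left.)

```python
import math

# Toy numbers (mine, not from the thread): the better basket drops all its eggs
# with probability 0.10, the worse one with probability 0.20; you have 10 eggs.
p_fail = {"best": 0.10, "worse": 0.20}

def expected_eggs(alloc):
    """Expected number of surviving eggs for an allocation {basket: egg_count}."""
    return sum(n * (1 - p_fail[b]) for b, n in alloc.items())

def expected_utility(alloc, u=math.sqrt):
    """Expected utility of surviving eggs under a concave (diminishing-returns) utility."""
    total = 0.0
    for best_ok in (0, 1):          # enumerate the four fail/survive outcomes
        for worse_ok in (0, 1):
            prob = ((1 - p_fail["best"]) if best_ok else p_fail["best"]) * \
                   ((1 - p_fail["worse"]) if worse_ok else p_fail["worse"])
            survived = alloc.get("best", 0) * best_ok + alloc.get("worse", 0) * worse_ok
            total += prob * u(survived)
    return total

all_in = {"best": 10}
split = {"best": 7, "worse": 3}

print(expected_eggs(all_in), expected_eggs(split))        # 9.0 vs 8.7: all-in wins on expectation
print(expected_utility(all_in), expected_utility(split))  # ~2.85 vs ~2.89: the split wins on utility
```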
This follows from the expected utility of the sum being the sum of the expected utilities?
It follows from the assumption that you're not Bill Gates, don't have enough money to actually shift the marginal expected utilities of the charitable investment, and that charities themselves do not operate in an efficient market for expected utilons, so that the two top charities do not already have marginal expected utilities in perfect balance.
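(Sketching that as a toy calculation, with made-up per-dollar figures: if your budget can't move the marginal utilities, your total impact is locally linear in the allocation, and the corner solution wins.)

```python
# Toy numbers (mine): marginal utility per dollar, assumed fixed because the
# donation is too small to shift either charity's margin.
marginal_utility = {"charity_A": 3.0, "charity_B": 2.9}   # utilons per dollar
budget = 1000.0

def total_impact(allocation):
    """Total utilons bought when marginal utilities stay constant over the allocation."""
    return sum(marginal_utility[c] * dollars for c, dollars in allocation.items())

print(total_impact({"charity_A": budget}))                      # 3000.0
print(total_impact({"charity_A": 500.0, "charity_B": 500.0}))   # 2950.0: any split does worse
```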
And that you care only about the benefits you confer, not the log of the benefits, or your ability to visualize someone benefited by your action, etc.
I don't see how either of these affect this result - unless you're saying it's easier to visualise one person with clean water and another with a malaria net than it is two people with clean water?
The sum of the affect raised is greater.
I don't understand, I'm afraid; can you unpack that a bit, please? Thanks.
Consider scope insensitivity. The amount of "warm fuzzies" one gets from helping X individuals with a given problem does not scale even remotely linearly with X. Different actions that help with distinct problems, however, sum in a much closer-to-linear fashion (at least up to some point).
Ergo, "one person with clean water and another with a malaria net" feels intuitively like you're doing more than "two people with clean water".
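(A crude way to see the asymmetry, using a square-root function as a stand-in for whatever strongly concave shape the warm fuzzies actually follow; the function and numbers are mine.)

```python
import math

def fuzzies(people_helped):
    """Warm fuzzies from helping N people with ONE problem; deliberately concave,
    standing in for scope insensitivity."""
    return math.sqrt(people_helped)

same_problem = fuzzies(2)                     # two people, one problem: ~1.41
distinct_problems = fuzzies(1) + fuzzies(1)   # one person per problem: 2.0

print(same_problem, distinct_problems)        # the split across problems feels bigger
```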
I think it means: the sum of the feel-good points of giving one person clean water and another a malaria net will, for most people, be higher than the feel-good points of giving two people clean water.
I'd like to get right whatever it is I'm doing wrong here, so if anyone would like to comment on any problems they see with this or the parent comment (which are both scored 0) I'd be grateful for your input.
EDIT: since this was voted down without an explanation, I'm assuming it's just an attack, and so I don't need to modify what I do - thanks!
I suspect that the ability to visualize someone benefited by your action is often a proxy for being certain that your action actually helped someone, and that people often place additional value on that certainty. They might not be acting as perfectly rational economic agents in such cases, but I'm not sure I'd call such behavior irrational.
Very much so. Rational behavior is to maximize expected utility. When rational agents are risk-averse, they are risk-averse with respect to something that suffers from diminishing returns in utility, so that the possibility of negative surprises outweighs the possibility of positive surprises. "Time spent reading material from good sources" is a plausible example of something that has diminishing returns in utility so you want to spread it among baskets. Utility itself does not suffer from diminishing returns in utility. (Support to a charity might, but only if it's large relative to the charity. Or large relative to the things the charity might be doing to solve the problem it's trying to solve, I guess.)
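(A toy illustration of where the risk aversion lives, with numbers and a square-root utility of my own choosing: the agent maximizes expected utility throughout, and the preference for the sure thing comes entirely from the curvature over the underlying good, not from any aversion to variance in utility itself.)

```python
import math

def u(wealth):
    """Concave utility: the underlying good has diminishing returns."""
    return math.sqrt(wealth)

# Gamble: 50/50 chance of 0 or 100 units. Sure thing: 50 units. Same expected value.
expected_utility_gamble = 0.5 * u(0) + 0.5 * u(100)   # 5.0
utility_sure_thing = u(50)                            # ~7.07

print(expected_utility_gamble, utility_sure_thing)    # the sure thing wins under concave utility
```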