Comment author: DanArmak · 06 October 2016 11:14:30PM · 2 points
I'm confused by this post, and don't quite understand what its argument is.
Yes, emotional empathy does not optimize effective altruism, or your moral idea of good. But this is true of lots of emotions, desires and behaviors, including morally significant ones. You're singling out emotional empathy, but what makes it special?
If I buy an expensive gift for my father's birthday because I feel that fulfills my filial duty, you probably wouldn't tell me to de-emphasize filial piety and focus more on cognitive empathy for distant strangers. In general, I don't expect you to suggest people should spend all their resources on EA. Usually people designate a donation amount and then optimize the donation target, and it doesn't much matter what fuzzies you're spending your non-donation money on. So why de-fund emotional empathy in particular? Why not purchase fuzzies by buying treats for kittens, rather than by reducing farm meat consumption?
Maybe your point is that emotional empathy feels morally significant, and when we act on it, we can feel that we have fulfilled our moral obligations. And then we would spend less "moral capital" on doing good. If so, you should want to de-fund all moral emotions, starting with most forms of love, loyalty, cleanliness, and so on, as long as this doesn't compromise your motivations for doing good or your resources. Someone who genuinely feels that doing good is their biggest moral concern would be a more effective altruist! But I don't think you're really suggesting e.g. not loving your family any more than distant strangers.
Maybe your main point is that empathy is a bias relative to your conscious goals:
When choosing a course of action that will make the world a better place, the strength of your empathy for victims is more likely to lead you astray than to lead you truly.
But the same can be said of pretty much any strong, morally entangled emotion. Maybe you don't want to help people who committed what you view as a moral crime, or who, if helped, will go on to do things you view as bad, or whose being helped would send a signal to a third party that you don't want sent. Discounting such emotions may well match your idea of doing good. But why single out emotional empathy?
If people have an explicit definition of the good they want to accomplish, they can ignore all emotions equally. If they don't have an explicit definition, then it's just a matter of which emotions they follow in the moment, and I don't see why this one is worse than the others.
Yes, emotional empathy does not optimize effective altruism, or your moral idea of good. But this is true of lots of emotions, desires and behaviors, including morally significant ones. You're singling out emotional empathy, but what makes it special?
I agree with you that nothing makes them special. But you seem to view this as a reductio ad absurdum. Doing the same for all other emotions which might bias us or get in the way of doing what’s moral would not lead to a balanced lifestyle, to say the least.
But we could just as easily bite that bullet. Why should we expect optimizing purely for morality to lead to a balanced lifestyle? Why wouldn't the 80/20 rule apply to moral concerns? Under this view, one would do best to amputate most of the parts of one's mind that make one human, and add parts to become a morality maximizer.
Obviously this would cause serious problems in reality, and may not actually be the best way to maximize morality even if it were possible. This is just a sort of spherical-cow-in-a-vacuum concept.
Maybe your point is that emotional empathy feels morally significant, and when we act on it, we can feel that we have fulfilled our moral obligations. And then we would spend less "moral capital" on doing good.
This actually has a name. It's called moral licensing.
If the 80/20 rule applies to moral concerns, why do you think that getting rid of empathy is part of the 20% that does 80%?