
Lumifer comments on Putanumonit - Discarding empathy to save the world - Less Wrong Discussion

7 Post author: Jacobian 06 October 2016 07:03AM

Comments (37)


Comment author: Lumifer 10 October 2016 02:33:31PM * 1 point

The argument I was making, or maybe just implying, is a version of the argument for deontological ethics. It rests on two lemmas: (1) you will make mistakes; (2) no one is a villain in his own story.

To unroll a bit, people who do large-scale evil do not go home to stroke a white cat and cackle at their own evilness. They think they are the good guys and that they do what's necessary to achieve their good goals. We think they're wrong, but that's an outside view. As has been pointed out, the road to hell is never in need of repair.

Given this, it's useful to have firebreaks: boundaries that serve to stop really determined people who think they're doing good from doing too much evil. A major firebreak is emotional empathy -- it serves as a check on runaway optimization processes, which are, of course, subject to the Law of Unintended Consequences.

And, besides, I like humans more than I like optimization algorithms :-P

Comment author: Jacobian 11 October 2016 01:11:07PM 0 points

How about: doing evil (even inadvertently) requires coercion. Slavery, Nazis, tying a witch to a stake, you name it. Nothing effective altruists currently do is coercive (except to mosquitoes), so we're probably good. However, if we come up with a world improvement plan that requires coercing somebody, we should A) hear their take on it and B) empathize with them for a bit. This isn't a 100% perfect plan, but it seems to be a decent framework.

Comment author: Lumifer 11 October 2016 06:51:05PM 1 point

I agree with gjm that evil does not necessarily require coercion. Contemplate, say, instigating a lynching.

The reason EAs don't do any coercion is that they don't have any power. But I don't see anything in their line of reasoning that would stop them from coercing other people if they do get some power. They are not libertarians.