Vladimir_Nesov comments on [Link] Review of "Doing Good Better" - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I'd like to play the devil's advocate here for a moment. I'm not entirely sure how I should respond to the following argument.
That raises the question: people often disagree on what a better state of things would be. (And of course, each side says that those who disagree with it are not "decent".)
Don't ignore the fact that people agree on only a very small set of altruistic acts. And even then, many people are neutral about them, or nearly so, or they support them only if they ignore the lost opportunities of, e.g., giving money to them rather than to other, less fortunate people.
The great majority of things people want, they don't want in common. Do you want to improve technology and medicine, or prevent unfriendly AI, or convert people to Christianity, or allow abortion, or free slaves, or prevent use of birth control, or give women equal legal rights, or make atheism legal, or prevent the disrespect and destruction of holy places, or remove speech restrictions, or allow free market contracts? Name any change you think a great historical moral advance, and you'll find people who fought against it.
Most great causes have people fighting both for and against them. This is unsurprising: when everyone is on the same side, the problem tends to be resolved quickly. The only things everyone agrees are bad, but which keep existing for decades, are those that people are apathetic about - not the greatest moral causes of the day.
Does selecting causes for the widest moral consensus mean selecting the most inoffensive ones? If not, why not? Do you believe that impersonal and accidental forces of history generate as much misery, which you can fight against, as the deliberate efforts of people who disagree with you? Wouldn't that be surprising if it were true?
Do you disagree with the point you are making, or merely with the pro-book/anti-book side it happens to fall on? I think being a devil's advocate is about the former, not the latter. (There is also the move of steelmanning a flaw - looking for a story that paints it as clearly bad, to counteract the drive to excuse it - which might be closer to what you meant.)
Btw, Scott recently wrote a post about issues with admitting controversial causes in altruism.
Like I said, I'm not sure whether I agree with it yet. It's novel to me, and it seems valid (up to empirical data I don't have yet), but I'm fairly sure I haven't thought through all its implications, or the other theories in its class. That's why I'm seeking other opinions, particularly from anyone who has encountered this idea before.
"Devil's advocate" was referring to the fact that this is an argument against EA, while I am generally in favor of EA.