Solvent comments on Morality Isn't Logical - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (85)
Yeah, I'm pretty sure I (and most LWers) don't agree with you on that one, at least in the way you phrased it.
You think they'd prefer that the guy that caused everyone else in the universe to suffer didn't suffer himself?
Here's an old Eliezer quote on this:
It's pretty hard to argue about this if our moral intuitions disagree. But at least, you should know that most people on LW disagree with you on this intuition.
EDIT: As ArisKatsaris points out, I don't actually have any source for the "most people on LW disagree with you" bit. I've always thought that not wanting harm to come to anyone as an instrumental value was a pretty obvious, standard part of utilitarianism, and 62% of LWers are consequentialist, according to the 2012 survey. The post "Policy Debates Should Not Appear One-Sided" is fairly highly regarded, and it espouses a related view: that people don't deserve harm for their stupidity.
Also, what those people would prefer isn't necessarily what our moral system should prefer; humans are petty and short-sighted.
Thank you, that's a good start.
Yes, I had concluded that EY was anti-retribution. I hadn't concluded that he had carried the day on that point.
I don't think vengeance and retribution are "ideas" that people had to come up with - they're central moral motivations. "A social preference for which we punish violators" gets at 80% of what morality is about.
Some may disagree about the intuition, but I'd note that even EY had to "renounce" all hatred, which implies to me that he had the impulse for hatred (retribution, in this context) in the first place.
This seems like it has the makings of an interesting poll question.
I agree. Let's do that. You're consequentialist, right?
I'd phrase my opinion as "I have terminal value for people not suffering, including people who have done something wrong. I acknowledge that sometimes causing suffering might have instrumental value, such as imprisonment for crimes."
How do you phrase yours? If I were to guess, it would be "I have a terminal value which says that people who have caused suffering should suffer themselves."
I'll make a Discussion post about this after I get your refinement of the question.
I'd suggest the following two phrasings:
Perhaps also add a third choice:
What do you mean by "utilitarianism"? The word has two different common meanings around here: any type of consequentialism, and the specific type of consequentialism that uses "total happiness" as a utility function. This sentence appears to be designed to confuse the two meanings.
That is most definitely not the main point of that post.
Yeah, my mistake. I'd never run across any other versions of consequentialism apart from utilitarianism (except for Clippy, of course). I suppose caring only for yourself might count? But do you seriously think that the majority of those consequentialists aren't utilitarian?
Well, even Eliezer's version of consequentialism isn't simple utilitarianism, for starters.
It's a kind of utilitarianism. I'm including act utilitarianism and desire utilitarianism and preference utilitarianism and whatever in utilitarianism.
Ok, what is your definition of "utilitarianism"?
[citation needed]
I edited my comment to include a tiny bit more evidence.
I would, all else being equal. Suffering is bad.