Amanojack comments on Conceptual Analysis and Moral Theory - Less Wrong
I don't claim moral sentiments are correct, but simply that a person's moral sentiment is their moral sentiment. They feel some emotions, and that's all I know. You seem to be saying there is some way those emotions can be correct or incorrect, but in what sense? Or perhaps a clearer way to ask the question is, "What disadvantage can I anticipate if my emotions are incorrect?"
An emotion, such as a feeling of elation or disgust, is not correct or incorrect per se; but an emotion per se is no basis for a moral sentiment, because a moral sentiment has to be about something. You could think gay marriage is wrong because homosexuality disgusts you, or you could feel serial killing is good because it elates you, but that doesn't mean the conclusions you are coming to are right. It may be a cast-iron fact that you have those particular sentiments, but that says nothing about the correctness of their content, any more than any opinion you entertain is automatically correct.
ETA: The disadvantages you can expect if your emotions are incorrect include being in the wrong whilst feeling you are in the right. Much as if you were entertaining incorrect opinions.
What if I don't care about being wrong (if that's really the only consequence I experience)? What if I just want to win?
Then you are, or are likely to be, morally in the wrong. That is of course possible. You can choose to do wrong. But it doesn't constitute any kind of argument. Someone can elect to ignore the roundness of the world for some perverse reason, but that doesn't make "the world is round" false or meaningless or subjective.
Indeed it is not an argument. Yet I can still say, "So what?" I am not going to worry about something that has no effect on my happiness. If there is some way it would have an effect, then I'd care about it.
The difference is, believing "The world is round" affects whether I win or not, whereas believing "I'm morally in the wrong" does not.
That is apparently true in your hypothetical, but it's not true in the real world. Just as the roundness of the world has consequences, the wrongness of an action has consequences. For example, if you kill someone, then your fate is going to depend (probabilistically) on whether you were in the right (e.g. he attacked and you were defending your life) or in the wrong (e.g. you murdered him when he caught you burgling his house). The more in the right you were, then, ceteris paribus, the better your chances are.
You're interpreting "I'm morally in the wrong" to mean something like, "Other people will react badly to my actions," in which case I fully agree with you that it would affect my winning. Peterdjones apparently does not mean it that way, though.
Actually I am not. I am interpreting "I'm morally wrong" to mean something like, "I made an error of arithmetic in an area where other people depend on me."
An error of arithmetic is an error of arithmetic regardless of whether any other people catch it, and regardless of whether any other people react badly to it. It is not, however, causally disconnected from their reaction, because, even though an error of arithmetic is what it is regardless of people's reaction to it, nevertheless people will probably react badly to it if you've made it in an area where other people depend on you. For example, if you made an error of arithmetic in taking a test, it is probably the case that the test-grader did not make the same error of arithmetic and so it is probably the case that he will react badly to your error. Nevertheless, your error of arithmetic is an error and is not merely getting-a-different-answer-from-the-grader. Even in the improbable case where you luck out and the test grader makes exactly the same error as you and so you get full marks, nevertheless, you did still make that error.
Even if everyone except you wakes up tomorrow and believes that 3+4=6, whereas you still remember that 3+4=7, nevertheless in many contexts you had better not switch to what the majority believe. For example, if you are designing something that must stand up, like a building or a bridge, you had better get your math right: you had better correctly add 3+4=7 in the course of designing the edifice if that sum is ever called for in calculating whether the structure will stand.
If humanity divides into two factions, one faction of which believes that 3+4=6 and the other of which believes that 3+4=7, then the latter faction, the one that adds correctly, will in all likelihood over time prevail on account of being right. This is true even if the latter group starts out in the minority. Just imagine what sort of tricks you could pull on people who believe that 3+4=6. Because of the truth of 3+4=7, eventually people who are aware of this truth will succeed and those who believe that 3+4=6 will fail, and over time the vast majority of society will once again come to accept that 3+4=7.
And similarly with morality.
In sum, you seem to be saying that morality involves arithmetic, and being wrong about arithmetic can hurt me, so being wrong about morality can hurt me.
There's no particular connection between morality and arithmetic that I'm aware of. I brought up arithmetic to illustrate a point. My hope was that arithmetic is less problematic, less apt to lead us down philosophical blind alleys, so that by using it to illustrate a point I wasn't opening up yet another can of worms.
Nothing's jumping out at me that would seriously impact a group's effectiveness from day to day. I rarely find myself needing to add three and four in particular, and even more rarely in high-stakes situations. What did you have in mind?
Suppose you think that 3+4=6.
I offer you the following deal: give me $3 today and $4 tomorrow, and I will give you a 50 cent profit the day after tomorrow, by returning to you $6.50. You can take as much advantage of this as you want. In fact, if you like, you can give me $3 this second, $4 in one second, and in the following second I will give you back all your money plus 50 cents profit - that is, I will give you $6.50 in two seconds.
Since you think that 3+4=6, you will jump at this amazing deal.
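The deal above can be sketched as a quick calculation (a hypothetical illustration; the dollar amounts are taken from the example, and the variable names are my own):

```python
# Money pump against someone who believes 3 + 4 = 6.
# Each round the mark hands over $3, then $4, and receives $6.50 back.
believed_total = 6        # what the mark thinks 3 + 4 equals
actual_total = 3 + 4      # what 3 + 4 actually equals: 7
payout = 6.50

perceived_profit = payout - believed_total  # looks like +$0.50 to the mark
actual_loss = actual_total - payout         # really a $0.50 loss per round

rounds = 1000
print(f"Perceived profit per round: ${perceived_profit:.2f}")
print(f"Actual loss over {rounds} rounds: ${actual_loss * rounds:.2f}")
```

Because the mark perceives a profit on every round, he will happily repeat the exchange indefinitely, losing fifty cents each time: a direct, material cost of the arithmetical error.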
Whether someone is judged right and wrong by others has consequences, but the people doing the judging might be wrong. It is still an error to make morality justify itself in terms of instrumental utility, since there are plenty of examples of things that are instrumentally right but ethically wrong, like improved gas chambers.
Actually being in the right increases your probability of being judged to be in the right. Yes, the people doing the judging may be wrong, and that is why I made the statement probabilistic. This can be made blindingly obvious with an example. Go to a random country and start gunning down random people in the street. The people there will, with probability so close to 1 as makes no real difference, judge you to be in the wrong, because you of course will be in the wrong.
There is a reason why people's judgment is not far off from right. It's the same reason that people's ability to do basic arithmetic when it comes to money is not far off from right. Someone who fails to understand that $10 is twice $5 (or rather the equivalent in the local currency) is going to be robbed blind and his chances of reproduction are slim to none. Similarly, someone whose judgment of right and wrong is seriously defective is in serious trouble. If someone witnesses a criminal lunatic gun down random people in the street and then walks up to him and says, "nice day", he's a serious candidate for a Darwin Award. Correct recognition of evil is a basic life skill, and any human who does not have it will be cut out of the gene pool. And so, if you go to a random country and start killing people randomly, you will be neutralized by the locals quickly. That's a prediction. Moral thought has predictive power.
The only reason anyone can get away with the mass murder that you allude to is that they have overwhelming power on their side. And even they did it in secret, as I recall learning, which suggests that powerful as they were, they were not so powerful that they felt safe murdering millions openly.
Morality is how a human society governs itself in which no single person or organized group has overwhelming power over the rest of society. It is the spontaneous self-regulation of humanity. Its scope is therefore delimited by the absence of a person or organization with overwhelming power. Even though just about every place on Earth has a state, since it is not a totalitarian state there are many areas of life in which the state does not interfere, and which are therefore effectively free of state influence. In these areas of life humanity spontaneously self-regulates, and the name of the system of spontaneous self-regulation is morality.
Except when the evil guys take over. Then you are in trouble if you oppose them.
That doesn't affect my point. If there are actual or conceptual circumstances where instrumental good diverges from moral good, the two cannot be equated.
Why would it be wrong if they do? Your theory of morality seems to be in need of another theory of morality to justify it.
Which is why the effective scope of morality is limited by concentrated power, as I said.
I did not equate moral good with instrumental good in the first place.
I didn't say it would be wrong. I was talking about making predictions. The usefulness of morality in helping you to predict outcomes is limited by concentrated power.
On the contrary, my theory of morality is confirmed by the evidence. You yourself supplied some of the evidence. You pointed out that a concentration of power creates an exception to the prediction that someone who guns down random people will be neutralized. But this exception fits with my theory of morality, since my theory of morality is that it is the spontaneous self-regulation of humanity. Concentrated power interferes with self-regulation.
It sounds to me like you're describing the ability to recognize danger, not evil, there.
Say that your hypothetical criminal lunatic manages to avoid the police, and goes about his life. Later that week, he's at a buffet restaurant, acting normally. Is he still evil? Assuming nobody recognizes him from the shooting, do you expect the other people using the buffet to react unusually to him in any way?
It's not either/or. There is no such thing as a bare sense of danger. For example, if you are about to drive your car off a cliff, hopefully you notice in time and stop. In that case, you've sensed danger - but you also sensed the edge of a cliff, probably with your eyes. Or if you are about to drink antifreeze, hopefully you notice in time and stop. In that case, you've sensed danger - but you've also sensed antifreeze, probably with your nose.
And so on. It's not either/or. You don't either sense danger or sense some specific thing which happens to be dangerous. Rather, you sense something that happens to be dangerous, and because you know it's dangerous, you sense danger.
Chances are higher than average that if he was a criminal lunatic a few days ago, he is still a criminal lunatic today.
Obviously not, because if you assume that people fail to perceive something, then it follows that they will behave in a way that is consistent with their failure to perceive it. Similarly, if you fail to notice that the antifreeze that you're drinking is anything other than fruit punch, then you can be expected to drink it just as if it were fruit punch.
The fact that you are amoral does not mean there is anything wrong with morality, and is not an argument against it. You might as well be saying "there is a perfectly good rational argument that the world is round, but I prefer to be irrational".
That doesn't constitute an argument unless you can explain why your winning is the only thing that should matter.
Yeah, I said it's not an argument. Yet again I can only ask, "So what?" (And this doesn't make me amoral in the sense of not having moral sentiments. If you tell me it is wrong to kill a dog for no reason, I will agree, because I will interpret that as, "We both would be disgusted at the prospect of killing a dog for no reason." But you seem to be saying there is something more.)
The wordings "affect my winning" and "matter" mean the same thing to me. I take "The world is round" seriously because it matters for my actions. I do not see how "I'm morally in the wrong"* matters for my actions. (Nor how "I'm pan-galactically in the wrong" matters. )
*EDIT: in the sense that you seem to be using it (quite possibly because I don't know what that sense even is!).
So being wrong and not caring you are in the wrong is not the same as being right.
Yes. I am saying that moral sentiments can be wrong, and that that can be realised through reason, and that getting morality right matters more than anything.
But they don't mean the same thing. Morality matters more than anything else by definition. You don't prove anything by adopting an idiosyncratic private language.
The question is whether mattering for your actions is morally justifiable.
Yet I still don't care, and by your own admission I suffer not in the slightest from my lack of caring.
Zorg says that getting pangalacticism right matters more than anything. He cannot tell us why it matters, but boy it really does matter.
Which would be? If you refer me to the dictionary again, I think we're done here.
The fact that you are not going to worry about morality does not make morality a) false, b) meaningless, or c) subjective. Can I take it you are no longer arguing for any of claims a), b), or c)?
You have not succeeded in showing that winning is the most important thing.
I've never argued (a). I'm still arguing (actually just informing you) that the words "objective morality" are meaningless to me, and I'm still arguing (c), but only in the sense that it is equivalent to (b): in other words, I can only await some argument that morality is objective. (But first I'd need a definition!)
I'm using the word "winning" as a synonym for "getting what I want," and I understand "the most important thing" to mean "what I care about most." And I mean "want" and "care about" in a way that makes it tautological. Keep in mind I want other people to be happy, not suffer, etc. Nothing either of us has argued so far indicates we would necessarily have different moral sentiments about anything.
You are not actually being all that informative, since there remains a distinct suspicion that when you say some X is meaningless-to-you, that is a proxy for I-don't-agree-with-it. I notice throughout these discussions that you never reference accepted dictionary definitions as a basis for meaningfulness, but instead always offer some kind of idiosyncratic personal testimony.
What is wrong with dictionary definitions?
That doesn't affect anything. You still have no proof for the revised version.
Other people out there in the non-existent Objective World?
I don't think moral anti-realists are generally immoral people. I do think it is an intellectual mistake, whether or not you care about that.
Zorg said the same thing about his pan-galactic ethics.
Did you even read the post we're commenting on?
Wait, you want proof that getting what I want is what I care about most?
Read what I wrote again.
Read.