LucasSloan comments on Morality and relativistic vertigo - Less Wrong

40 points | Post author: Academian | 12 October 2010 02:00AM




Comment author: LucasSloan 12 October 2010 03:22:30AM 10 points

I have changed my mind about my values due to noticing that my values were inconsistent.

Comment author: Jayson_Virissimo 12 October 2010 04:36:05AM 5 points

Same here (at least twice).

Comment author: MBlume 13 October 2010 02:55:53AM 2 points

Yeah, but that makes you really really weird.

Comment author: LucasSloan 13 October 2010 03:29:13AM 4 points

For which I am truly grateful.

Comment author: knb 12 October 2010 06:17:56AM 2 points

First of all, that was intended as a general statement, not an absolute description of every case. Experiments have been done on people to see if, for example, they stop being opposed to incest in fictional scenarios where the incest is stated outright to be harmless.

Before the scenario was presented, people offered utilitarian justifications for the incest taboo, but even when those were stripped away, they insisted that incest is still "just wrong". My point is that this is what generally happens when someone points out incoherency in a moral system. People generally switch to offering an axiomatic rationalization for their moral sentiments instead of a utilitarian one.

Also, I have to say:

I have changed my mind about my values due to noticing that my values were inconsistent.

Do you mean that you made a judgment elevating one value above another in cases where they conflict? Or do you mean you actually gained a new value? It seems like you must have used some sort of higher-level value preference to make that meta-level moral judgment.

Comment author: LucasSloan 12 October 2010 07:04:31AM 5 points

Do you mean that you made a judgment elevating one value above another you had in cases where they conflict? Or do you mean you actually gained a new value? It seems like you must have used some sort of higher level value preference to make that meta-level moral judgment.

I noticed that my values were inconsistent, and I decided that one of them needed to be expunged. I removed a "value" that had been created at too high a level of abstraction, one which conflicted with the rest of my values and whose actual, important content could be derived from lower level moral concepts.

Comment author: Tyrrell_McAllister 12 October 2010 07:01:30PM 1 point

First of all, that was intended as a general statement, not an absolute description of every case. Experiments have been done on people to see if, for example, they stop being opposed to incest in fictional scenarios where the incest is stated outright to be harmless.

Before the scenario was presented, people offered utilitarian justifications for the incest taboo, but even when those were stripped away, they insisted that incest is still "just wrong". My point is that this is what generally happens when someone points out incoherency in a moral system. People generally switch to offering an axiomatic rationalization for their moral sentiments instead of a utilitarian one.

Such a person isn't going against Academian's advice. They've been led through the correct procedure of analysis, though they've only gone part of the way. They've found evidence that, all else being equal, it's better not to give in to a desire to commit incest. The incest itself is what they find bad, not some consequence of it. You haven't identified an incoherence in their final position.

To continue the analysis, they should ask what bad consequences would follow from refraining from the incest, and check whether that badness outweighs the badness of committing it. They should be able to identify the hypothetical scenarios where refraining is worse than acting.

In the end, they may decide that people shouldn't commit incest in most typical situations, even when there are no distinct bad consequences of the incest. Whether or not you agree with them, they would still be vastly more reflective about their morality than most people are. It would be great if more people were so reflective, even if they ended up disagreeing with you about which things are harms-in-themselves.