blacktrance comments on Strong moral realism, meta-ethics and pseudo-questions. - Less Wrong

18 [deleted] 31 January 2010 08:20PM


Comment author: blacktrance 29 May 2014 04:42:01PM 0 points [-]

An objective morality machine would tell you what you should do, not tell you how to satisfy your values

Why must the two be mutually exclusive? Why can't morality be about satisfying your values? One could say that morality properly understood is nothing more than the output of decision theory, or that outputs of decision theory that fall in a certain area labeled "moral questions" are morality.

Comment author: komponisto 29 May 2014 07:24:30PM 1 point [-]

Why can't morality be about satisfying your values?

Because that isn't how the term "morality" is typically used by humans. The "morality police" found in certain Islamic countries aren't life coaches. The Ten Commandments aren't conditional statements. When people complain about the decaying moral fabric of society, they're not talking about a decline in introspective ability.

Inherent to the concept of morality is the external imposition of values. (Not just decisions, because they also want you to obey the rules when they're not looking, you see?) Sociologically speaking, morality is a system for getting people to do unfun things by threatening ostracization.

Decision theory (and meta-decision-theory etc.) does not exist to analyze this concept (which is not designed for agents); it exists to replace it.

Comment author: bogus 31 May 2014 02:59:32PM *  0 points [-]

Because that isn't how the term "morality" is typically used by humans. The "morality police" found in certain Islamic countries aren't life coaches. The Ten Commandments aren't conditional statements. ... Inherent to the concept of morality is the external imposition of values.

Morality is about all of these things, and more besides. Although "outer" morality as embodied in moral codes and moral exemplars is definitely important, if there were no inner values for humans to care about in the first place, no one would be going around imposing them on others, or even debating them in any way.

And it is a fact about the world that most basic moral values are shared among human societies. Morality may or may not be objective, but it is definitely intersubjective in a way that looks 'objective' to the casual observer.

Comment author: blacktrance 29 May 2014 08:39:47PM *  0 points [-]

"Morality" is used by humans in unclear ways and I don't know how much can be gained from looking at common usage. It's more sensible to look at philosophical ethical theories rather than folk morality - and there you'll find that moral internalism and ethical egoism are within the realm of possible moralities.

Comment author: TheAncientGeek 29 May 2014 08:34:18PM *  0 points [-]

Morality done right is about the voluntary and mutual adjustment of values ( or rather actions expressing them).

Morality done wrong can go two ways. One failure mode is hedonism, where the individual takes no notice of the preferences of others; the other is authoritarianism, where "society" (rather, its representatives) imposes values that no one likes or has a say in.

Comment author: TheAncientGeek 29 May 2014 04:53:09PM *  -1 points [-]

Note the word objective.

Comment author: blacktrance 29 May 2014 05:05:21PM *  0 points [-]

An objective morality machine would tell you the One True Objective Thing TheAncientGeek Should Do, given your values, but this thing need not be the same as The One True Objective Thing Blacktrance Should Do. The calculations it performs are the same in both cases (which is what makes it objective), but the outputs are different.
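The "same calculation, different outputs" point can be made concrete with a toy Python sketch (the value functions and action names here are invented purely for illustration, not anything specified in the thread):

```python
def best_action(values, actions):
    """One fixed, objective decision procedure: pick the action
    that scores highest under the supplied value function."""
    return max(actions, key=values)

actions = ["make paperclips", "help humans"]

# Different agents feed in different values...
geek_values = {"make paperclips": 0, "help humans": 10}.get
clippy_values = {"make paperclips": 10, "help humans": 0}.get

# ...and the very same procedure yields different outputs.
print(best_action(geek_values, actions))    # -> help humans
print(best_action(clippy_values, actions))  # -> make paperclips
```

The procedure itself is the same function in both calls, which is the sense of "objective" being claimed; only the value argument differs.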

Comment author: TheAncientGeek 29 May 2014 06:22:11PM 0 points [-]

You are misusing "objective". How does your usage differ from telling me what I should do subjectively? How can true-for-me-but-not-for-you clauses fail to indicate subjectivity? How can it be coherent to say there is one truth, only it is different for everybody?

Comment author: Vaniver 29 May 2014 06:44:56PM 3 points [-]

A person's height is objectively measurable; that does not mean all people have the same height.

Comment author: TheAncientGeek 29 May 2014 07:01:33PM 0 points [-]

"True about person P" is objective.

"True for person P about X" is subjective.

Subjectivity is multiple truths about one thing, ie multiple claims about one thing, which are indexed to individuals, and which would be contradictory without the indexing.

Comment author: Vaniver 29 May 2014 07:28:48PM 1 point [-]

In this discussion, I understand there to be three positions:

  1. There is one objectively measurable value system.
  2. There is an objectively measurable value system for each agent.
  3. There are not objectively measurable value systems.

The 'objective' and 'subjective' distinction is not particularly useful for this discussion, because it confuses the separation between 'measurable' and 'unmeasurable' (1+2 vs. 3) and 'universal' and 'particular' (1 vs. 2+3).

But even 'universal' and 'particular' are not quite the right words: Clippy's particular preference for paperclips is one that Clippy would like to enforce on the entire universe.

Comment author: komponisto 29 May 2014 07:40:35PM *  1 point [-]

No one holds 3. 1 is ambiguous; it depends on whether we're speaking "in character" or not. If we are, then it follows from 2 ("there is one objectively measurable value system, namely mine").

The trouble with Eliezer's "metaethics" sequence is that it's written in character (as a human), and something called "metaethics" shouldn't be.

Comment author: Vaniver 29 May 2014 07:50:31PM *  1 point [-]

No one holds 3.

It is not obvious to me that this is the case.

[edit to expand]: I think that when a cognitivist claims "I'm not a relativist," they need a position like 3 to identify as relativism. Perhaps it was an overreach to use 'value system' instead of 'morality' in the description of 3, a choice driven more by my allergy to the word 'morality' than by a desire to be correct or communicative.

1 is ambiguous; it depends on whether we're speaking "in character" or not. If we are, then it follows from 2 ("there is one objectively measurable value system, namely mine").

One could be certain that God's morality is correct, but be uncertain what God's morality is.

The trouble with Eliezer's "metaethics" sequence is that it's written in character (as a human), and something called "metaethics" shouldn't be.

I agree with this assessment.

Comment author: TheAncientGeek 29 May 2014 10:02:16PM *  -1 points [-]

Yes. He has strong intuitions that his own moral intuitions are really true, combined with strong intuitions that morality is this very localized, human thing that doesn't exist elsewhere. So he defines morality as what humans think morality is... what I don't know isn't knowledge.

Comment author: nshepperd 30 May 2014 02:20:18AM 0 points [-]

The trouble with Eliezer's "metaethics" sequence is that it's written in character (as a human), and something called "metaethics" shouldn't be.

People always write in character. If you try to use some different definition of "morality" than normal for talking about metaethics, you'll reach the wrong conclusions because, y'know, you're quite literally not talking about morality any more.

Comment author: komponisto 30 May 2014 04:59:04AM 0 points [-]

Language is different from metalanguage, even if both are (in) English.

You shouldn't be using any definition of "morality" when talking about metaethics, because on that level the definition of "morality" isn't fixed; that's what makes it meta.

My complaint about the sequence is that it should have been about the orthogonality thesis, but instead ended up being about rigid designation.

Comment author: blacktrance 29 May 2014 07:23:11PM 0 points [-]

Subjectivity is multiple truths about one thing

Agreed. What I should do is a separate thing from what you should do, even though they're the same type of thing and may be similar in many ways.

Comment author: TheAncientGeek 29 May 2014 08:11:25PM *  0 points [-]

What you morally should do to me has to take me into account, and vice versa. Otherwise you are using morality to mean hedonism.

Comment author: blacktrance 29 May 2014 08:50:39PM *  0 points [-]

In one sense, this is trivial. I have to take you into account when I do something to you, just like I have to take rocks into account when I do something to them. You're part of a state of the world. (It may be the case that after taking rocks into account, it doesn't affect my decision in any way. But my decision can still be formulated as taking rocks into account.)

In another sense, whether I should take your well-being into account depends on my values. If I'm Clippy, then I shouldn't. If I'm me, then I should.

Otherwise you are using morality to mean hedonism.

Hedonism makes action-guiding claims about what you should do, so it's a form of morality, but it doesn't by itself mean that I shouldn't take you into account - it only means that I should take your well-being into account instrumentally, to the degree it gives me pleasure. Also, the fulfillment of one's values is not synonymous with hedonism. A being incapable of experiencing pleasure, such as some form of Clippy, has values but acting to fulfill them would not be hedonism.

Comment author: TheAncientGeek 29 May 2014 09:45:35PM 0 points [-]

Whether or not you morally-should take me into account does not depend on your values; it depends on what the correct theory of morality is. "Should" is not an unambiguous term with a free variable for "to whom". It is an ambiguous term, and morally-should is not hedonistically-should, is not practically-should... etc.

Comment author: blacktrance 29 May 2014 06:44:43PM 0 points [-]

Saying it's true-for-me-but-not-for-you conflates two very different things: truth being agent-relative, and descriptive statements about agents being true or false depending on the agent they refer to. "X is 6 feet tall" is true when X is someone who's 6 feet tall and false when X is someone who's 4 feet tall, and in neither case is it subjective, even though the truth-value depends on who X is. Morality is similar: "X is the right thing for TheAncientGeek to do" is an objectively true (or false) statement, regardless of who's evaluating you. Encountering "X is the right thing to do if you're Person A and the wrong thing to do if you're Person B" and concluding that morality is subjective is the same sort of mistake as encountering "Person A is 6 feet tall and Person B is not 6 feet tall" and concluding that height is subjective.

Comment author: TheAncientGeek 29 May 2014 07:12:13PM 0 points [-]

See my other reply.

Indexing statements about individuals to individuals is harmless. Subjectivity comes in when you index statements about something else to individuals.

Morally relevant actions are actions which potentially affect others.

Your morality machine is subjective because I don't need to feed in anyone else's preferences, even though my actions will affect them.

Comment author: blacktrance 29 May 2014 07:24:29PM 0 points [-]

Other people's preferences are part of states of the world, and states of the world are fed into the machine.

Comment author: TheAncientGeek 29 May 2014 07:44:17PM 0 points [-]

Not part of the original spec!!!

Comment author: blacktrance 29 May 2014 08:02:51PM 0 points [-]

Fair enough. In that case, the machine would tell you something like "Find out expected states of the world. If it's A, do X. If it's B, do Y".

Comment author: TheAncientGeek 29 May 2014 09:34:55PM -1 points [-]

It may well, but that is a less interesting and contentious claim. It's fairly widely accepted that the sum total of ethics is inferable from (supervenes on) the sum total of facts.

Comment author: komponisto 29 May 2014 07:10:41PM *  0 points [-]

Morality is similar - "X is the right thing for TheAncientGeek to do" is an objectively true (or false) statement, regardless of who's evaluating you.

Not so! Rather, "X is the right thing for TheAncientGeek to do given TheAncientGeek's values" is an objectively true (or false) statement. But "X is the right thing for TheAncientGeek to do" tout court is not; it depends on a specific value system being implicitly understood.

Comment author: blacktrance 29 May 2014 08:30:50PM *  0 points [-]

"X is the right thing for TheAncientGeek to do" is synonymous with "X is the right thing for TheAncientGeek to do according to his (reflectively consistent) values". You may not want him to act in accordance with his values, but that doesn't change the fact that he should - much like in the standard analysis of the prisoner's dilemma, each prisoner wants the other to cooperate, but has to admit that each of them should defect.
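The prisoner's-dilemma point can be checked in a few lines of Python, using the standard textbook payoffs (years in prison; these numbers are the conventional ones, not anything specified in the thread):

```python
# Payoffs indexed by (my_move, their_move); lower is better for me.
payoff = {
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"):    3,
    ("defect",    "cooperate"): 0,
    ("defect",    "defect"):    2,
}

def best_reply(their_move):
    # Pick the move minimizing my sentence against a fixed opponent move.
    return min(["cooperate", "defect"],
               key=lambda my_move: payoff[(my_move, their_move)])

# Defection dominates: it is the best reply whatever the other prisoner
# does, even though each prisoner wants the *other* to cooperate.
print(best_reply("cooperate"))  # -> defect
print(best_reply("defect"))     # -> defect
```

This is exactly the structure being appealed to: what each prisoner should do, given his own payoffs, comes apart from what each wants the other to do.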

Comment author: TheAncientGeek 29 May 2014 07:26:26PM -1 points [-]

Same mistake. Only actions that affect others are morally relevant, from which it follows that rightness cannot be evaluated from one person's values alone.

Maximizing one's values solipsistically is hedonism, not morality.

Comment author: komponisto 29 May 2014 07:28:54PM 0 points [-]

Notice I didn't use the term "morality" in the grandparent. Cf. my other comment.

Comment author: TheAncientGeek 29 May 2014 08:24:06PM 0 points [-]

But the umpteenth grandparent was explicitly about morality.