TheAncientGeek comments on Strong moral realism, meta-ethics and pseudo-questions. - Less Wrong

18 [deleted] 31 January 2010 08:20PM




Comment author: blacktrance 29 May 2014 06:44:43PM 0 points [-]

Saying it's true-for-me-but-not-for-you conflates two very different things: truth being agent-relative, and descriptive statements about agents being true or false depending on the agent they refer to. "X is 6 feet tall" is true when X is someone who's 6 feet tall and false when X is someone who's 4 feet tall, and in neither case is it subjective, even though the truth-value depends on who X is. Morality is similar - "X is the right thing for TheAncientGeek to do" is an objectively true (or false) statement, regardless of who's evaluating you. Encountering "X is the right thing to do if you're Person A and the wrong thing to do if you're Person B" and concluding that morality is subjective is the same sort of mistake as encountering the statement "Person A is 6 feet tall and Person B is not 6 feet tall" and concluding that height is subjective.

Comment author: TheAncientGeek 29 May 2014 07:12:13PM 0 points [-]

See my other reply.

Indexing statements about individuals to individuals is harmless. Subjectivity comes in when you index statements about something else to individuals.

Morally relevant actions are actions which potentially affect others.

Your morality machine is subjective because I don't need to feed in anyone else's preferences, even though my actions will affect those people.

Comment author: blacktrance 29 May 2014 07:24:29PM 0 points [-]

Other people's preferences are part of states of the world, and states of the world are fed into the machine.

Comment author: TheAncientGeek 29 May 2014 07:44:17PM 0 points [-]

Not part of the original spec!!!

Comment author: blacktrance 29 May 2014 08:02:51PM 0 points [-]

Fair enough. In that case, the machine would tell you something like "Find out the expected states of the world. If it's A, do X. If it's B, do Y."

Comment author: TheAncientGeek 29 May 2014 09:34:55PM -1 points [-]

It may well, but that is a less interesting and contentious claim. It's fairly widely accepted that the sum total of ethics is inferable from (supervenes on) the sum total of facts.