Furcas comments on Strong moral realism, meta-ethics and pseudo-questions. - Less Wrong

18 [deleted] 31 January 2010 08:20PM


Comments (172)


Comment author: byrnema 31 January 2010 09:43:18PM *  4 points [-]

I thought there was no way I could ever understand what Eliezer had written, but you've provided a clue. Should I translate this:

Morality is about how to save babies, not eat them, everyone knows that and they happen to be right. If we could get past difficulties of the translation, the babyeaters would agree with us about what is moral, we would agree with them about what is babyeating, and we would agree about the physical fact that we find different sorts of logical facts to be compelling.

as this?

Human-morality is about how to save babies, not eat them, everyone knows that and they happen to be right. If we could get past difficulties of the translation, the babyeaters would agree with us about what is human-moral, we would agree with them about what is babyeating-moral, and we would agree about the physical fact that we find different sorts of logical facts to be compelling.

Also, what was especially perplexing, translate:

"What should be done with the universe" invokes a criterion of preference, "should", which compels humans but not Babyeaters. If you look at the fact that the Babyeaters are out trying to make a different sort of universe [...] They do the babyeating thing, we do the right thing;

as:

"What should be done with the universe" invokes a criterion of preference, "human-should", which compels humans but not Babyeaters. If you look at the fact that the Babyeaters are out trying to make a different sort of universe [...] They do the babyeating-right thing, we do the human-right thing; ?

Comment author: Furcas 31 January 2010 10:18:32PM *  3 points [-]

Should I translate this: [...] as this? [...]

Yes.

Also, what was especially perplexing, translate: [...] as: [...] ?

Yes!

Comment author: Eliezer_Yudkowsky 31 January 2010 11:14:15PM 1 point [-]

No. See other replies.

Comment author: Furcas 01 February 2010 12:05:39AM *  4 points [-]

I understand and agree with your point that the long list of terminal values that most humans share aren't the 'right' ones merely because they're the values that humans have. If Omega altered the brain of every human so that we had completely different values, 'morality' wouldn't change.

Therefore, to be perfectly precise, byrnema would have to edit her comment to substitute the long list of values that humans happen to share for the word 'human', and the long list of values that Babyeaters happen to share for the word 'babyeating'.

So yeah, I get why someone who doesn't want to create this kind of confusion in his interlocutors would avoid saying "human-right" and "human-moral". The problem is that you're creating another kind of confusion.

Comment author: byrnema 01 February 2010 12:37:40AM 1 point [-]

If Omega altered the brain of every human so that we had completely different values, 'morality' wouldn't change.

Is this because morality is reserved for a particular list (the list we currently have) rather than a token for any list that could be had?

Comment author: Furcas 01 February 2010 12:49:32AM 2 points [-]

It's because [long list of terminal values that current humans happen to share]-morality is defined by the long list of terminal values that current humans happen to share. It's not defined by the list of terminal values that post-Omega humans would happen to have.

Is arithmetic "reserved for" a particular list of axioms or for a token for any list of axioms? Neither. Arithmetic is its axioms and all that can be computed from them.

Comment author: nolrai 02 February 2010 10:22:59PM 3 points [-]

See, I think you're misunderstanding his response. I mean, that is the only way I can interpret it to make sense.

Your insistence that it is not the right interpretation is very odd. I get that you don't want to trigger people's cooperation instincts, but that's the only framework in which talking about other beings makes sense.

The morality you are talking about is the human-now-extended morality (well, closer to the less-wrong-now-extended morality), in that it is the morality that results from extending the values humans currently have. Now, you seem to have a categorization that needs to classify your own morality as different from others' in order to feel right about imposing it? So you categorize it as simply 'morality', but your morality is not necessarily my morality, and so that categorization feels iffy to me. It's certainly closer to mine than to the Babyeaters', but I have no proof it is the same. Calling it simply 'Morality' papers over this.

Comment author: Rain 09 February 2010 07:53:28PM *  1 point [-]

You're wrong. Despite how much I'd like to have a universal, ultimate, true morality, you can't create it out of whole cloth by defining it as "what-humans-value". That's pretending there's no reason to look up, because, "Look! It's right there in front of you. So be sure not to look up."