Furcas comments on Could evolution have selected for moral realism? - Less Wrong

Post author: John_Maxwell_IV 27 September 2012 04:25AM




Comment author: Furcas 27 September 2012 05:27:04PM 5 points

Most of the LWers who voted for moral realism probably believe that Eliezer's position about morality is correct, and he says that morality is subjunctively objective. It definitely fits Wikipedia's definition of moral realism:

Moral realism is the meta-ethical view which claims that:

  • Ethical sentences express propositions.
  • Some such propositions are true.
  • Those propositions are made true by objective features of the world, independent of subjective opinion.
Comment author: DanArmak 27 September 2012 10:49:35PM 4 points

To the best of my understanding, "subjunctively objective" means the same thing that "subjective" means in ordinary speech: dependent on something external, and objective once that something is specified. So Eliezer's morality is objective once you specify that it's his morality (or human morality, etc.), and then propositions about it can be true or false. "Turning a person into paperclips is wrong" is an ethical proposition that is Eliezer-true and Human-true and Paperclipper-false, and Eliezer's "subjunctively objective" view is that we should just call that "true".

I disagree with that approach because this is exactly what is called being "subjective" by most people, and so it's misleading. As if the existing confusion over philosophical word games wasn't bad enough.

Comment author: Spinning_Sandwich 30 September 2012 08:30:08PM 0 points

"Turning a person into paperclips is wrong" is an ethical proposition that is Eliezer-true and Human-true and Paperclipper-false, and Eliezer's "subjunctively objective" view is that we should just call that "true".

Despite the fact that we might have a bias toward the Human-[x] subset of moral claims, it's important to understand that such a theory does not itself favor one subset over another.

It would be like a utilitarian taking into account only his family's moral weights in any calculations, so that a moral position might be Family-true but Strangers-false. It's perfectly coherent to restrict the theory to a subset of its domain (and speaking of domains, it's a bit vacuous to talk of paperclip morality, at least to the best of my knowledge of the extent of their feelings...), but that isn't really what the theory as a whole is about.

So if we as a species were considering assimilation, and the moral evaluation of this came up Human-false but Borg-true, the theory (in principle) is perfectly well equipped to decide which would ultimately be the greater good for all parties involved. It's not simply false just because it's Human-false. (I say this, but I'm unfamiliar with Eliezer's position. If he's biased toward Human-[x] statements, I'd have to disagree.)

Comment author: Furcas 28 September 2012 01:11:49AM -2 points

I disagree with that approach because this is exactly what is called being "subjective" by most people

Those same people are badly confused, because they usually believe that if ethical propositions are "subjective", it means that the choice between them is arbitrary. This is an incoherent belief. Ethical propositions don't become objective once you specify the agent's values; they were always objective, because we can't even think about an ethical proposition without reference to some set of values. Ethical propositions and values are logically glued together, like theorems and axioms.

You could say that the concept of something being subjective is itself a confusion, and that all propositions are objective.

That said, I share your disdain for philosophical word games. Personally, I think we should do away with words like 'moral' and 'good', and instead only talk about desires and their consequences.

Comment author: Matt_Simpson 27 September 2012 06:08:51PM 2 points

This is why I voted for moral realism. If moral realism is instead supposed to mean something stronger, then I'm probably not a moral realist.

Comment author: J_Taylor 28 September 2012 02:02:34AM 1 point

The entire issue is a bit of a mess.

http://plato.stanford.edu/entries/moral-anti-realism/