Will_Newsome comments on Can the Chain Still Hold You? - Less Wrong

Post author: lukeprog 13 January 2012 01:28AM




Comment author: wedrifid 10 March 2012 08:13:57PM 2 points

I think this is a good reductio of many meta-level non-moral-realist FAI approaches like CEV. They retrospectively endorse genocide.

I've had thoughts along similar lines myself. However, I must point out that it isn't CEV that is retrospectively endorsing genocide so much as it is the hypothetical people who commit genocide, prior to having their CEV calculated, who are (evidently) endorsing it. Yes, extrapolating the volition of folks who are into genocide (that you don't approve of) is a bad idea. It is rather critical just which set of agents you plug into a CEV algorithm!
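To make the dependence on the input set concrete, here is a deliberately crude sketch in Python. It treats "extrapolation" as a bare majority vote over stated values, which is an enormous simplification of anything CEV-like; the function and data names are invented purely for illustration.

    # Toy stand-in for a CEV-like aggregation step: it simply returns the
    # value judgement held by the largest share of the agents it is given.
    # Real CEV is not specified at anything like this level of detail.
    from collections import Counter

    def extrapolate_volition(agents: list[dict]) -> str:
        """Return the majority value among the supplied agents."""
        votes = Counter(agent["values"] for agent in agents)
        return votes.most_common(1)[0][0]

    # The output depends entirely on which agents you plug in:
    perpetrators = [{"values": "genocide was justified"}] * 3
    bystanders = [{"values": "genocide was wrong"}] * 5

    print(extrapolate_volition(perpetrators))               # "genocide was justified"
    print(extrapolate_volition(perpetrators + bystanders))  # "genocide was wrong"

Even at this toy level, the aggregate simply reflects whoever was included; nothing in the procedure itself filters out values you would want filtered.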

Comment author: Will_Newsome 10 March 2012 09:10:08PM 4 points

It is rather critical just which set of agents you plug into a CEV algorithm!

I take this (very real) possibility as strongly indicating that CEV-like approaches are insufficiently meta and that we should seriously expend a lot of effort on (getting closer to) solving moral philosophy if at all possible. (Or alternatively, as Wei Dai likes to point out, solving metaphilosophy.)

Comment author: TheOtherDave 11 March 2012 12:05:46AM 1 point

Sure.

Put slightly differently: if I have some set of ethical standards S against which I'm prepared to compare the results R of a CEV-like algorithm, with the intention of discarding R where R conflicts with S, it follows that I consider whatever source I got S from to be a more reliable source of ethical judgments than CEV. And if so, that strongly suggests that if I want reliable ethical judgments, what I ought to be doing is exploring the source of S.

Conversely, if I believe a CEV-like algorithm is a more reliable source of ethical judgments than anything else I have available, then I ought to be willing to discard S where it conflicts with R.
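A minimal sketch of that trade-off, assuming the conflicting judgments can be represented as simple values; the function name and the boolean "reliability" flag are hypothetical stand-ins for a much richer comparison.

    # Minimal sketch of the decision rule described above: when the
    # CEV-style result R conflicts with prior standards S, defer to
    # whichever source you judge more reliable. Names are illustrative.

    def resolve(S: str, R: str, trust_S_more: bool) -> str:
        """Return the judgement to act on."""
        if S == R:
            return R  # no conflict, nothing to decide
        return S if trust_S_more else R

    print(resolve("don't do it", "do it", trust_S_more=True))   # "don't do it"
    print(resolve("don't do it", "do it", trust_S_more=False))  # "do it"

The point of the sketch is only that the branch is decided before R is ever computed: whichever source you are prepared to let override the other is the one you already treat as more reliable.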