shokwave comments on Two questions about CEV that worry me - Less Wrong

Post author: cousin_it 23 December 2010 03:58PM

Comment author: shokwave 25 December 2010 01:47:14PM *  1 point

For 1), the sense I got was that it assumes no progress, and furthermore that if you perform an extrapolation that pleases 21st-century Americans but would displease Archimedes or any other random Syracusan, your extrapolation-bearing AGI is going to tile the universe with American flags or episodes of Seinfeld.

For 2), it feels like a No True Scotsman issue. If by some definition of current, personal volition you exclude anything that isn't obviously a current, personal desire by deeming it insincere, then you've just made your point tautological. Do you believe it's possible in principle to sincerely desire to build an AI that implements not your volition, but the coherent extrapolated volition of humanity?

If so, you should not express any incredulity that Eliezer expresses such a desire; it may be a mutant desire, but only mutants who desire it would express it.

Comment author: Vaniver 25 December 2010 03:29:02PM 4 points

your extrapolation-bearing AGI is going to tile the universe with American flags

You're right, that would be terrible. They should be Texan flags.

I think there are other failure modes that are significant. For example, a world where women are given full moral weight and autonomy would probably be terrifying to someone whose society is centered around women being the most valuable property there is (for both men and women, I imagine; abuse leaves quite the mark on minds).

Comment author: shokwave 25 December 2010 03:52:13PM -1 points

Exactly. The desired case is one with no failure modes; CEV seems like it should logically avoid every failure mode it can, and any failure mode it cannot avoid is one that no other system could avoid either.