Matt_Simpson comments on What is Eliezer Yudkowsky's meta-ethical theory? - Less Wrong

Post author: lukeprog 29 January 2011 07:58PM


Comment author: lukeprog 30 January 2011 06:23:33PM 2 points

Dorikka,

If that's what Eliezer means, then this looks like standard practical rationality theory. You have reasons to act (preferences) so as to maximize your utility function (though "utility function" may be the wrong term, since there's no guarantee that each person's preference set is logically consistent). And because you also want other people's preferences to be satisfied, if enough other people want world-state X, your utility function will assign more utility to X than to world-state Y, even if Y comes out ahead once you set aside the term your utility function assigns to the satisfaction of other people's preferences.
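A minimal sketch of that aggregation, assuming a simple weighted-sum form (the states, weights, and numbers are all illustrative assumptions, not anything specified in the thread):

```python
# Illustrative sketch only: a weighted-sum model of an agent whose utility
# function includes a term for other people's preference satisfaction.
# All names, weights, and numbers here are made up for the example.

# Selfish utility the agent assigns to each world-state.
selfish_utility = {"X": 5.0, "Y": 8.0}  # Y is selfishly preferred

# Utilities three other people assign to each world-state.
others_utilities = {
    "X": [9.0, 8.0, 9.0],  # enough other people want X...
    "Y": [2.0, 3.0, 1.0],
}

ALTRUISM_WEIGHT = 0.5  # how much the agent weights others' preferences


def total_utility(state: str) -> float:
    """Selfish utility plus a weighted sum of others' utilities."""
    return selfish_utility[state] + ALTRUISM_WEIGHT * sum(others_utilities[state])


# Y wins on the selfish term alone (8 > 5), but once the term for
# other people's preferences is included, X comes out ahead:
assert total_utility("X") > total_utility("Y")  # 18.0 > 11.0
print(total_utility("X"), total_utility("Y"))
```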

But I don't think that's all Eliezer is saying, because, for example, he keeps stressing the significance of a thought experiment: you would be okay with being hit by an alien ray gun that changed your ice cream preference from chocolate to vanilla, but you would not be okay with being hit by one that changed your preference from not-wanting-to-rape-people to wanting-to-rape-people.

He also writes about the importance of a process of reflective equilibrium, though I'm not sure to what end.

Comment author: Matt_Simpson 30 January 2011 11:06:27PM 1 point

"He also writes about the importance of a process of reflective equilibrium, though I'm not sure to what end."

To handle value uncertainty. If you don't know your terminal values, you have to discover them somehow.

Comment author: lukeprog 30 January 2011 11:14:19PM 1 point

Is that it? Eliezer employs reflective equilibrium as an epistemological method for figuring out what your terminal values are?

Comment author: Eugine_Nier 30 January 2011 11:35:40PM 2 points

As I understand it, yes.

Comment author: Matt_Simpson 30 January 2011 11:39:56PM 0 points

Or at least how to balance among them, though there might be more to it than that.

edit: more precisely (in EY's terms), reflective equilibrium is for figuring out how to balance the various demands of morality; morality itself, as it happens, is included in your terminal values.