wedrifid comments on Gains from trade: Slug versus Galaxy - how much would I give up to control you? - Less Wrong

33 Post author: Stuart_Armstrong 23 July 2013 07:06PM




Comment author: wedrifid 22 July 2013 05:42:33PM 3 points [-]

human desires can be meaningfully mapped into something like a utility function

I don't believe this is possible in a useful way.

Do you mean not possible for humans with current tools or theoretically impossible? (It seems to me that in principle human preferences can be mapped to something like a utility function in a way that is at least useful, even if not ideal.)

Comment author: Stuart_Armstrong 22 July 2013 06:06:22PM 3 points [-]

That's a whole conversation! I probably shouldn't start talking about this, since I don't have the time to do it justice.

In the main, I feel that humans are not easily modelled by a utility function, and that we have meta-preferences that cause us to hate facing the kind of trade-offs that utility functions imply. I'd bet most people would pay to not have their preferences replaced with a utility function, no matter how well defined it was.

But that's a conversation for after the baby!