ata comments on What is Eliezer Yudkowsky's meta-ethical theory? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (368)
Your preferences are a utility function if they're consistent, but if you're a human, they aren't.
Consistent in what sense? A utility function over what domain? Under what prior? In this context these are unjustified assumptions, though they are traditional enough that objecting to them feels strange.
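One standard reading of "consistent" here (an assumption; the reply above is asking exactly which sense is meant) is the ordinal one: over a finite set of outcomes, a strict preference relation is representable by a utility function only if it is acyclic. A minimal brute-force sketch, with hypothetical names, showing that a transitive preference admits a representing utility while a money-pump-style cycle does not:

```python
from itertools import permutations

def representable(items, prefers):
    """Return a utility dict u with u[x] > u[y] whenever prefers(x, y),
    or None if no such ordinal utility function exists."""
    for order in permutations(items):
        u = {x: i for i, x in enumerate(order)}  # candidate ranking
        if all(u[x] > u[y] for x in items for y in items if prefers(x, y)):
            return u
    return None

transitive = {("A", "B"), ("B", "C"), ("A", "C")}   # consistent
cyclic = {("A", "B"), ("B", "C"), ("C", "A")}       # A > B > C > A

print(representable("ABC", lambda x, y: (x, y) in transitive))  # a utility dict
print(representable("ABC", lambda x, y: (x, y) in cyclic))      # None
```

This only covers the ordinal case over a fixed finite domain; the fuller claim about preferences over lotteries is the von Neumann-Morgenstern theorem, which needs the additional continuity and independence axioms.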