Nornagest comments on Truth vs Utility - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Yeah, granting that premise, and given that maximizing utility may very well involve telling you stuff, option 2 seems to imply one of the following:
The second of these possibilities seems the most compelling; we aren't Friendly in a strong sense. Depending on Omega's idea of your utility function, you can argue that maximizing it would be a disaster from a more general perspective: either because your utility function is hopelessly parochial and likely to need modification once we better understand metaethics and fun theory, or because you aren't really all that ethical at whatever level Omega's going to be looking at. The latter is almost certainly true, and the former at least seems plausible.