Nornagest comments on Truth vs Utility - Less Wrong

1 Post author: Qwake 13 August 2014 05:45AM


Comment author: Nornagest 20 August 2014 07:37:41PM

Yeah, granting that premise, and given that maximizing your utility may very well involve telling you things, option 2 seems to imply one of the following:

  • you don't trust Omega
  • you don't trust your utility function
  • you have objections (other than trust) to accepting direct help from an alien supercomputer

The second of these possibilities seems the most compelling; we aren't Friendly in a strong sense. Depending on Omega's idea of your utility function, you can make an argument that maximizing it would be a disaster from a more general perspective: either because you think your utility function is hopelessly parochial and likely to need modification once we better understand metaethics and fun theory, or because you don't think you're really all that ethical at whatever level Omega is going to be looking at. The latter is almost certainly true, and the former at least seems plausible.