MugaSofer comments on Rationality Quotes January 2013 - Less Wrong

6 Post author: katydee 02 January 2013 05:23PM




Comment author: Kawoomba 15 January 2013 06:30:33PM 2 points

There is a certain probability I ascribe to the belief that god always tells the truth; let's say it is very high.

I also ascribe a certain probability to the belief that CEV_(Kawoomba) contains such a command. This probability is negligible because (by the definition) such a command certainly doesn't fit with "were more the [man] [I] wished [I] were".

However, we can lay that argument (weighing a high probability against a very low one) aside, because there is a more important one:

The point is that my values are not CEV_(Kawoomba). That is a concept it may make sense to feed an AI with, or even to personally aspire to, but it is not self-evidently one we should unequivocally aspire to. In a conflict between my current values and some "optimized" (in whatever sense) values that I do not currently hold but that may be derived from my current ones, guess which win out? (My current ones.)

That aside, there is no way that the very foundation of my values could be turned topsy-turvy and still fit with CEV's mandate of "being the person I want to be".