
Ixiel comments on Open thread, 7-14 July 2014 - Less Wrong Discussion

Post author: David_Gerard 07 July 2014 07:14AM




Comment author: ike 13 July 2014 08:24:32PM 4 points

I spoke with someone recently who asserted that they would prefer a 100% chance of getting a dollar to a 99% chance of getting $1,000,000. Now, I don't think they would actually act on this if the situation were real, i.e. if they had $1,000,000 and there were a 1 in 100 chance of losing it, they wouldn't pay someone $999,999 to eliminate that risk and thereby guarantee themselves the $1, but they think they would. I'm interested in what could cause someone to think that. I actually have a little more information from asking a few follow-up questions, but I'd like to see what others think without knowing the answer.
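To make the size of the gap concrete, here is a quick expected-value comparison of the two lotteries (a sketch in Python; this only shows the arithmetic, since a real agent's choice depends on their utility function, not raw expected dollars):

```python
def expected_value(prob, payoff):
    """Expected monetary value of a lottery paying `payoff` with probability `prob`."""
    return prob * payoff

# Option A: the sure dollar.
sure_dollar = expected_value(1.00, 1)            # $1
# Option B: 99% chance of a million dollars.
risky_million = expected_value(0.99, 1_000_000)  # $990,000

# The risky option's expected value exceeds the sure dollar's
# by a factor of nearly a million.
print(sure_dollar, risky_million)
```

Even an extremely risk-averse utility function would have to be strange to prefer option A, which is why the stated preference is puzzling.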

My own thoughts: This may be related to the Allais paradox. It also trivially implies two-boxing in Newcomb's problem.

Some more questions raised:

What arguments might I make to change this person's mind?

Would it be ethical, if I had to make this choice for them, to choose the $1,000,000? What about an AI making choices for a human with this utility function?

Comment author: Ixiel 14 July 2014 11:31:43AM 0 points

Writing it backward, I think you just did.

As for the ethics: if you were already in a position where you HAD to make the decision, you should do what you think is right regardless of any of their prior opinions. If, however, you merely had the opportunity to override them, I think you should limit yourself to persuading as many of them as you can, but not override them for their own benefit.