Lumifer comments on Open Thread, Apr. 20 - Apr. 26, 2015 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I've come up with an interesting thought experiment I call oracle mugging.
An oracle comes up to you and tells you that either you will give them a thousand dollars or you will die in the next week. They refuse to tell you which. They have done this many times, and everyone has either given them money or died. The oracle isn't threatening you. They just go around and find people who will either give them money or die in the near future, and tell them that.
Should you pay the oracle? Why or why not?
It's just a version of Newcomb's problem with negative outcomes instead of positive ones.
Presumably the oracle makes its offer only to people from two classes: (1) those who will die next week AND will not pay $1000; and (2) those who will pay $1000 AND will not die next week. Since it's an oracle, it can identify these people and make its offer only to them. If you got this offer, you are in one of the above classes, but you don't know which.
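The two-class selection can be sketched with a small simulation (all numbers and names here are hypothetical, just to illustrate the structure): disposition to pay and fate are assigned independently across the population, yet among the people the oracle approaches, paying coincides perfectly with surviving.

```python
import random

random.seed(0)

# Hypothetical population: disposition to pay and fate are independent.
population = [
    {"pays": random.random() < 0.5, "dies": random.random() < 0.01}
    for _ in range(100_000)
]

# The oracle approaches only the two classes described above:
# (1) won't pay AND will die, or (2) will pay AND won't die.
approached = [
    p for p in population
    if (not p["pays"] and p["dies"]) or (p["pays"] and not p["dies"])
]

# Among those approached, paying perfectly predicts surviving,
# even though fate was assigned independently of disposition.
payers = [p for p in approached if p["pays"]]
nonpayers = [p for p in approached if not p["pays"]]
print(all(not p["dies"] for p in payers))   # True: every payer survives
print(all(p["dies"] for p in nonpayers))    # True: every non-payer dies
```

The correlation is an artifact of the oracle's selection rule, not of any causal link between paying and surviving — which is exactly why the case resembles Newcomb's problem.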