Jonathan_Graehl comments on Rationality is Systematized Winning - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (252)
Given that I one-box on Newcomb's Problem and keep my word as Parfit's Hitchhiker, it would seem that the rational course of action is not to steer your car even if it crashes (if, for some reason, winning that game of chicken is the most important thing in the universe).
For Newcomb's Problem, is it fair to say that, if you believe the given information, the crux is whether you believe it's possible for Omega to predict your decision with 99%+ accuracy from the givens? Refusing to accept that seems to me the only justification for two-boxing. Perhaps that's a sign that I'm less tied to a fixed set of "rationalist" procedures than a perfect rationalist would be, but I would feel as if I were pretending if I said otherwise.
I also wonder whether the many public affirmations I've heard of "I would one-box on Newcomb's Problem" are attempts to convince Omega to believe us in the unlikely event that we actually encounter the Problem. It gives a thrill similar to "God will rapture me to heaven."