shminux comments on Is Omega Impossible? Can we even ask? - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
If you allow arbitrarily high but not 100%-accurate predictions (as EY is fond of repeating, 100% is not a probability), the original Newcomb's problem can be defined as the limit as prediction accuracy goes to 100%. As noted in other comments, the "winning" answer is insensitive to the prediction accuracy once it exceeds the break-even point just above 50% ((1,000,000 + 1,000)/(2 × 1,000,000) = 0.5005, to be precise, given the standard $1,000,000 and $1,000 payoffs), so the limiting case must have the same answer.
Damn good point, thanks. That certainly answers my concern about Newcomb's problem.