dxu comments on Conceptual Analysis and Moral Theory - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
You get $1000 with 99% probability and $1,001,000 with 1% probability, for an expected value of $11,000. A one-boxer gets $1,000,000 with 99% probability and $0 with 1% probability, for an expected value of $990,000. Even with probabilistic uncertainty, you would still have been comparatively better off one-boxing. And this isn't just limited to high accuracies; theoretically, any predictive power better than chance can create a Newcomb-like situation.
In practice, this tends to go away at lower predictive accuracies because the relative rewards aren't high enough to justify one-boxing. Nevertheless, I have little trouble believing that a skilled human predictor could reach accuracies above 80%, in which case these Newcomb-like dynamics are indeed present.
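The arithmetic above can be checked with a short sketch. This assumes the standard payoffs implied by the comment ($1000 in the small box, $1,000,000 in the big box) and a predictor of accuracy p; the function names are mine, not from the original discussion.

```python
# Expected values for two-boxing vs. one-boxing against a predictor
# of accuracy p. Payoffs: $1000 (small box), $1,000,000 (big box).

def two_box_ev(p):
    # With probability p the predictor foresaw two-boxing, so the
    # big box is empty: you get only $1000. With probability (1 - p)
    # the predictor was wrong and you get both boxes.
    return p * 1_000 + (1 - p) * 1_001_000

def one_box_ev(p):
    # With probability p the predictor foresaw one-boxing, so the
    # big box is full; otherwise you walk away with nothing.
    return p * 1_000_000 + (1 - p) * 0

print(round(two_box_ev(0.99)))   # 11000
print(round(one_box_ev(0.99)))   # 990000

# Break-even accuracy: one-boxing wins once
#   p * 1_000_000 > p * 1_000 + (1 - p) * 1_001_000,
# i.e. p > 1_001_000 / 2_000_000 = 0.5005 -- barely better than chance,
# which is the "any predictive power better than chance" point above.
print(1_001_000 / 2_000_000)     # 0.5005
```

With these payoffs the break-even accuracy sits at 0.5005, so even a weakly-better-than-chance predictor makes one-boxing the higher-EV choice; at the 80% accuracy mentioned above, one-boxing wins by a wide margin.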