dxu comments on Conceptual Analysis and Moral Theory - Less Wrong

60 Post author: lukeprog 16 May 2011 06:28AM




Comment author: dxu 19 November 2014 02:52:18AM · 2 points

You get $1,000 with 99% probability and $1,001,000 with 1% probability, for a final expected value of $11,000. A one-boxer gets $1,000,000 with 99% probability and $0 with 1% probability, for a final expected value of $990,000. Even with probabilistic uncertainty, you would still have been comparatively better off one-boxing. And this isn't limited to high accuracies: with these payoffs, any predictive accuracy meaningfully better than chance makes one-boxing the higher-expected-value choice.
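The arithmetic above can be checked directly. A minimal sketch, assuming the standard Newcomb payoffs implied by the comment (the transparent box always holds $1,000; the opaque box holds $1,000,000 iff the predictor foresaw one-boxing) and a predictor accuracy `p`; the function names are illustrative, not from the original:

```python
def ev_two_box(p):
    # With probability p the predictor correctly foresaw two-boxing,
    # so the opaque box is empty and the two-boxer gets only $1,000;
    # with probability 1 - p the prediction was wrong and both boxes pay.
    return p * 1_000 + (1 - p) * 1_001_000

def ev_one_box(p):
    # With probability p the predictor correctly foresaw one-boxing,
    # so the opaque box holds $1,000,000; otherwise the one-boxer gets $0.
    return p * 1_000_000 + (1 - p) * 0

print(round(ev_two_box(0.99)))  # 11000
print(round(ev_one_box(0.99)))  # 990000
```

At 99% accuracy the one-boxer's expected value exceeds the two-boxer's by a factor of 90.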

In practice, this tends to go away with lower predictive accuracies because the relative rewards aren't high enough to justify one-boxing. Nevertheless, I have little to no trouble believing that a skilled human predictor can reach accuracies of >80%, in which case these Newcomb-like tendencies are indeed present.
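The break-even point can also be made exact. A sketch under the same assumed payoffs as above: one-boxing wins whenever p × $1,000,000 > p × $1,000 + (1 − p) × $1,001,000, which solves to p > 0.5005, so even barely-better-than-chance prediction favors one-boxing here, and 80% accuracy is far past the threshold:

```python
# Solve p * 1_000_000 = p * 1_000 + (1 - p) * 1_001_000 for p:
# p * (1_000_000 - 1_000 + 1_001_000) = 1_001_000
break_even = 1_001_000 / 2_000_000
print(break_even)  # 0.5005

# At the 80% accuracy suggested for a skilled human predictor:
p = 0.80
ev_one = p * 1_000_000                    # one-boxer's expected value
ev_two = p * 1_000 + (1 - p) * 1_001_000  # two-boxer's expected value
print(round(ev_one), round(ev_two))  # 800000 201000
```

Lower accuracies only flip the answer if the relative rewards shrink with them, which is the point made above.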