Vladimir_Nesov comments on Timeless Decision Theory: Problems I Can't Solve - Less Wrong

39 Post author: Eliezer_Yudkowsky 20 July 2009 12:02AM



Comment author: orthonormal 20 July 2009 05:47:04AM 0 points

What you just described is group selection, and thus highly unlikely.

It's to your individual benefit to be more (unconsciously) selfish and calculating in these situations, whether the other people in your group have a fairness drive or not.

Comment author: Vladimir_Nesov 20 July 2009 11:40:03AM 2 points

> It's to your individual benefit to be more (unconsciously) selfish and calculating in these situations, whether the other people in your group have a fairness drive or not.

Not if you are punished for selfishness. I'm not sure how reasonable the following analysis is (since I didn't study this kind of thing at all); it suggests that fairness is a stable strategy, and given some constraints a more feasible one than selfishness:

M. A. Nowak, et al. (2000). "Fairness versus reason in the ultimatum game." Science 289(5485):1773–1775. (PDF)
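A toy version of the cited model can make the claim concrete. The sketch below is my own simplification, not the paper's actual dynamics: each agent's strategy is a pair (p, q) — offer fraction p of the pie as proposer, accept any offer of at least q as responder — and in the "reputation" condition proposers are simply assumed to know the responder's threshold and to offer exactly that much. All function names and parameter choices here are illustrative.

```python
import random


def clip(x):
    """Keep a strategy value inside [0, 1]."""
    return min(1.0, max(0.0, x))


def play(proposer, responder, informed=False):
    """One ultimatum game over a pie of size 1.

    A strategy is (p, q): offer fraction p as proposer; as responder,
    accept any offer >= q. If `informed`, the proposer knows the
    responder's threshold q and offers exactly that (the reputation
    condition, crudely modeled). Returns (proposer payoff, responder
    payoff); a rejected offer pays both players nothing.
    """
    p, _ = proposer
    _, rq = responder
    offer = rq if informed else p
    if offer >= rq:
        return 1.0 - offer, offer
    return 0.0, 0.0


def evolve(pop, generations=200, informed=True, mu=0.02, seed=0):
    """Round-robin play, then fitness-proportional reproduction with
    small Gaussian mutation on both strategy components."""
    rng = random.Random(seed)
    n = len(pop)
    for _ in range(generations):
        payoff = [0.0] * n
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                a, b = play(pop[i], pop[j], informed)
                payoff[i] += a
                payoff[j] += b
        # If every payoff is zero, fall back to uniform reproduction.
        weights = payoff if sum(payoff) > 0 else [1.0] * n
        parents = rng.choices(pop, weights=weights, k=n)
        pop = [(clip(p + rng.gauss(0, mu)), clip(q + rng.gauss(0, mu)))
               for p, q in parents]
    return pop
```

In this toy, a responder's threshold q directly sets what informed proposers pay her, so a high threshold is selected for once it is visible; without information, a high threshold just means rejected offers and zero payoff. That is the sense in which punishing low offers stops being a losing move when reputation is in play.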

Comment author: orthonormal 20 July 2009 05:28:45PM 0 points

See reply to Tim Tyler.