SforSingularity comments on Optimal Strategies for Reducing Existential Risk - Less Wrong

Post author: FrankAdamek 31 August 2009 03:52PM




Comment author: FrankAdamek 04 September 2009 03:49:53PM 1 point

Offhand I'd think it "very small": it requires both a future in which people (or their continuations) are still around, and that significant power is held by a group (however large) that thinks we should be rewarded and punished as such, and/or that has successfully precommitted to do so.

Comment author: SforSingularity 04 September 2009 04:37:16PM 0 points

How about the consideration that, out of all good futures that suffer from a tragedy-of-the-commons type problem, those that implement reward/punishment precommitments are more likely to overcome the free-rider problem and actually work? Doesn't this push the probability up somewhat?