PhilGoetz comments on Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips - Less Wrong

Post author: Kevin, 22 July 2010 10:03AM


Comment author: PhilGoetz, 28 July 2010 08:33:03PM, 1 point

The goal of developing FAI is to reduce the existential threat it poses to near 0% by mathematically proving the stability and desirability of its preferences. That's fine, but it reminds me of zero-risk bias.

Excellent point. The goal of FAI should be to increase expected value, not to minimize risk.
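
As an illustration of the distinction, here is a minimal sketch of the expected-value comparison behind zero-risk bias. All numbers and names here are hypothetical, chosen only to show how a zero-risk preference and an expected-value maximizer can disagree; they are not drawn from the original discussion.

```python
# Toy illustration of zero-risk bias (all numbers are hypothetical).
# Option A: eliminate the risk entirely, at a large cost in payoff.
# Option B: accept a small residual risk in exchange for a much larger payoff.

def expected_value(p_catastrophe, value_if_ok, value_if_catastrophe=0.0):
    """Expected value of an option with a given probability of catastrophe."""
    return (1 - p_catastrophe) * value_if_ok + p_catastrophe * value_if_catastrophe

option_a = expected_value(p_catastrophe=0.00, value_if_ok=100)   # zero risk, modest payoff
option_b = expected_value(p_catastrophe=0.01, value_if_ok=1000)  # 1% risk, large payoff

print(f"Option A (zero risk): {option_a}")   # 100.0
print(f"Option B (1% risk):   {option_b}")   # 990.0
# A zero-risk preference picks A; an expected-value maximizer picks B.
```

Under these made-up numbers, minimizing risk and maximizing expected value point in opposite directions, which is the gap the comment is pointing at.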