PhilGoetz comments on Simplified Humanism, Positive Futurism & How to Prevent the Universe From Being Turned Into Paper Clips - Less Wrong
Comments (43)
The goal of FAI development is to reduce its existential threat to near 0% by mathematically proving the stability and desirability of its preferences. That's fine, but it reminds me of zero-risk bias.
How much do you think designing and recommending a containment system for AGIs would lower existential risk? Compare with condoms.
Excellent point. The goal of FAI should be to increase expected value, not to minimize risk.
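To illustrate the distinction with made-up numbers: a policy that drives risk all the way to zero can still be worse in expectation than one that accepts a small residual risk, if the latter preserves more value. The probabilities and payoffs in the sketch below are invented purely to show the arithmetic behind zero-risk bias, not estimates of anything real.

```python
# Hypothetical illustration of zero-risk bias; all numbers are invented.
# Policy A eliminates catastrophe risk entirely but forgoes most of the upside.
# Policy B accepts a small residual risk in exchange for a larger payoff.

def expected_value(p_catastrophe: float, value_if_ok: float,
                   value_if_catastrophe: float = 0.0) -> float:
    """Expected value of a policy given its catastrophe probability and payoffs."""
    return (1 - p_catastrophe) * value_if_ok + p_catastrophe * value_if_catastrophe

policy_a = expected_value(p_catastrophe=0.00, value_if_ok=60)   # zero risk, modest payoff
policy_b = expected_value(p_catastrophe=0.01, value_if_ok=100)  # 1% risk, larger payoff

print(f"Policy A (zero risk):  {policy_a}")   # 60.0
print(f"Policy B (small risk): {policy_b}")   # 99.0
```

Under these assumed payoffs, the small-risk policy dominates in expectation, which is the sense in which minimizing risk and maximizing expected value can come apart.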