MugaSofer comments on Snowdenizing UFAI - Less Wrong Discussion

Post author: JoshuaFox | 05 December 2013 02:42PM

Comment author: MugaSofer | 09 December 2013 05:34:33PM | 1 point

> You estimate the government might press ahead even with 9% probability of extinction. If every competing government takes on a different risk of this magnitude - perhaps a risk of their own personal failure that is really independent of competitors, as with the risk of releasing an AI that turns out to be Unfriendly - then with 10 such projects we have 90% total probability of the extinction of all life.

Um, hypothetically, once the first AGI is released (Friendly or not), it isn't going to give the next group a go.

Only the odds on the first one to be released matter, so the individual risks can't stack up to a 90% total.
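As a rough illustration of why those numbers don't stack, here's a minimal sketch using the 9% figure and the ten projects from the quoted comment (the variable names and the independence assumption are mine, purely illustrative):

```python
# Minimal sketch of the competing risk models, using the quoted figures:
# a hypothetical 9% per-project chance of releasing an Unfriendly AI,
# across 10 competing projects.
p_fail = 0.09
n_projects = 10

# The quoted comment's implicit "risks add up" model: 10 * 9% = 90%.
naive_total = n_projects * p_fail                   # 0.90

# Even treating all ten as independent trials that each run to completion,
# the combined probability is 1 - (1 - p)^n, not n * p.
independent_total = 1 - (1 - p_fail) ** n_projects  # ~0.61

# If only the first project to release matters, the total risk is simply
# that one project's own failure probability.
first_release_total = p_fail                        # 0.09

print(naive_total, round(independent_total, 2), first_release_total)
```

Either way, the total risk is governed by whichever project actually releases first, not by summing over every project that tried.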

</nitpick>

With that said, you're right that it would be a good thing for governments to take existential risks seriously - just as it would be a good thing for pretty much everyone else to take them seriously, ya?