
Hans comments on Failed Utopia #4-2 - Less Wrong

Post author: Eliezer_Yudkowsky, 21 January 2009 11:04AM



Comment author: Hans, 21 January 2009 04:16:21PM (0 points)

I really hope (perhaps in vain) that humankind will be able to colonize other planets before such a singularity arrives. A central theme of Frank Herbert's later Dune books is that a Scattering of humanity throughout space is needed, so that no single event can cause the extinction of humanity. An AI that screws up (such as this one) would be such an event.

Comment author: Salivanth, 16 April 2012 04:38:47PM (5 points)

What makes you think a self-improving super-intelligence gone wrong will be restricted to a single planet?