Tim_Tyler comments on Singletons Rule OK - Less Wrong

Post author: Eliezer_Yudkowsky 30 November 2008 04:45PM


Comment author: Tim_Tyler 01 December 2008 07:24:50PM 3 points

AI projects that say they plan to have their AI take over the world could induce serious and harmful conflict

So, is this better or worse than the eternal struggle you propose? Superintelligent agents nuking it out on the planet in a struggle for the future may not be fun - and yet your proposals seem to promote and prolong that stage, rather than getting it over with as quickly as possible. Your proposal comes off looking much worse in some respects - e.g. if you compare the total number of casualties. Are you sure it is any better? If so, what makes you think that?