Wei_Dai comments on Some Thoughts on Singularity Strategies - Less Wrong

25 Post author: Wei_Dai 13 July 2011 02:41AM

Comment author: Wei_Dai 13 July 2011 05:44:49AM 6 points

If AGI will take longer than 100 years to become possible, "AI first" isn't a relevant strategic option, since an upload- or IA-driven Singularity will probably occur within that time frame even without any specific push from Singularitarians. So it seems reasonable to set a time horizon of 100 years at most.

Comment author: GuySrinivasan 13 July 2011 06:26:25AM 4 points

Ah, okay, so we're talking about a "humans seem just barely smart enough to build a superintelligent UFAI within the next 100 years" intuition. Talking about that makes sense, and that intuition feels much more plausible to me.

Comment author: Vladimir_Nesov 13 July 2011 05:35:02PM 2 points

I'd give it 150 years. Civilization might get a setback, actual implementation of fast-running uploads might be harder than it looks, and intelligence improvement might take too long to become an important force. Plans can fail.