
lessdazed comments on Intelligence Explosion analysis draft: introduction - Less Wrong Discussion

Post author: lukeprog, 14 November 2011 09:50AM



Comment author: lessdazed, 15 November 2011 05:37:07PM, 1 point

or most future scenarios

...most plausible future scenarios? I would take out "or most."

Shortly thereafter, we may see an “intelligence explosion” or “technological Singularity” — a chain of events by which human-level AI leads, fairly rapidly, to intelligent systems whose capabilities far surpass those of biological humanity as a whole (Chalmers 2010)...Finally, we discuss the possible consequences of an intelligence explosion and which actions we can take now to influence those results.

Is the idea of a "technological Singularity" anything more than a combination of predictions about technology and predictions about its social and political effects? An intelligence explosion could be followed by very little change, if, for example, all human-created AIs tended to become the equivalent of ascetic monks. That being so, I would start with the technological claims and make them the focus by not emphasizing the "Singularity" aspect, a Singularity being a situation after which the future will be very different from what came before.