
bsterrett comments on Permitted Possibilities, & Locality

Post author: Eliezer_Yudkowsky, 03 December 2008 09:20PM


Comment author: bsterrett, 26 October 2012 02:55:37PM

I recently read the Wikipedia article on criticality accidents, and it seems relevant here: "A criticality accident, sometimes referred to as an excursion or a power excursion, is the unintentional assembly of a critical mass of a given fissile material, such as enriched uranium or plutonium, in an unprotected environment."

Assuming Eliezer's analysis is correct, we cannot afford even one such accident in the domain of self-improving AI. Thankfully, it's harder to accidentally create a self-improving AI than it is to drop a brick in the wrong place at the wrong time.