gwern comments on How inevitable was modern human civilization - data - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
'Singleton', as I've seen it used, seems to be one possible Singularity in which a single AI absorbs everyone and everything into itself as a single colossal entity. We'd probably consider it a Bad Ending.
See Nick Bostrom (2005). What is a Singleton?
A singleton is a more general concept than an intelligence explosion. The specific case of a benevolent AGI singleton, a.k.a. FAI, is not a bad ending. Think of it as Nature 2.0, a supervised universe, rather than a dictator.
I stand corrected! Maybe this should be a wiki article - the term isn't that common, but it's awfully hard to google.
Done.