Eliezer_Yudkowsky comments on A Less Wrong singularity article? - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (210)
I agree with you - and I think the SIAI focuses too much on possible future computer programs, and neglects the (limited) superintelligences that already exist, various amalgams and cyborgs and group minds coordinated with sonic telepathy.
In a future where the world continues (that is, isn't paperclipped) and no singleton emerges, we need to think about how to deal with superintelligences. By "deal with" I include prevailing over superintelligences, rather than throwing up our hands and saying "it's smarter than me".
throws up hands
Not every challenge is winnable, you know.
Impossible?
Are you saying a human can't beat a group mind, or are you and Johnicholas using different meanings of "superintelligence"?
Also, what if we're in an FAI without a nonperson predicate?