XiXiDu comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

Post author: ciphergoth 30 October 2010 09:31AM


Comment author: XiXiDu 30 October 2010 12:18:45PM

Back in July I wrote this as a response to Hughes' comment:

Keep your friends close... maybe they just want to keep the AI crowd as close together as possible. Making enemies wouldn't be a smart idea either, as the 'K-type S^' subgroup would likely retreat from further information disclosure. Making friends with them might be the best idea.

One explanation for the rather calm stance regarding a potential giga-death or living-hell event would be that they are keeping a low profile until they acquire more power.

I'm aware of that argument, and of the other points you mentioned, and I don't think they are reasonable. I've written about this before but deleted my comments, as they might be very damaging to the SIAI. I'll just say that there is no argument against active measures if you seriously believe that certain people or companies pose existential risks. Hughes' comment merely highlights an important observation; that doesn't mean I endorse the details.

Regarding Al Gore: what this highlights is that what the SIAI says and does is as misleading as what Al Gore does. That doesn't mean it is irrational, but it explains why people draw conclusions like the one Hughes drew from this superficially contradictory behavior.