cousin_it comments on The metaphor/myth of general intelligence - Less Wrong Discussion

11 points · Post author: Stuart_Armstrong 18 August 2014 04:04PM

Comment author: cousin_it 19 August 2014 03:34:44PM * 2 points

Yeah, I agree that a truly general intelligence would be the most powerful thing, if it could exist. But that doesn't mean it's the main thing to worry about: non-general intelligences can be powerful enough to kill everyone, and higher degrees of power beyond that probably don't matter as much.

For example, fast uploads aren't general by your definition, because they're only good at the same things that humans are good at, but that's enough to be dangerous. And even a narrow tool AI can be dangerous, if the domain is something like designing weapons or viruses or nanotech. Sure, a tool AI is only dangerous in the wrong hands, but it will fall into wrong hands eventually, if something like FAI doesn't happen first.

Comment author: Stuart_Armstrong 20 August 2014 10:48:59AM 1 point

We seem to have drifted into agreement.