timtyler comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

32 Post author: ciphergoth 30 October 2010 09:31AM


Comment author: Perplexed 30 October 2010 01:50:54PM 5 points [-]

Good article. Thx for posting. I agree with much of it, but ...

Goertzel writes:

I do see a real risk that, if we proceed in the manner I'm advocating, some nasty people will take the early-stage AGIs and either use them for bad ends, or proceed to hastily create a superhuman AGI that then does bad things of its own volition. These are real risks that must be thought about hard, and protected against as necessary. But they are different from the Scary Idea.

Is this really different from the Scary Idea?

I've always thought of this as part of the Scary Idea — in fact, the reason the Scary Idea is scary, scarier than nuclear weapons. Because when mankind reaches the abyss and looks with dismay at the prospect that lies ahead, we all know there will be at least one idiot among us who doesn't draw back from the abyss, but instead continues forward down the slippery slope.

At the nuclear abyss, that idiot will probably kill a few hundred million of us. No big deal. But at the uFAI abyss, we may have a serious problem.

Comment author: timtyler 30 October 2010 02:50:27PM *  3 points [-]

The "uFAI abyss"? Does that have something to do with the possibility of a small group of "idiots" - who were nonetheless smart enough to beat everyone else to machine intelligence - overthrowing the world's governments?