ciphergoth comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
The nearest thing to such a link is Artificial Intelligence as a Positive and Negative Factor in Global Risk [PDF].
But of course the argument is a little too large to set out entirely in one paper; the next nearest thing is What I Think, If Not Why, and the title shows in what way that's not what Goertzel was looking for.
44 pages. I don't see anything much like the argument being asked for. The lack of an index doesn't help. The nearest thing I could find was this:
He also claims that intelligence could increase rapidly with a "dominant" probability.
This all seems pretty vague to me.