
TheOtherDave comments on Why an Intelligence Explosion might be a Low-Priority Global Risk - Less Wrong Discussion

Post author: XiXiDu 14 November 2011 11:40AM




Comment author: TheOtherDave 14 November 2011 09:18:49PM 1 point

I don't have a clear sense of how dangerous a group of amoral fast-thinking miniature Isaac Newtons might be, but it would surprise me if a particularly important risk-evaluation threshold were crossed between 70 million amoral fast-thinking miniature Isaac Newtons and a mere, say, 700,000 of them.

Admittedly, I may be being distracted by the image of hundreds of thousands of miniature Isaac Newtons descending on Washington DC or something. It's a far more entertaining idea than those interminable zombie stories.

Comment author: TimS 14 November 2011 09:58:27PM 0 points

You are right that 1% of the world population is likely too large. I probably should have said "substantial numbers in existence." I've adjusted my estimate: amoral Newtons don't worry me unless they are secret or exist in substantial numbers (>1,000). And the minimum number gets bigger unless there is reason to think amoral Newtons would cooperate amongst themselves to dominate humanity.

Comment author: Logos01 15 November 2011 03:27:28AM 2 points

I don't think the numbers I was referencing quite came across to you.

I was postulating humans:AGIs :: 1:10,000

So not 70,000 Newtons or 70 million Newtons -- 70,000 billion Newtons.
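The arithmetic behind that figure can be checked directly; this sketch assumes the roughly 7 billion world population of 2011:

```python
# Sanity check of the 1:10,000 humans-to-AGIs ratio postulated above.
humans = 7_000_000_000          # approximate world population, 2011 (assumption)
agis_per_human = 10_000         # the 1:10,000 ratio from the comment
agis = humans * agis_per_human
print(agis)                     # 70,000,000,000,000 -- i.e. 70,000 billion
```

That is 70 trillion Newtons, five to eight orders of magnitude beyond the 700,000 and 70 million figures discussed earlier in the thread.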