Tordmor comments on Survey: Risks from AI - Less Wrong Discussion

Post author: XiXiDu 13 June 2011 01:05PM


Comment author: [deleted] 13 June 2011 06:59:24PM 0 points
  1. 2030 / 2050 / never (I assign around 10% to the possibility that not enough people will ever want it enough to pull it off)
  2. 20% / 5%
  3. negligible / 1% / 20%
  4. Don't care. I think this question will be raised throughout the AI community soon enough, should it become relevant.
  5. Don't think so. There are other doomsday scenarios, both human-made and natural, with probabilities in the same ballpark.
  6. No. I guess computers will have human-level intelligence, but not human-like intelligence, before we recognize it as such.