
Konkvistador comments on Q&A with experts on risks from AI #3 - Less Wrong Discussion

13 Post author: XiXiDu 12 January 2012 10:45AM


Comment author: [deleted] 12 January 2012 03:59:10PM *  6 points [-]

i) Global warming. While not as urgent or sexy as AI-run-amok, I think it a far more important issue for humankind.

Reading these letters so far, the experts very often make such statements. I think that either they systematically overestimate the likely risk of global warming in itself, which wouldn't be too surprising for a politicized issue (in the US at least), or they feel the need to play it up.

Comment author: jhuffman 12 January 2012 05:40:09PM 2 points [-]

I think a lot of people make this mistake: treating "very bad things" as equivalently bad to extinction, or even as extinction itself. It is unlikely that large-scale nuclear war would extinguish the species; it is far beyond unlikely that global warming would extinguish humans; and it is extremely unlikely that large-scale use of biological weapons by terrorists or states would extinguish humanity. But because we know for a fact that these things could happen, and have even come close to happening or are beginning to happen, it is just not really possible for most people to keep enough perspective to recognize that things not likely to happen soon, but that will eventually be possible, are actually much more dangerous in terms of capability for extinction.

Comment author: jsteinhardt 16 January 2012 05:07:38PM 2 points [-]

Or some people place high negative value on half of all humans dying, comparable to extinction.