michaelsullivan comments on 2011 Survey Results - Less Wrong

Post author: Yvain | 05 December 2011 10:49AM | 94 points


Comment author: XiXiDu | 04 December 2011 08:12:29PM | 1 point

"Of possible existential risks, the most feared was a bioengineered pandemic, which got 194 votes (17.8%) - a natural pandemic got 89 (8.2%), making pandemics the overwhelming leader."

This doesn't look very good from the point of view of the Singularity Institute. While 38.5% of respondents have read at least 75% of the Sequences, only 16.5% think that unfriendly AI is the most worrisome existential risk.

Is the issue too hard for most people to grasp, or has it so far been badly communicated by the Singularity Institute? Or is it simply the wisdom of crowds?

Comment author: michaelsullivan | 05 December 2011 08:09:28PM | 0 points

The phrasing of the question was quite specific: "Which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100?"

If I estimate a very small probability of either FAI or UFAI arriving before 2100, then I'm not likely to choose UFAI as the disaster "most likely to wipe out greater than 90% of humanity before 2100" if I think there's a solid chance of something else doing so.
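To make that concrete, here is a minimal sketch with hypothetical credences (the numbers and the two-option menu are my own illustration, not survey data): a respondent can regard UFAI as the risk that matters most in the long run and still answer "pandemic", because the question only asks for the single most likely candidate.

```python
# Hypothetical credences for "wipes out >90% of humanity before 2100".
# These numbers are illustrative assumptions, not survey results.
credences = {
    "unfriendly AI": 0.01,           # small chance of UFAI at all by 2100
    "bioengineered pandemic": 0.03,  # higher near-term probability
}

# The survey question asks only for the argmax over candidate disasters,
# not for how worried you are about each one in absolute terms.
most_likely = max(credences, key=credences.get)
print(most_likely)  # -> bioengineered pandemic
```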

Consider that I interpreted the singularity question to mean "if you think there is any real chance of a singularity, then, conditional on it happening, give the year by which you think it has 50% probability", and answered 2350, while thinking that the singularity had less than a 50% probability of happening at all.
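A short sketch of the arithmetic behind that reading (the credences below are hypothetical, chosen only to illustrate the conditional interpretation):

```python
# Hypothetical credences illustrating the conditional reading of the question.
p_singularity_ever = 0.4       # assumed credence that a singularity happens at all
p_by_2350_given_happens = 0.5  # 2350 read as the *conditional* median year

# Unconditional probability of a singularity by 2350 under this reading:
p_by_2350 = p_singularity_ever * p_by_2350_given_happens
print(p_by_2350)  # 0.2 -- so answering "2350" is consistent with thinking
                  # the singularity is less likely than not to happen at all
```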

Yes, Yvain did say to leave it blank if you don't think there will be a singularity. Given the huge uncertainty involved in anyone's prediction of the singularity or any question related to it, I took "don't believe it will happen" to mean that my estimated chance was low enough to not be worth reasoning about the case where it does happen, rather than that my estimate was below 50%.