Dorikka comments on 2011 Survey Results - Less Wrong

Post author: Yvain 05 December 2011 10:49AM 94 points




Comment author: XiXiDu 04 December 2011 08:12:29PM 1 point

Of possible existential risks, the most feared was a bioengineered pandemic, which got 194 votes (17.8%) - a natural pandemic got 89 (8.2%), making pandemics the overwhelming leader.

This doesn't look very good from the point of view of the Singularity Institute. While 38.5% of respondents have read at least 75% of the Sequences, only 16.5% think that unfriendly AI is the most worrisome existential risk.

Is the issue too hard for most people to grasp, or has it so far been badly communicated by the Singularity Institute? Or is it simply the wisdom of crowds?

Comment author: Dorikka 05 December 2011 01:00:06AM 4 points

It's more that I think there's a significant chance we're going to get blown up by nukes or a bioweapon before then.