CarlShulman comments on The Importance of Self-Doubt - Less Wrong

23 Post author: multifoliaterose 19 August 2010 10:47PM




Comment author: CarlShulman 20 August 2010 02:32:19PM, 8 points

And FAI counts as not "supernatural" how?

In the ordinary sense that Richard Dawkins and James Randi use.

In any case, nuclear war, peak oil, global warming, and overpopulation attracted huge numbers of people who claimed that civilization would end unless this or that was done.

"If we don't continue to practice agriculture or hunting and gathering, civilization will end."

There are plenty of true statements like that. Your argument needs people who claimed both that such-and-such things needed to be done and that they themselves were the ones who were going to do those things. If you list some specific people, you can then identify relevant features that are or are not shared.

Disclaimer: I think near-term efforts to reduce AI risk will probably not be determinative in preventing an existential risk, but they have a non-negligible probability of doing so, which gives them high expected value at the current margin under a number of versions of cost-benefit analysis. This holds even more strongly for individual near-term AI risk efforts.