This page collects links to criticism of what seems to be the prevailing view on Less Wrong regarding the possibility of hard takeoff and the importance of Friendly AI (FAI).
This thread has a more up-to-date list of critiques.
The Singularity Institute's Scary Idea (and Why I Don't Buy It). The author cites Nick Bostrom as another skeptic of the "scary idea"; a question-and-answer session in which Nick addresses these topics is available here. (Tip for watching video lectures: you may wish to download the video with a browser extension and watch it sped up in VLC.)
John Baez Interviews Eliezer Yudkowsky. (Note: you'll have to scroll to the bottom to read the first part.) Although John finds Eliezer's claims about the dangers of AI plausible, he isn't persuaded to give up environmentalism in favor of working on FAI.