Strange7 comments on Reply to Holden on The Singularity Institute - Less Wrong

46 Post author: lukeprog 10 July 2012 11:20PM


Comments (213)


Comment author: [deleted] 12 July 2012 04:00:10AM *  -1 points

Lately I've been wondering whether it would make more sense to simply try to prevent the development of AGI rather than work to make it "friendly," at least for the foreseeable future. My thought is that since AGI carries substantial existential risks, developing other innovations first might reduce those risks, and anything we can do to bring about such reductions is worth even enormous costs. In other words, if it takes ten thousand years to develop social or other innovations that would reduce the risk of terminal catastrophe by even 1% when AGI is finally developed, then that is well worth the delay.

Bostrom has mentioned surveillance, information restriction, and global coordination as ways of reducing risk (and I will add space exploration, to make SIRCS), so why not focus on those right now instead of AGI? The same logic applies to advanced nanotechnology and biotechnology. Why develop any of these risky bio- and nanotechnologies before SIRCS? Do we think that effort spent trying to inhibit the development of AGI/bio/nano would be wasted because those technologies are inevitable, or at least so difficult to derail that "friendly" AI is our best shot? If so, where has a detailed argument for this been made? Can someone point me to it? Or maybe we think SIRCS (especially surveillance) cannot be adequately developed without AGI/bio/nano? But surely global coordination and information restriction do not depend much on technology, so even without the surveillance and with limited space exploration, it still makes sense to further the others as much as possible before finally proceeding with AGI/bio/nano.

Comment author: Strange7 14 July 2012 06:09:03AM 2 points

But surely global coordination and information restriction do not depend much on technology,

Please, oh please, think about this for five minutes. Coordination cannot happen without communication, and global communication depends very much on technology.

Comment author: wedrifid 14 July 2012 09:46:26AM 1 point

Coordination cannot happen without communication

Not technically true. True enough for humans though.

Comment author: [deleted] 14 July 2012 06:30:48AM *  0 points

Well, I agree that it is not as obvious as I made it out to be. However, for this purpose it suffices to note that these innovations/social features could be greatly furthered without further technological advances.