
IlyaShpitser comments on Heading off a near-term AGI arms race - Less Wrong Discussion

7 Post author: lincolnquirk 22 August 2012 02:23PM



Comment author: IlyaShpitser 22 August 2012 05:02:38PM *  2 points [-]

I don't think you quite understand the hammer that will come down if anything comes of your questions. Nothing of what you built will be left. I don't think many non-illegal sabotage avenues are open to this community. You can't easily influence the tenure process, and hiring the best researchers is notoriously difficult, even for very good universities/labs.


Re: OP, I think you are worried over nothing.

Comment author: palladias 22 August 2012 05:27:35PM 0 points [-]

That's why I asked whether Less Wrongers would prefer SI to devote more of its time to slowing down other people's unfriendly AI relative to how much time it spends constructing FAI. I agree that SI staff shouldn't answer.

Comment author: IlyaShpitser 22 August 2012 05:34:36PM *  9 points [-]

I think any sequence of events that leads to anyone at all in any way associated with either lesswrong or SI doing anything to hinder any research would be a catastrophe for this community. At best, you will get a crank label (more than now, that is), at worst the FBI will get involved.

Comment author: Xachariah 22 August 2012 07:55:59PM -2 points [-]

Yes. It's much better to tile the universe with paperclips than to have this community looked on poorly. How ever could he have gotten his priorities so crossed?

Comment author: Epiphany 25 August 2012 02:02:10AM *  -1 points [-]

If there is a big enough AI project out there, especially one that will be released as freeware, others won't work on competing projects. Doing so would be high-risk and offer a low return on investment.

Three ideas to prevent unfriendly AGI (Scroll to "Help good guys beat the arms race")

Also, I think my other two risky-AGI-deterring ideas are doable simultaneously. I'm not sure how many people it would take to get those moving on a large enough scale, but it's probably nowhere near as many as it would take to build a friendly AGI.

Comment author: Epiphany 25 August 2012 01:34:19AM *  -1 points [-]