Eliezer_Yudkowsky comments on A question of rationality - Less Wrong

4 Post author: mormon2 13 December 2009 02:37AM

Comment author: Eliezer_Yudkowsky 15 December 2009 12:55:25AM 3 points

Michael Vassar is much, much better at the H.R. thing. We still have H.R. problems but could now actually expand at a decent clip given more funding.

Unless you're talking about working directly on the core FAI problem, in which case, yes, we have a huge H.R. problem. The phrasing above might sound somewhat misleading: it's not that I hired people for A.I. research and they failed at once, or that I couldn't find anyone above the level of basic stupid failures. Rather, it takes a lot more than being beyond the basic stupid failures to avoid clever failures and actually get stuff done, and the basic stupid failures give you some idea of the baseline level of competence beyond which we need some number of standard deviations.

Comment author: CronoDAS 15 December 2009 07:45:33AM *  1 point

Yeah, sorry for phrasing it wrong. I guess I should have said:

Eliezer had a really difficult time finding anyone to hire as an assistant/coworker at SIAI who didn't immediately suggest something really, really stupid when told about what they were working on.

And yes, I did mean that you had trouble finding people to work directly on the core FAI problem.