Eliezer_Yudkowsky comments on Do Earths with slower economic growth have a better chance at FAI? - Less Wrong

Post author: Eliezer_Yudkowsky 12 June 2013 07:54PM




Comment author: Eliezer_Yudkowsky 12 June 2013 10:40:36PM 3 points

Are these high-IQ folk selectively working on FAI rather than AGI to a sufficient degree to make up for UFAI's inherently greater parallelizability?

EDIT: Actually, smarter researchers probably confer a greater relative advantage on FAI than on UFAI, beyond even the differences in serial depth of cognition, so it's hard to see how this could realistically be bad. Reversal test: dumber researchers everywhere would not help FAI over UFAI.

Comment author: James_Miller 12 June 2013 11:12:32PM 0 points

I'm not sure; this would depend on their personalities. But you might learn a lot about their personalities while they were still too young to be effective programmers. In one future Earth you might trust them and hope for enough time for them to come of age, whereas in another you might be desperately trying to create a foom before they overtake you.

Hopefully, much of the variance in human intelligence comes down to genetic load; having a low genetic load often makes you an all-around great and extremely smart person, someone like William Marshal, and we will soon be able to create babies with extremely low genetic loads. If this is to be our future, we should probably hope for slow economic growth.