
FOOM Articles

3 Post author: fowlertm 05 March 2015 09:32PM

I'd like recommendations for articles dealing with slow and hard takeoff scenarios. I've already found Yudkowsky's post "Hard Takeoff"; I know Superintelligence has a section on it, and I think the Yudkowsky/Hanson debate mostly dealt with it.

Is there anything else?

Comments (4)

Comment author: lukeprog 05 March 2015 09:58:13PM 4 points

Besides Superintelligence, the latest "major" publication on the subject is Yudkowsky's Intelligence Explosion Microeconomics. There are also a few articles related to the topic at AI Impacts.

Comment author: fowlertm 06 March 2015 02:41:37AM 3 points

Both unknown to me, thanks :)

Comment author: owencb 06 March 2015 11:43:27AM 2 points

I found Intelligence Explosion Microeconomics less helpful for thinking about this than some older MIRI papers:

Comment author: Houshalter 06 March 2015 08:41:12AM 2 points