
Gurkenglas comments on Yet More "Stupid" Questions - Less Wrong Discussion

Post author: NancyLebovitz 08 September 2013 02:18PM




Comment author: Gurkenglas 09 September 2013 05:49:27AM

When any 10% speedup takes a constant amount n of computations, you get, for the computational speed f, the approximating differential equation f' = 0.1f / (n/f) = 0.1f^2/n: the speed grows by 0.1f (the 10% increase) over a time of n/f (the time needed to perform those n computations at speed f).

This diverges in finite time. Where are you getting exponential growth from?
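The finite-time divergence can be made explicit by solving the equation with separation of variables (a standard computation, sketched here in the comment's own symbols):

```latex
% Solve f' = 0.1 f^2 / n by separation of variables:
\frac{df}{f^2} = \frac{0.1}{n}\,dt
\quad\Longrightarrow\quad
\frac{1}{f(0)} - \frac{1}{f(t)} = \frac{0.1}{n}\,t
\quad\Longrightarrow\quad
f(t) = \frac{f(0)}{\,1 - \frac{0.1\,f(0)}{n}\,t\,},
% which blows up at the finite time t^* = n / (0.1 f(0)) = 10n / f(0).
```

So under the constant-cost assumption, f reaches infinity at time t* = 10n/f(0) rather than growing exponentially.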

Comment author: John_Maxwell_IV 09 September 2013 06:10:16AM

When any speedup of 10% takes a constant amount n of computations

I didn't make this assumption--my model assumes that increasing the brainpower of an already-very-smart intelligence by 10% would be harder for a human AI researcher than increasing the brainpower of a pretty-dumb intelligence by 10%. It is an interesting assumption to consider, however.

Anyway, exponential growth is for quantities that grow at a rate directly proportional to the quantity itself. So if you can improve your intelligence at a rate that's a constant multiple of how smart you are, then we'd expect to see your intelligence grow exponentially. Given data from humans trying to build AIs, we should expect this constant multiple to be pretty low. If you want a somewhat more detailed justification, you can take a stab at reading my original essay on this topic (warning: has some bad/incorrect ideas; read the comments).
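The two growth laws being debated can be contrasted with a minimal numerical sketch (parameter values are illustrative, not from the thread): Gurkenglas's constant-cost model f' = 0.1 f^2/n, which diverges in finite time, versus growth at a rate proportional to the quantity, f' = k f, which is merely exponential. Both start at f = 1 with matched initial growth rates.

```python
def euler(f0, rate, dt, steps):
    """Integrate f' = rate(f) with forward Euler; return the trajectory."""
    f, traj = f0, [f0]
    for _ in range(steps):
        f += rate(f) * dt
        traj.append(f)
    return traj

n = 100.0   # computations per 10% speedup (illustrative)
k = 0.1 / n # exponential-model rate, chosen to match the initial growth rate

# Integrate up to t = 900, safely before the hyperbolic blowup at t = 10n = 1000.
hyperbolic = euler(1.0, lambda f: 0.1 * f * f / n, dt=0.1, steps=9000)
exponential = euler(1.0, lambda f: k * f, dt=0.1, steps=9000)

# Closed forms for comparison: hyperbolic f(t) = 1 / (1 - 0.001 t), so
# f(900) ~ 10; exponential f(t) = e^(0.001 t), so f(900) ~ 2.46.
print(hyperbolic[-1], exponential[-1])
```

Even well before the singularity, the constant-cost model has pulled far ahead of the exponential one; the qualitative difference is between finite-time divergence and unbounded-but-exponential growth.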