ChrisHallquist comments on Baseline of my opinion on LW topics - Less Wrong

7 Post author: Gunnar_Zarncke 02 September 2013 12:13PM


Comment author: ChrisHallquist 04 September 2013 09:30:29PM 0 points

But I also don't think that this will bring the singularity, because of the complexity limits mentioned above. Strong AI will speed up some cognitive tasks with compound interest - but only until a physical feedback limit is reached, or until a social feedback limit is reached, if AI is designed that way.

Depends on what you mean by "singularity" and, to some extent, "AI." If "artificial general intelligence" means it can do any task a human can, then at minimum you've just removed labor supply as a constraint on economic growth, so the economy can more or less grow as fast as we can build new hardware. Robin Hanson (an economist) has predicted that this would lead to the economy doubling in size every month or so. That isn't as fast a rate of change as Eliezer is predicting, but it still arguably deserves the label "singularity."

Comment author: Gunnar_Zarncke 05 September 2013 09:31:36PM *  0 points

Exponential increase on that scale would be astounding. But it wouldn't be a singularity. Otherwise the growth of human populations - which over long ranges is exponential for basically the same reason - would have to be called a singularity too. The word 'singularity' should be reserved for poles: points where the function goes to infinity or leaves its previously defined range.
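The distinction drawn here can be illustrated numerically (a sketch, not from the thread): exponential growth is finite at every finite time, whereas hyperbolic growth of the form 1/(t_c - t) has a genuine pole - it diverges as t approaches the critical time t_c. The function names and the choice of t_c = 10 are illustrative assumptions.

```python
import math

def exponential(t, rate=1.0):
    # Doubling-type growth: very large eventually, but finite for every finite t.
    return math.exp(rate * t)

def hyperbolic(t, t_c=10.0):
    # Grows like 1 / (t_c - t): a pole (true mathematical singularity) at t = t_c.
    if t >= t_c:
        raise ValueError("t is at or past the singularity at t_c")
    return 1.0 / (t_c - t)

# The exponential curve grows steadily; the hyperbolic curve blows up
# as t approaches t_c = 10, no matter how small the steps taken.
for t in (5.0, 9.0, 9.9, 9.99):
    print(f"t={t:5.2f}  exp={exponential(t):12.1f}  hyp={hyperbolic(t):10.1f}")
```

In this sense an economy doubling every month is "merely" exponential: extraordinary, but with no finite point at which the function leaves its defined range.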