timtyler comments on Intelligence explosion in organizations, or why I'm not worried about the singularity - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
To me, that just sounds like confusion about the relationship between genetic and psychological evolution.
Um, > 1 what? It's easy to make irrefutable predictions when what you say is vague and meaningless.
The point of the article is that if the recursion can feed back on itself strongly enough, then each new insight enables further insights, like neutrons in a uranium chain reaction. "> 1" refers to the average multiplicative improvement a foom-ing AGI gains per insight.
What I was trying to say is that the factor for corporations is much less than 1, which makes them different from an AGI. (To see the effect, try plugging .9^x and then 1.1^x into a calculator.)
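The calculator experiment above can be sketched in a few lines of Python. This is a hypothetical illustration, not anyone's actual model: the function name and the specific factors (0.9 for a corporation-like process, 1.1 for a foom-like one) are assumptions chosen only to show how compounding above versus below 1 diverges.

```python
# Sketch of compounding self-improvement per "insight".
# A per-insight factor above 1 compounds explosively;
# a factor below 1 fizzles toward zero.
def total_capability(factor, insights):
    """Capability after `insights` rounds of improvement,
    each multiplying capability by `factor` (starting from 1)."""
    return factor ** insights

# Sub-critical (corporation-like): gains shrink toward nothing.
print(total_capability(0.9, 50))   # ~0.005

# Super-critical (foom-like): gains compound without bound.
print(total_capability(1.1, 50))   # ~117.4
```

After 50 rounds the two trajectories differ by more than four orders of magnitude, which is the whole force of the "> 1 vs. < 1" distinction.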