Bugmaster comments on Intelligence explosion in organizations, or why I'm not worried about the singularity - Less Wrong

Post author: sbenthall 27 December 2012 04:32AM




Comment author: gwern 28 December 2012 04:55:12PM 1 point

> Um. If your "fundamental law" has all these exceptions, that's a good hint that maybe it isn't as fundamental as you thought. The law of gravity doesn't have exceptions. And no, it's not always better to "have the law". Sometimes it is, for practical reasons, and sometimes it's better to devise a better law that doesn't give you so many false positives.

You're missing the point too. Even gravity has exceptions - yes, really, this is a standard topic in philosophy of science because the Laws Of Gravity are so clear, yet in practice they are riddled with exceptions and errors. We have errors so large that Newtonians were forced to postulate entire planets to explain them (not all of which turned out as well as Uranus, Neptune, and Pluto), we have errors which took centuries to be winkled out, and of course errors like Mercury which ultimately could be explained only by an entirely new theory.

And we're talking about real-world statistics: has there ever been a sociology, economics, or biological allometry paper where every single data point was predicted perfectly without any error whatsoever? (If you think this, then perhaps you should consult Tukey and Cohen on how 'the null hypothesis is always false'.)

> If we ignore all of that, we get superlinear scaling; but my guess is that if we include it, we would get sublinear scaling as usual -- in terms of overall economic output per single human.

Absolutely; if you measure in certain ways, diminishing returns has clearly set in for humanity. And yet, compared to hunter-gatherers, we might as well be a Singularity.

What does this tell you about the relevance of diminishing returns to Singularity discussions? (Chalmers's Singularity paper deals with this very question, IIRC, if you are interested in a pre-existing discussion.)

Comment author: Bugmaster 01 January 2013 12:38:11PM 1 point

> Even gravity has exceptions - yes, really, this is a standard topic in philosophy of science because the Laws Of Gravity are so clear, yet in practice they are riddled with exceptions and errors

In addition to what the others said on this thread, I'd like to say that my main problem was with the author's attitude, not the accuracy of his proposed law -- though the fact that it apparently has glaring holes in it doesn't really help. When you discover that your law has huge exceptions (such as "all crustaceans" or "Mercury"), the thing to do is to postulate hidden planets, or discover relativity, or introduce a term representing dark energy, or something. The thing not to do is to say, "oh well, every law has exceptions, this is good enough for me, case closed! Let's pretend that crustaceans don't exist, we're done."

> And we're talking about real-world statistics: has there ever been a sociology, economics, or biological allometry paper where every single data point was predicted perfectly without any error whatsoever?

I'm not sure what you're referring to; of course no one expects any line to have a correlation of 1.0 at all times. That'd be silly. However, it is almost equally silly to take a few data points and extrapolate them far into the future without any concern for what you're doing. Ultimately, you can draw a straight line through any two points, but that doesn't mean a child will be over 5 m tall at age 20 just because he grew 25 cm in a year.
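To make the arithmetic of that example explicit, here is a toy two-point extrapolation (hypothetical numbers: 100 cm at age 5, 125 cm at age 6):

```python
# Naive linear extrapolation from just two data points.
# Hypothetical child: 100 cm at age 5, 125 cm at age 6 (grew 25 cm in a year).

def linear_extrapolate(x0, y0, x1, y1, x):
    """Fit a straight line through (x0, y0) and (x1, y1), evaluate it at x."""
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (x - x0)

height_at_20 = linear_extrapolate(5, 100, 6, 125, 20)
print(height_at_20)  # prints 475.0 -- nearly 5 m tall, obviously absurd
```

The line fits the two observed points perfectly; the failure is entirely in assuming the trend continues unchanged for fourteen more years.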

> Absolutely; if you measure in certain ways, diminishing returns has clearly set in for humanity. And yet, compared to hunter-gatherers, we might as well be a Singularity.

How so? Perhaps more importantly, if "diminishing returns has clearly set in for humanity", as you say, then what does that tell you about our prospects of bringing about the actual Singularity?

Comment author: gwern 01 January 2013 06:52:40PM 0 points

> In addition to what the others said on this thread, I'd like to say that my main problem was with the author's attitude, not the accuracy of his proposed law -- though the fact that it apparently has glaring holes in it doesn't really help. When you discover that your law has huge exceptions (such as "all crustaceans" or "Mercury"), the thing to do is to postulate hidden planets, or discover relativity, or introduce a term representing dark energy, or something. The thing not to do is to say, "oh well, every law has exceptions, this is good enough for me, case closed! Let's pretend that crustaceans don't exist, we're done."

Well, that's useful advice to the Newtonians, alright - 'hey guys, why did you let the Mercury anomaly linger for decades/centuries? All you had to do was invent relativity! Just ask Bugmaster!'

I wasn't aware West had retired and was eagerly awaiting his Nobel phone call.

> However, it is almost equally silly to take a few data points and extrapolate them far into the future without any concern for what you're doing. Ultimately, you can draw a straight line through any two points, but that doesn't mean a child will be over 5 m tall at age 20 just because he grew 25 cm in a year.

Why do you think the existing dataset is analogous to your silly example?

> How so? Perhaps more importantly, if "diminishing returns has clearly set in for humanity", as you say, then what does that tell you about our prospects of bringing about the actual Singularity?

Not much.

Comment author: Bugmaster 02 January 2013 08:45:05PM 0 points

> Well, that's useful advice to the Newtonians, alright - 'hey guys, why did you let the Mercury anomaly linger for decades/centuries? All you had to do was invent relativity! Just ask Bugmaster!'

There's a difference between acknowledging the problems with your "fundamental law" (once they become apparent, of course) but failing to fix them for decades or centuries, vs. boldly ignoring them because "all laws have exceptions, them's the breaks". It's possible that West is not doing the latter, but the article does imply that this is the case.

> Why do you think the existing dataset is analogous to your silly example?

Which dataset are you talking about? If you mean the growth of cities, then see below.

> How so? Perhaps more importantly, if "diminishing returns has clearly set in for humanity", as you say, then what does that tell you about our prospects of bringing about the actual Singularity?
>
> Not much.

Why not? If humanity's productive output has recently (relatively speaking) reached the point of diminishing returns, then a) we can no longer extrapolate the growth of productivity in cities by assuming past trends will continue indefinitely, and b) this does not bode well for the Singularity, which would entail exponential growth of productivity, free of any diminishing returns.

Comment author: gwern 06 January 2013 04:08:46AM 0 points

> It's possible that West is not doing the latter, but the article does imply that this is the case.

It didn't sound like that to me. It sounded like some people had absurd standards for scaling phenomena, and he was rightly dismissing them.

> If humanity's productive output has recently (relatively speaking) reached the point of diminishing returns,

There's nothing recent about it. Diminishing returns is a pretty general phenomenon which shows up in most periods: Tainter documents examples in many ancient settings, and we can find datasets suggesting diminishing returns in the West from long ago. For example, IIRC Murray finds that once you adjust for population growth, scientific achievement has been falling since the 1890s or so.

> then a) we can no longer extrapolate the growth of productivity in cities by assuming past trends will continue indefinitely, and b) this does not bode well for the Singularity, which would entail exponential growth of productivity, free of any diminishing returns.

It doesn't bode much of anything; I referred you to my list of 'what diminishing returns does not imply' for a reason: #1-4 are directly relevant. Diminishing returns does not mean no exponential growth; it does not mean no regime changes, massive accomplishments, breakthroughs, or new technologies. It just means diminishing returns: an observation about how much output the latest unit of input produced compared to the previous unit, nothing more and nothing less.
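The compatibility of diminishing returns with exponential growth can be seen in a toy model (all numbers hypothetical): suppose total output scales as input^0.75, so every additional unit of input yields less than the one before, while input itself doubles each period.

```python
# Toy model: sublinear output (diminishing marginal returns) alongside
# exponentially growing input. Hypothetical exponent 0.75 chosen purely
# for illustration.

def output(total_input):
    return total_input ** 0.75  # sublinear: diminishing marginal returns

inputs = [2 ** t for t in range(6)]          # input doubles each period
outputs = [output(x) for x in inputs]

# Marginal output per extra unit of input, period over period.
marginal = [(outputs[i] - outputs[i - 1]) / (inputs[i] - inputs[i - 1])
            for i in range(1, len(inputs))]

# The return on each additional unit of input falls every period...
assert all(marginal[i] < marginal[i - 1] for i in range(1, len(marginal)))
# ...yet total output still multiplies by a constant factor (2**0.75, about
# 1.68x) per period -- that is, it grows exponentially.
assert all(outputs[i] / outputs[i - 1] > 1.5 for i in range(1, len(outputs)))
```

Marginal returns shrink monotonically, yet cumulative output explodes; the two claims are simply about different quantities.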

This is obvious if you take Tainter or Murray or any of the results showing diminishing returns in the past centuries, since those are precisely the centuries in which humanity has done the most extraordinarily well! One could say, with equal justice, that 'this does not bode well' for the 20th century. One could say with equal justice in 1950 that diminishing returns boded poorly for the computer industry: not only did chip-fab prices keep increasing ('Moore's second law'), computing power was visibly suffering diminishing returns as it was applied to more and more worthless problems - where once it was used on problems of vital national value (crucial to the survival of the free world and all that is good) worth billions, such as artillery tables and H-bomb simulations, now it was being wasted on grad students and businesses.