JoshuaZ comments on The Curve of Capability - Less Wrong

18 points · Post author: rwallace 04 November 2010 08:22PM




Comment author: rwallace 05 November 2010 04:38:50AM -1 points [-]

-sigh-

This crap goes on year after year, decade after bloody decade. Did you know the Singularity was supposed to happen in 2000? Then in 2005. Then in 2010. Guess how many Singularitarians went "oh hey, our predictions keep failing, maybe that's evidence our theory isn't actually right after all"? If you guessed none at all, give yourself a brownie point for an inspired guess. It's like the people who congregate on top of a hill waiting for the angels or the flying saucers to take them up to heaven. They just go "well our date was wrong, but that doesn't mean it's not going to happen, of course it is, Real Soon Now." Every time we actually try to do any recursive self-improvement, it fails to do anything like what the AI foom crowd says it should do, but of course, it's never "well, maybe recursive self-improvement isn't all it's cracked up to be," it's always "your faith wasn't strong enough," oops, "you weren't using enough of it," or "that's not the right kind" or some other excuse.

That's what I have to deal with, and when I asked you for a prediction, you gave me the usual crap about oh well you'll see when the Apocalypse comes and we all die, ha ha. And that's the most polite terms I'm willing to put it in.

I've made it clear how my theory can be falsified: demonstrate recursive self-improvement doing something beyond the curve of capability. Doesn't have to be taking over the world, just sustained improvement beyond what my theory says should be possible.

If you're willing to make an actual, sensible prediction of RSI doing something, or some other event (besides the Apocalypse) coming to pass, such that if it fails to do that, you'll agree your theory has been falsified, great. If not, fine, I'll assume your faith is absolute and drop this debate.

Comment author: JoshuaZ 05 November 2010 04:41:24AM 4 points [-]

So, I'm vaguely aware of Singularity claims for 2010. Do you have citations for people making such claims that it would happen in 2000 or 2005?

I agree that pushing something farther and farther into the future is a potential warning sign.

Comment author: timtyler 05 November 2010 09:49:56AM *  7 points [-]

In "The Maes-Garreau Point," Kevin Kelly lists poorly-referenced predictions of "when they think the Singularity will appear" of 2001, 2004 and 2005 -- by Nick Hogard, Nick Bostrom and Eliezer Yudkowsky respectively.

Comment author: steven0461 05 November 2010 07:49:08PM *  5 points [-]

I agree that pushing something farther and farther into the future is a potential warning sign.

But only a potential warning sign -- fusion power is always 25 years away, but so is the decay of a Promethium-145 atom.

Comment author: JoshuaZ 05 November 2010 09:00:59PM 3 points [-]

But only a potential warning sign -- fusion power is always 25 years away, but so is the decay of a Promethium-145 atom.

Right, but we expect that for the promethium atom. If physicists had predicted that a certain radioactive sample would decay within a fixed time, and they kept pushing back the date when it would happen without altering their hypotheses at all, I'd be very worried about the state of physics.

Comment author: rwallace 05 November 2010 04:52:48AM 0 points [-]

Not off the top of my head, which is one reason I didn't bring it up until I got pissed off :) I remember a number of people predicting 2000 over the last decades of the 20th century; I think Turing himself was one of the earliest.

Comment author: JoshuaZ 05 November 2010 04:57:21AM *  4 points [-]

To my knowledge, Turing never discussed anything much like a Singularity. What you may be thinking of is that in his original article proposing the Turing Test, written in 1950, he said he expected it would take around fifty years for machines to pass the test. But that is not the same claim as a Singularity occurring in 2000: Turing was off about when we'd have AI, but as far as I know he never commented on anything like a Singularity.

Comment author: rwallace 05 November 2010 05:02:06AM 0 points [-]

Ah, that's the one I'm thinking of -- he didn't comment on a Singularity, but did predict human level AI by 2000. Some later people did, but I didn't save any citations at the time and a quick Google search didn't find any, which is one of the reasons I'm not writing a post on failed Singularity predictions.

Comment author: steven0461 05 November 2010 07:27:20PM 3 points [-]

Another reason, hopefully, is that there has always been a wide range of predictions, which leaves a lot of room for proving points by being selective about which ones to highlight. And even if you looked at all predictions, there are selection effects: the ones that were repeated, or even stated in the first place, tend to be the more extreme ones.