shokwave comments on The Curve of Capability - Less Wrong
-sigh-
This crap goes on year after year, decade after bloody decade. Did you know the Singularity was supposed to happen in 2000? Then in 2005. Then in 2010. Guess how many Singularitarians went "oh hey, our predictions keep failing, maybe that's evidence our theory isn't actually right after all"? If you guessed none at all, give yourself a brownie point for an inspired guess. It's like the people who congregate on top of a hill waiting for the angels or the flying saucers to take them up to heaven. They just go "well our date was wrong, but that doesn't mean it's not going to happen, of course it is, Real Soon Now." Every time we actually try to do any recursive self-improvement, it fails to do anything like what the AI foom crowd says it should do, but of course, it's never "well, maybe recursive self-improvement isn't all it's cracked up to be," it's always "your faith wasn't strong enough," oops, "you weren't using enough of it," or "that's not the right kind" or some other excuse.
That's what I have to deal with, and when I asked you for a prediction, you gave me the usual crap about "oh, well, you'll see when the Apocalypse comes and we all die, ha ha." And that's putting it in the politest terms I'm willing to use.
I've made it clear how my theory can be falsified: demonstrate recursive self-improvement doing something beyond the curve of capability. Doesn't have to be taking over the world, just sustained improvement beyond what my theory says should be possible.
If you're willing to make an actual, sensible prediction of RSI doing something, or of some other event (besides the Apocalypse) coming to pass, such that if the prediction fails you'll agree your theory has been falsified, great. If not, fine; I'll assume your faith is absolute and drop this debate.
That the Singularity concept pattern-matches doomsday cults is nothing new to anyone here. You looked further into it and declared it false; wedrifid and others looked into it and declared it possible. The discussion is now about the evidence between those two points of view. Repeating that it looks like a doomsday cult is a step backwards, back to where this discussion started.
rwallace's argument doesn't center on the standard argument that makes it look like a doomsday cult. He's focusing on an apparent history of repeated predictions without updating when those predictions failed. That's different from the standard claim about why Singularitarianism pattern-matches doomsday cults, and it should, to a Bayesian, be fairly disturbing if he is correct about such a history.
Fair enough. I guess his rant pattern-matched the usual anti-doomsday-cult stuff I see involving the Singularity. Keep in mind, though, that a Bayesian can discount the people making the predictions instead of the likelihood of the event. That is certainly what I have done: I put less weight on predictions, even from people I trust to reason well, because a history of failed predictions has taught me not that the predicted events don't happen, but that predictions are full of crap. This has the converse effect of greatly reducing the value of (in hindsight) correct predictions, which guards against a pretty common failure mode in a lot of belief systems: treating a single correct prediction as sufficient evidence. I would require the process by which the prediction was produced to predict correctly on a consistent basis.
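A minimal sketch of that updating rule (not from the thread; every prior and likelihood below is a made-up number chosen only to illustrate the shape of the update): treat the predictor's reliability and the event itself as separate hypotheses, and update the joint posterior on each failed dated prediction. Once most of the prior mass already sits on "the predictor is unreliable," further failures mostly confirm that and barely move the probability of the event.

```python
# Toy Bayesian update over joint hypotheses:
# (predictor is reliable?, event will eventually happen?)
# All numbers are illustrative assumptions, not measured values.

prior = {
    (True, True): 0.05,   # a history of bad predictions has already pushed
    (True, False): 0.05,  # most prior mass onto "predictor is unreliable"
    (False, True): 0.45,
    (False, False): 0.45,
}

# P(a dated prediction of the event fails | hypothesis): an unreliable
# predictor misses about as often whether or not the event is ever coming,
# so its failures say little about the event itself.
p_fail = {
    (True, True): 0.4,    # reliable predictor, real event: dates are still hard
    (True, False): 0.95,  # reliable predictor, no event: almost always fails
    (False, True): 0.9,   # unreliable predictor: high failure rate either way
    (False, False): 0.9,
}

posterior = dict(prior)
for _ in range(3):  # observe three failed dated predictions
    posterior = {h: posterior[h] * p_fail[h] for h in posterior}
    total = sum(posterior.values())
    posterior = {h: p / total for h, p in posterior.items()}

p_reliable = sum(p for (r, _), p in posterior.items() if r)
p_event = sum(p for (_, e), p in posterior.items() if e)
print(f"P(predictor reliable): {p_reliable:.2f}")  # 0.10 -> ~0.07
print(f"P(event eventually):   {p_event:.2f}")     # 0.50 -> ~0.47
```

Under these assumptions the failures land almost entirely on the predictor's credibility, not on the event, and by symmetry a lone correct prediction from a suspected-unreliable predictor buys only a modest update in the other direction, which is the converse effect described above.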