JoshuaZ comments on What is bunk? - Less Wrong
Right, in truth none of the three versions really hangs together when you look at the arguments, though they are listed in decreasing order of plausibility.
"Our intuitions about change are linear" -- no they aren't, we attach equal significance to equal percentage changes, so our intuition expects steady exponential change.
"Therefore we can predict with fair precision when new technologies will arrive, and when they will cross key thresholds, like the creation of Artificial Intelligence." -- artificial intelligence, along with flying cars, moon bases and a cure for cancer, refutes this idea by its continued nonexistence.
"To know what a superhuman intelligence would do, you would have to be at least that smart yourself." -- my brother's cat can predict that when it meows, he will put out food for it. He cannot predict whether the cat will eat the food.
"Thus the future after the creation of smarter-than-human intelligence is absolutely unpredictable." -- the future has always been unpredictable, so by that definition we have always been in the Singularity.
"each intelligence improvement triggering an average of>1.000 further improvements of similar magnitude" -- knowing whether a change is actually an improvement takes more than just thinking about it.
"Technological progress drops into the characteristic timescale of transistors (or super-transistors) rather than human neurons." -- technological progress is much slower than the characteristic timescale of neurons.
That doesn't mean the Singularity can't exist by some other definition,
"For example, the old Extropian FAQ used to define the “Singularity” as the Inflection Point, “the time when technological development will be at its fastest” and just before it starts slowing down."
but as Eliezer also points out, this definition does not imply any particular conclusions.
The Penrose version of consciousness is an interesting case. It is clearly something Penrose would be disposed to believe even if it were false (he pretty much says so in The Emperor's New Mind) and we have no way to disprove it. Is it an extraordinary claim? I would be inclined to say so, but there might be room for reasonable disagreement on that. So while I think it is false, I'm not sure I would be confident dismissing it as bunk.
You may be putting too much emphasis on what people would be predisposed to believe. While we should correct for our emotional predispositions when evaluating our own probability estimates, a predisposition says nothing substantive about whether a given claim is correct. Tendencies to distort my map in no way impact what the territory actually looks like.
Sure, at the end of the day there is no reliable way to tell truth from falsehood except by thorough scientific investigation.
But the topic at hand is whether, in the absence of the time or other resources to investigate everything, there are guidelines that will do better than random chance in telling us what's promising enough to be worth how much investigation.
While the heuristic about predisposition to believe falls far short of certainty, I put it to you that it is significantly better than random chance -- that in the absence of any other way to distinguish true claims from false ones, you would do quite a bit better by using that heuristic, than by flipping a coin.
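The quantitative claim here -- that even a weak heuristic beats a coin flip -- is easy to check with a toy simulation. This is my own hypothetical illustration, not anything from the thread: it assumes the heuristic classifies claims correctly 60% of the time, a made-up figure chosen only to mean "better than chance."

```python
import random

random.seed(0)
N = 100_000
# Generate claims that are true or false with equal probability.
claims = [random.random() < 0.5 for _ in range(N)]

def heuristic(truth: bool) -> bool:
    # A weak heuristic: correct 60% of the time, wrong otherwise.
    return truth if random.random() < 0.6 else not truth

def coin_flip(_truth: bool) -> bool:
    # Baseline: ignore the claim entirely and flip a fair coin.
    return random.random() < 0.5

heuristic_accuracy = sum(heuristic(t) == t for t in claims) / N
coin_accuracy = sum(coin_flip(t) == t for t in claims) / N
print(f"heuristic: {heuristic_accuracy:.3f}, coin: {coin_accuracy:.3f}")
```

Over many claims the heuristic settles near 0.6 and the coin near 0.5 -- a modest edge per claim, but a large one when deciding which of many claims deserve scarce investigation time.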