HughRistik comments on Dissenting Views - Less Wrong
I've also had mixed feelings about the concept of being "less wrong." Anyone else?
Of course, it is harder to identify and articulate what is right than what is wrong: we know many ways of thinking that lead away from truth, but it is harder to know which ways of thinking lead toward it. So the phrase "less wrong" might merely be an acknowledgment of fallibilism. All our ideas are riddled with mistakes, but it's possible to make fewer mistakes, or less egregious ones.
Yet "less wrong" and "overcoming bias" sound kind of like "playing to not lose," rather than "playing to win." There is much more material on these projects about how to avoid cognitive and epistemological errors than about how to achieve cognitive and epistemological successes. Eliezer's excellent post on underconfidence might help us protect an epistemological success once we somehow find one, even from our own great knowledge of biases, yet the debiasing program of LessWrong and Overcoming Bias is not optimal for showing us how to achieve such successes in the first place.
The idea might be that if we run as fast as we can away from falsehood, and look over our shoulder often enough, we will eventually run into the truth. Yet without any basis for moving towards the truth, we will probably just run into even more falsehood, because there are exponentially more possible crazy thoughts than sane thoughts. Process of elimination is really only good for solving certain types of problems, where the right answer is among our options and the number of false options to eliminate is finite and manageable.
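To make the point concrete, here is a minimal sketch (my own illustration, not from the original discussion) of a toy search problem: guessing a hidden n-bit string. Pure elimination must rule out, on average, about half of the 2^n candidates, while even a crude heuristic that rewards partial matches finds the answer in n directed steps. The function names (`eliminate`, `hill_climb`, `score`) are hypothetical labels for the two strategies.

```python
import random

def score(guess, target):
    """Heuristic feedback: how many positions already match."""
    return sum(g == t for g, t in zip(guess, target))

def eliminate(target, candidates):
    """Elimination only: rule out wrong candidates one by one."""
    steps = 0
    for guess in candidates:
        steps += 1
        if guess == target:
            break
    return steps

def hill_climb(target):
    """Directed search: flip each bit, keep flips the score rewards."""
    guess = ["0"] * len(target)
    for i in range(len(guess)):
        trial = guess.copy()
        trial[i] = "1" if trial[i] == "0" else "0"
        if score(trial, target) > score(guess, target):
            guess = trial
    return "".join(guess)

n = 12
target = "".join(random.choice("01") for _ in range(n))
candidates = [format(i, f"0{n}b") for i in range(2 ** n)]
random.shuffle(candidates)

print(eliminate(target, candidates))  # typically on the order of 2**(n-1) steps
print(hill_climb(target) == target)   # True, after only n directed passes
```

The elimination search scales exponentially with n; the heuristic search scales linearly, because each step moves toward the target rather than merely away from one more falsehood.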
If we are in search of a Holy Grail, we need a better plan than being able to identify all the things that are not the Holy Grail. Knowing that an African swallow is not a Holy Grail will certainly keep us from erroneously mistaking a bird for the true Holy Grail, but it tells us absolutely nothing about where to actually look.
The ultimate way to be "less wrong" is radical skepticism. As a fallibilist, I am fully aware that we may never know when or if we are finding the truth, but I do think we can use heuristics to move towards it, rather than merely trying to move away from falsehood and hoping we bump into the truth backwards. That's why I've been writing about heuristics here and here, and why I am glad to see Alicorn writing about heuristics to achieve procedural knowledge.
For certain real-world projects that shall-not-be-named to succeed, we will need to have some great cognitive and epistemological successes, not merely avoid failures.
At least one person seems to think that this post is in error, and I would very much like to hear what might be wrong with it.