Manfred comments on Open thread, August 4 - 10, 2014 - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
On the limits of rationality given flawed minds —
There is some fraction of the human species that suffers from florid delusions, due to schizophrenia, paraphrenia, mania, or other mental illnesses. Let's call this fraction D. By a self-sampling assumption, any given person has probability D of being someone who suffers from delusions. D is markedly greater than one in seven billion, since delusional disorders are reported; there is at least one living human suffering from delusions.
Given any sufficiently interesting set of priors, there are some possible beliefs that have a less than D chance of being true. For instance, Ptolemaic geocentrism seems to me to have a less than D chance of being true. So does the assertion "space aliens are intervening in my life to cause me suffering as an experiment."
If I believe that a belief B has a < D chance of being true, and then I receive what I think is strong evidence supporting B, how can I distinguish the cases "B is true, despite my previous belief that it is quite unlikely" and "I have developed a delusional disorder, despite delusional disorders being quite rare"?
For you to rule out a belief (e.g. geocentrism) as totally unbelievable, not only does it have to be less likely than insanity, it has to be less likely than insanity that produces what looks like rational evidence for geocentrism.
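A minimal sketch of this comparison in odds form, with entirely made-up numbers: every probability below (the prior on B, the value of D, and both likelihoods) is a hypothetical chosen only to illustrate the structure of the argument, not an estimate anyone has made.

```python
def posterior_odds(p_b, p_delusion, p_evidence_given_b, p_evidence_given_delusion):
    """Posterior odds of "B is true" versus "I am delusional in a way
    that fakes this evidence", after observing the evidence.
    Each hypothesis is weighted by prior * likelihood."""
    return (p_b * p_evidence_given_b) / (p_delusion * p_evidence_given_delusion)

# Hypothetical numbers for the geocentrism example:
p_b = 1e-12               # prior that geocentrism is true (well below D)
p_delusion = 1e-2         # prior of having a delusional disorder (a stand-in for D)
p_e_given_b = 0.9         # strong evidence is expected if B actually holds
p_e_given_delusion = 1e-4 # chance a delusion fakes this *specific* evidence

odds = posterior_odds(p_b, p_delusion, p_e_given_b, p_e_given_delusion)
print(odds)  # well below 1: "I'm delusional" remains the better explanation
```

With these numbers the posterior odds come out around 9e-7, so the evidence, however convincing it feels, leaves the delusion hypothesis far ahead; B would need a much less dismissive prior, or the delusion would need to be even more specific, for the balance to flip.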
You can test yourself for other symptoms of delusion. One might object, "but I could be deluded about those too"; the answer is that each additional check forces the insanity hypothesis to be more specific and complicated, and therefore less likely.
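The multiplicative force of that reply can be sketched with a toy calculation. The per-check probability here is a pure assumption for illustration: suppose a delusional mind had, hypothetically, a 10% chance of also faking a pass on any one independent sanity check.

```python
def delusion_likelihood(p_pass_per_check, n_checks):
    """Probability that a delusion would also fake passing all
    n independent sanity checks, assuming each check is independent
    and each is passed with probability p_pass_per_check."""
    return p_pass_per_check ** n_checks

# Hypothetical: each independent check has a 10% chance of being
# spuriously passed by a deluded mind.
for n in range(6):
    print(f"{n} checks passed -> delusion likelihood {delusion_likelihood(0.1, n):.0e}")
```

Each extra independent check that comes back clean multiplies down the likelihood of the "I'm deluded about all of these too" hypothesis, which is exactly the sense in which the required insanity becomes more specific and less probable.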