XiXiDu comments on The elephant in the room, AMA - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (428)
You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined. Why should I listen to you? Especially since, if you do start thinking coherently without discarding the absurd premise, it will lead you to do and advocate things that are potentially significantly detrimental to my goals.
To make it easier to answer, we could rephrase the question in the third person: "Wedrifid believes fundamental premise X. Calcsam has a very different fundamental premise Y, which gives him different goals and different conclusions. This being the case, how should wedrifid respond to behavioural exhortations given by calcsam on a rationalist blog? If wedrifid believed that all calcsam's reasoning was sound except that which produced belief Y, how would that change wedrifid's incentives?"
('Why should I listen to you?' is still the basic question. The above just gives background detail to how it is relevant.)
People who hold obviously incorrect beliefs can still be highly intelligent and productive:
There are many more examples. All of them are outliers, indeed, and I don't think that calcsam has been able to prove that his achievements and general capability to think clearly in some fields outweigh the heavy burden of being religious. Yet there is evidence that such people do exist, and he offers you the chance to challenge him.
Generally I agree with you, but I also think that calcsam provides a fascinating example of the internal dichotomy of some human minds and a case study that might provide insights to how the arguments employed by Less Wrong fail in some cases.
And one of the concerns I detected in wedrifid's comment (one I share myself) is that if highly intelligent and productive people start doing what obviously incorrect beliefs indicate they should, the world is going to be optimised in a direction I won't like.
I kind of think that's already happening. All over the place. All the time. What kind of policy implications did you want to draw from it in this particular instance?
Hmm, what policy...
No amount of clear thinking elsewhere can excuse you from being wrong about this one thing. To think so is to treat being right and wrong like a social game, where people with high status get a free pass on questions with actual answers.
Could you please be more specific? What sort of action is being taken here as a result of your worry?
Not voting for religious candidates for Australian Parliament elections.
My inclination would be to discourage posts with undertones of religious propaganda on this site.
Exactly! If beliefs like this are just used as verbal symbols for navigating the social world, they do relatively minor harm. But once someone with the necessary intelligence, productivity and otherwise rational thinking comes along and follows the belief to its logical conclusion, things start exploding. Or rationalist communities become modified in a direction that makes them either less pleasant or less effective than I would prefer.
I think lists of this kind should always include Donald E. Knuth.
Maybe we should make a list on the wiki? eg. I'm tempted to add Aumann, but as pointed out, 'There are many more examples' and XiXiDu made his point with the short list.
I made the list at http://wiki.lesswrong.com/wiki/Irrationalists
More suggestions welcome. I think I'm going to make a Discussion article on this to get a little more visibility.
I don't think that examples of people with fundamental, irrational beliefs being good at other things are relevant - calcsam has invited questions specifically about the belief whose rationality is being examined. If he were starting a discussion about mathematics and his points were dismissed due to his Mormon affiliation, your comment would make more sense to me.
Good reminder that reversed stupidity is not intelligence.
Adding to the list: Hans Berger invented the EEG while trying to investigate telepathy, which he was convinced was real. Even fools can make important discoveries.
But increasing one's foolishness does not increase the expected rate of discovery.
I think though that holding crazy beliefs is Bayesian evidence for the hypothesis that a person is not a remarkable intellectual contributor to humanity. Wedrifid's "why should I listen to you?" is thus not addressed head-on by a list of crazy people who happened to achieve other worthy stuff.
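The Bayesian-evidence claim can be made concrete with a toy calculation. All the numbers below are hypothetical, chosen only to illustrate the direction and rough size of the update, not to estimate anything real:

```python
# Hypothetical numbers, purely illustrative: suppose 1% of people are
# "remarkable intellectual contributors", and remarkable contributors are
# half as likely as everyone else to hold a given crazy belief.
p_remarkable = 0.01
p_belief_given_remarkable = 0.25
p_belief_given_ordinary = 0.50

# Total probability of observing the belief.
p_belief = (p_belief_given_remarkable * p_remarkable
            + p_belief_given_ordinary * (1 - p_remarkable))

# Bayes' rule: P(remarkable | belief).
posterior = p_belief_given_remarkable * p_remarkable / p_belief

print(posterior)  # ~0.005: observing the belief roughly halves the prior
```

Under these made-up numbers the belief is evidence against remarkableness, but only a factor-of-two update: it doesn't rule out the outliers on the list, which is the point being made.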
If we had no other information about calcsam besides eir religious beliefs, and e were only one of many people potentially worth listening to, and we were processing those many in bulk to try to decide which of them to investigate more closely (at greater expense), then this would be a useful low-cost filter.
However, I don't think it's enough evidence to overcome the other things we do know about em: that e's posting on LW, that e's responding in a generally clear and intelligent manner, etc.
A policy of ignoring people who disagree with you seems like a good way to never notice that you're wrong. And you are wrong -- not necessarily about this particular question, but of all the things you believe there's pretty much guaranteed to be at least one false idea. I'd even go so far as to say that there's probably at least one very important wrong idea in there.
In my opinion, listening to people like calcsam -- intelligent people who disagree with me -- is one of the most plausible vectors for finding out that I'm wrong about something.