wedrifid comments on The elephant in the room, AMA - Less Wrong

22 Post author: calcsam 12 May 2011 02:59PM


Comment author: wedrifid 12 May 2011 03:31:07PM *  10 points [-]

Well, as it is written, AMA (= Ask Me Anything)

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined. Why should I listen to you? Especially since, if you do start thinking coherently without discarding the absurd premise, it will lead you to do, and advocate, things that are potentially significantly detrimental to my goals.

To make it easier to answer we could rephrase the question in the third person: "Wedrifid believes fundamental premise X. Calcsam has a very different fundamental premise Y which gives him different goals and different conclusions. This being the case, how should wedrifid respond to behavioural exhortations given by calcsam on a rationalist blog? If wedrifid believed that all calcsam's reasoning was sound except that which produced belief Y, how would that change wedrifid's incentives?"

('Why should I listen to you?' is still the basic question. The above just gives background detail to how it is relevant.)

Comment author: XiXiDu 12 May 2011 04:13:46PM *  24 points [-]

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined. Why should I listen to you?

People who hold obviously incorrect beliefs can still be highly intelligent and productive:

  • Peter Duesberg (a professor of molecular and cell biology at the University of California, Berkeley) "claimed that AIDS is not caused by HIV, which made him so unpopular that his colleagues and others have — until recently — been ignoring his potentially breakthrough work on the causes of cancer."
  • Francisco J. Ayala, who "has been called the 'Renaissance Man of Evolutionary Biology'", is a geneticist ordained as a Dominican priest. His "discoveries have opened up new approaches to the prevention and treatment of diseases that affect hundreds of millions of individuals worldwide…"
  • Francis Collins, noted for his landmark discoveries of disease genes and his leadership of the Human Genome Project (HGP), and described by the Endocrine Society as "one of the most accomplished scientists of our time", is an evangelical Christian.
  • Georges Lemaître (a Belgian Roman Catholic priest) proposed what became known as the Big Bang theory of the origin of the Universe.
  • Kurt Gödel (logician, mathematician and philosopher) suffered from paranoia and believed in ghosts. "Gödel, by contrast, had a tendency toward paranoia. He believed in ghosts; he had a morbid dread of being poisoned by refrigerator gases; he refused to go out when certain distinguished mathematicians were in town, apparently out of concern that they might try to kill him."

There are many more examples. All of them are indeed outliers, and I don't think that calcsam has been able to prove that his achievements and general capability to think clearly in some fields outweigh the heavy burden of being religious. Yet there is evidence that such people do exist, and he offers you the chance to challenge him.

Generally I agree with you, but I also think that calcsam provides a fascinating example of the internal dichotomy of some human minds, and a case study that might provide insight into how the arguments employed by Less Wrong fail in some cases.

Comment author: shokwave 12 May 2011 04:53:59PM 15 points [-]

People who hold obviously incorrect beliefs can still be highly intelligent and productive:

And one of the concerns I detected in wedrifid's comment (one I share myself) is that if highly intelligent and productive people start doing what obviously incorrect beliefs indicate they should, the world is going to be optimised in a direction I won't like.

Comment author: Eliezer_Yudkowsky 14 May 2011 05:04:07AM 7 points [-]

I kind of think that's already happening. All over the place. All the time. What kind of policy implications did you want to draw from it in this particular instance?

Comment author: shokwave 14 May 2011 05:14:27AM 1 point [-]

Hmm, what policy...

No amount of clear thinking elsewhere can excuse you from being wrong about this one thing. To think so is to treat being right and wrong like a social game, where people with high status get a free pass on questions with actual answers.

Comment author: Eliezer_Yudkowsky 14 May 2011 05:25:17AM 2 points [-]

Could you please be more specific? What sort of action is being taken here as a result of your worry?

Comment author: shokwave 14 May 2011 02:31:39PM *  2 points [-]

Not voting for religious candidates for Australian Parliament elections.

Comment author: D_Alex 16 May 2011 03:39:11AM 1 point [-]

My inclination would be to discourage posts with undertones of religious propaganda on this site.

Comment author: wedrifid 13 May 2011 05:21:46AM *  0 points [-]

And one of the concerns I detected in wedrifid's comment (one I share myself) is that if highly intelligent and productive people start doing what obviously incorrect beliefs indicate they should, the world is going to be optimised in a direction I won't like.

Exactly! If beliefs like this are just used as verbal symbols for navigating the social world they do relatively minor harm. Once someone with the necessary intelligence, productivity, and otherwise rational thinking comes along and follows the belief to its logical conclusion, things start exploding. Or rationalist communities become modified in a direction that makes them either less pleasant or less effective than I would prefer.

Comment author: timtyler 12 May 2011 04:43:31PM 12 points [-]

I think lists of this kind should always include Donald E. Knuth.

Comment author: gwern 12 May 2011 05:39:55PM 4 points [-]

Maybe we should make a list on the wiki? E.g. I'm tempted to add Aumann, but as pointed out, 'There are many more examples' and XiXiDu made his point with the short list.

Comment author: gwern 13 May 2011 04:02:10PM *  0 points [-]

I made the list at http://wiki.lesswrong.com/wiki/Irrationalists

More suggestions welcome. I think I'm going to make a Discussion article on this to get a little more visibility.

Comment author: virtualAdept 12 May 2011 04:45:05PM 3 points [-]

I don't think that examples of people with fundamental, irrational beliefs being good at other things are relevant - calcsam has invited questions specifically about the belief whose rationality is being examined. If he were starting a discussion about mathematics and his points were dismissed due to his Mormon affiliation, your comment would make more sense to me.

Comment author: nhamann 12 May 2011 07:37:48PM 4 points [-]

Good reminder that reversed stupidity is not intelligence.

Adding to the list: Hans Berger invented the EEG while trying to investigate telepathy, which he was convinced was real. Even fools can make important discoveries.

Comment author: Clippy 13 May 2011 02:37:09PM 4 points [-]

But increasing one's foolishness does not increase the expected rate of discovery.

Comment author: Kutta 14 May 2011 09:13:06PM *  1 point [-]

I think though that holding crazy beliefs is Bayesian evidence for the hypothesis that a person is not a remarkable intellectual contributor to humanity. Wedrifid's "why should I listen to you?" is thus not addressed head-on by a list of crazy people who happened to achieve other worthy stuff.

Comment author: Pavitra 17 May 2011 02:13:36AM *  1 point [-]

If we had no other information about calcsam besides eir religious beliefs, and e were only one of many people potentially worth listening to, and we were processing those many in bulk to try to decide which of them to investigate more closely (at greater expense), then this would be a useful low-cost filter.

However, I don't think it's enough evidence to overcome the other things we do know about em: that e's posting on LW, that e's responding in a generally clear and intelligent manner, etc.

A policy of ignoring people who disagree with you seems like a good way to never notice that you're wrong. And you are wrong -- not necessarily about this particular question, but of all the things you believe there's pretty much guaranteed to be at least one false idea. I'd even go so far as to say that there's probably at least one very important wrong idea in there.

In my opinion, listening to people like calcsam -- intelligent people who disagree with me -- is one of the most plausible vectors for finding out that I'm wrong about something.

Comment author: jasonmcdowell 12 May 2011 09:27:52PM 6 points [-]

Yes. But the reason why we should listen to him is self-evident. He has written things that are valuable. If he maintains his interest in the community here, and the quality is good, he could be a value-multiplier. A catalyst. His writing here is the intersecting part of a Venn diagram, his interests overlapping with Less Wrong.

His allusions to his missionary work are provoking an immune response from many here, including me (not that I write much). I think this is why (from a quote thread):

What frightens us most in a madman is his sane conversation. --Anatole France

Comment author: wedrifid 12 May 2011 10:28:50PM 1 point [-]

His allusions to his missionary work are provoking an immune response from many here, including me (not that I write much). I think this is why (from a quote thread):

I have not been particularly bothered by the missionary allusions but obviously don't consider the posts nearly as valuable as you do. There is an undesirable emphasis on norms and a constant pressure to move things in the direction of 'making the group do set projects' and 'consensus'. This isn't an organisation, it's a blog.

Comment author: Eliezer_Yudkowsky 14 May 2011 05:06:39AM 10 points [-]

This isn't an organisation, it's a blog.

Some of us would like a %$^&ing organization, pardon my French.

Comment author: wedrifid 14 May 2011 06:49:33AM *  7 points [-]

Some of us would like a %$^&ing organization, pardon my French.

You have one.

Injecting LW with a pint of blood from a religious Behemoth will not give you another organisation, charged up with the power of divine effectiveness. It'll cause an autoimmune disease, doing serious neurological damage and causing externally visible disfigurement (unnecessarily cultish vibe), scaring healthy potential recruits away.

If you want to actually enhance the potential practical effectiveness of LW and LW spinoff communities, instead take the quickening of an entrepreneur. Or at the very least track down and feast on the essence of a successful business professional and an economist or two.

Food for Thought: Holy Books usually don't get implemented at all. Which is usually a good thing. What mainstream religious authorities do when 'implementing Holy Books' is something quite different from implementing holy books - and not something that is necessarily desirable to emulate.

Comment author: jsalvatier 12 May 2011 03:36:44PM 7 points [-]

Too adversarial.

Comment author: wedrifid 12 May 2011 03:46:53PM 18 points [-]

Too adversarial.

No, and I take a mild degree of offence at the accusation. This is 'Ask Me Anything' taken literally; it is exactly what the 'elephant in the room' is. I am being frank, not adversarial, and given calcsam's experiences and the emotional resilience he would have needed to develop while evangelizing, I know I don't have to tiptoe through a minefield to protect his feelings.

If I am obliged to maintain a social facade even in a thread specifically created for asking this question, then the only real recourse I would have is to do whatever is appropriate to eliminate the necessity for me to speak bullshit (or act in a misleading way that is analogous to bullshit).

Comment author: jsalvatier 12 May 2011 04:11:09PM *  13 points [-]

I do not object to the subject of your question, but the way you put it. I think this

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined.

is what I was reacting to.

Presumably, he disputes that, so for the purposes of your conversation it is not 'clear'. Phrasing this same sentiment as 'I do not believe you are capable of thinking rationally ..., and you will have to convince me otherwise before I listen to you' or something along those lines would be a less adversarial way of asking this question. For example, I think Costanza asks roughly the same question below in a frank way.

Comment author: Clippy 12 May 2011 04:37:40PM *  15 points [-]

I do not object to the subject of your question, but the way you put it.

I differ in that I do object to the subject of User:wedrifid's question, in particular, the part you just excerpted.

If being B1 refuses to update to being B2's beliefs on account of B2 being stupid, and this judgment of B2's stupidity, in turn, is solely based on B2 satisfying B1 =/= B2, then B1 is "begging the question" (assuming the conclusion in order to prove it).

There are very good arguments to reject religious beliefs; however, when one uses the argument that an exponent of one of them is stupid because they so believe and therefore must not be worth listening to, then one has desensitized one's worldmodel to evidence, locking in any errors one currently subscribes to -- and this remains true even if B2 is pure error.

No belief system or decision theory can be judged solely relative to itself; otherwise, it would be impossible to change one's beliefs or decision theory. Because the fact that one possesses a belief system is not definitive evidence of its truth, any belief system must permit situations in which it would update, or else it will indefinitely reproduce the same errors under reflection.

User:wedrifid makes the error in this statement, no matter how well its phrasing is changed to accord with human customs and status systems:

You are clearly not capable of thinking rationally with respect to a fundamental belief where evidence makes the question overdetermined.

Comment author: wedrifid 12 May 2011 05:57:27PM -1 points [-]

Phrasing this same sentiment as 'I do not believe you are capable of thinking rationally ..., and you will have to convince me otherwise before I listen to you'

Ironically, those suggestions convey a worse picture of the opening poster and declare a stricter requirement for what it would take for me to listen. My observation clearly indicated, both in the quote you made and in my following paragraph, that the flawed thinking is with respect to the religious belief. Further, I don't think (and didn't suggest) that the OP would need to convince me of a specific kind of rational thinking in order to be worth listening to. Instead I gave him a platform from which to enumerate reasons. The best of those reasons would actually speak of potential instrumental value and not epistemic awesomeness.

Adding "I do not believe" before a statement is actually just redundant, a kind of false humility. Eliezer actually wrote a post that touched on this specifically; does anyone recall the reference?

Comment author: thomblake 12 May 2011 11:38:35PM 0 points [-]

You could be thinking of Qualitatively Confused - though that post is mostly about how 'believe' is not quite redundant.

Comment author: Dorikka 12 May 2011 06:06:29PM 1 point [-]

For the sake the question you could answer as though it is something like "given that wedrifid believes X thing that I don't believe how should he behave?"

I completely failed to parse this sentence (and so didn't really understand the next one either.) Could you try phrasing it another way and/or correcting typos, if they're in there?

Comment author: wedrifid 12 May 2011 06:29:33PM 1 point [-]

I completely failed to parse this sentence (and so didn't really understand the next one either.) Could you try phrasing it another way and/or correcting typos, if they're in there?

I edited the paragraph. The meaning is approximately the same but far clearer.