"Even if I had an objective proof that you don't find it unpleasant when you stick your hand in a fire, I still think you’d pull your hand out at the first opportunity."
        -- John K Clark

"So often when one level of delusion goes away, another one more subtle comes in its place."
        -- Rational Buddhist

"Your denial of the importance of objectivity amounts to announcing your intention to lie to us. No-one should believe anything you say."
        -- John McCarthy

"How exactly does one 'alter reality'?  If I eat an apple have I altered reality?  Or maybe you mean to just give the appearance of altering reality."
        -- JoeDad

"Promoting less than maximally accurate beliefs is an act of sabotage.   Don't do it to anyone unless you'd also slash their tires."
        -- Black Belt Bayesian


Those quotes seem rather weak to me. Especially the last one. Armchair psychology: you're worried about your own propensity toward irrationality, so you seek to master it by focusing on irrationality external to you, as if by seeking to wipe it out there. Kind of analogous to evangelical Christianity. I'm not sure rational heroes and irrational villains in a morality play are as valuable to us in trying to build our best models of the world, including models of various irrationalities as natural phenomena. Whether we should expend effort to convince people not to engage in various irrationalities is an empirical question, and maybe one that has a different answer in each instance.

Well, maybe we should slash some tires sometimes. BBB left the possibility open.

... neither beliefs nor acts of belief, nor decisions, nor even preferences, are reasonable or rational except in the sense that they are reached by procedures or methods that are reasonable or rational. (The phrase 'rational belief' is rather like the phrase 'fast food'.)
        -- David Miller, 'Induction: Problem Solved', in Out of Error (Ashgate, 2005)

The rational process? Brain electrodynamics... as solid as Maxwell's equations. Two different scientists can be exposed to the same natural world. One hypothesises “It is TRUE that A is a property of the natural world” and the other hypothesises “It is FALSE that A is a property of the natural world”. Both have arrived at this state by the same basic set of equations and by different brain-dynamics trajectories resulting from their individuated selfhood, life history and current circumstances. Both are ‘rational’ because their brains are healthy, not because their propositions are true. End of story.

slashes TGGP's tires

Seconding Hopefully Anonymous on this point. I'd also emphasize that, not infrequently, when the accuracy of the beliefs a nerd can promote is low due to inferential distance or to gaps in intelligence, nerds tend to make true statements without bridging the inferential distance, predictably promoting less than maximally accurate beliefs. They are motivated, I would say, primarily by the self-righteous feeling they get from telling the truth and by the superior righteous indignation they feel when their perceived inferiors show ignorance by not understanding.

If you are an atheist and a Christian wants to know if you believe in God, there IS NO Bayesian-level honest answer; there never is when dealing with people who aren't at least ancient-art rationalists. The most honest answer is "Yes", which will be interpreted as something like "I believe in not eating babies" or "I believe in expressing my loyalty towards people and groups I care about" or the like.

Michael, for what fraction of Christians - particularly the intelligent Christians an OB reader is likely to interact with - do you think that's the case? I have NEVER known someone to think I must be evil or disloyal because I don't believe. Agreed with the general principle and the first paragraph, though, and this might be a trivial nitpick depending on what "or the like" was meant to include.

Michael, well-articulated. BTW I encourage you to start up your blog again.

Nick: Do you imagine that they would tell you so? Also, you a) are young, and b) haven't been in any setting where people come from a large variety of social backgrounds.

Highly intelligent Christians - Dyson, for instance - are likely to believe roughly the same things you do but frame them differently: Tegmark 4 = Spinoza's god, for example.

Hopefully: You and many others. I will if I ever pull together the emotional resources to do so, the demand for which seems unusually high for me. It takes a great deal of effort for me to address a large group, most of whom will fail to "get it" whatever I do.

Telling a Christian "I believe in God" to express the idea that "I disfavor eating babies" seems likely to enshroud them in false beliefs.

If they're never forced to recognize that a conflict exists between their various world-beliefs, they'll never update them. Conflict is necessary.

Telling them a convenient lie that doesn't challenge their worldview does not seem to me to be an act of someone who cares for their welfare.

Michael: Good points, but we must mean different things by "highly intelligent," as I've known several traditionally-believing, even Evangelical, Christians who seem to be that (including close friends; I could probably pick up on it if they thought I was evil). Also, I second Caledonian, though of course in day-to-day interaction it's often not worth trying to straighten out ignorance. Add me to the list of people who want you to start blogging again - and you could always make it viewable by invitation only.

I would say that, in my experience, evangelicals and traditional Christians max out in the low 140s, IQ-wise.

John:

"Your denial of the importance of objectivity amounts to announcing your intention to lie to us. No-one should believe anything you say." -- John McCarthy

Gee--I wish I had this one when I was taking that anthropology professor's class!

John:

"Promoting less than maximally accurate beliefs is an act of sabotage. Don't do it to anyone unless you'd also slash their tires."

I disagree with this one. If you scrupulously include every disclaimer and caveat, you'll be too boring for anyone to pay attention to. It's better to be pragmatic. Giving someone an improved but still not maximally-accurate belief is still an improvement.

I propose that the author of this quote is placing a moral value on people possessing maximally accurate beliefs. If so, the author's moral system is incompatible with Standard Utilitarianism, is it not?