
Belief in Self-Deception

Post author: Eliezer_Yudkowsky 05 March 2009 03:20PM

Continuation of: No, Really, I've Deceived Myself
Followup to: Dark Side Epistemology

I spoke yesterday of my conversation with a nominally Orthodox Jewish woman who vigorously defended the assertion that she believed in God, while seeming not to actually believe in God at all.

While I was questioning her about the benefits that she thought came from believing in God, I introduced the Litany of Tarski—which is actually an infinite family of litanies, a specific example being:

  If the sky is blue
      I desire to believe "the sky is blue"
  If the sky is not blue
      I desire to believe "the sky is not blue".

"This is not my philosophy," she said to me.

"I didn't think it was," I replied to her.  "I'm just asking—assuming that God does not exist, and this is known, then should you still believe in God?"

She hesitated.  She seemed to really be trying to think about it, which surprised me.

"So it's a counterfactual question..." she said slowly.

I thought at the time that she was having difficulty allowing herself to visualize the world where God does not exist, because of her attachment to a God-containing world.

Now, however, I suspect she was having difficulty visualizing a contrast between the way the world would look if God existed or did not exist, because all her thoughts were about her belief in God, but her causal network modelling the world did not contain God as a node.  So she could easily answer "How would the world look different if I didn't believe in God?", but not "How would the world look different if there was no God?"

She didn't answer that question, at the time.  But she did produce a counterexample to the Litany of Tarski:

She said, "I believe that people are nicer than they really are."

I tried to explain that if you say, "People are bad," that means you believe people are bad, and if you say, "I believe people are nice", that means you believe you believe people are nice.  So saying "People are bad and I believe people are nice" means you believe people are bad but you believe you believe people are nice.

I quoted to her:

  "If there were a verb meaning 'to believe falsely', it would not have any
  significant first person, present indicative."
          —Ludwig Wittgenstein

She said, smiling, "Yes, I believe people are nicer than, in fact, they are.  I just thought I should put it that way for you."

  "I reckon Granny ought to have a good look at you, Walter," said Nanny.  "I reckon
  your mind's all tangled up like a ball of string what's been dropped."
          —Terry Pratchett, Maskerade

And I can type out the words, "Well, I guess she didn't believe that her reasoning ought to be consistent under reflection," but I'm still having trouble coming to grips with it.

I can see the pattern in the words coming out of her lips, but I can't understand the mind behind them on an empathic level.  I can imagine myself into the shoes of baby-eating aliens and the Lady 3rd Kiritsugu, but I cannot imagine what it is like to be her.  Or maybe I just don't want to?

This is why intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists.  After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.

(One of the other few moments that gave her pause—I mention this, in case you have occasion to use it—is when she was talking about how it's good to believe that someone cares whether you do right or wrong—not, of course, talking about how there actually is a God who cares whether you do right or wrong, this proposition is not part of her religion—

And I said, "But I care whether you do right or wrong.  So what you're saying is that this isn't enough, and you also need to believe in something above humanity that cares whether you do right or wrong."  So that stopped her, for a bit, because of course she'd never thought of it in those terms before.  Just a standard application of the nonstandard toolbox.)

Later on, at one point, I was asking her if it would be good to do anything differently if there definitely was no God, and this time, she answered, "No."

"So," I said incredulously, "if God exists or doesn't exist, that has absolutely no effect on how it would be good for people to think or act?  I think even a rabbi would look a little askance at that."

Her religion seems to now consist entirely of the worship of worship.  As the true believers of older times might have believed that an all-seeing father would save them, she now believes that belief in God will save her.

After she said "I believe people are nicer than they are," I asked, "So, are you consistently surprised when people undershoot your expectations?"  There was a long silence, and then, slowly:  "Well... am I surprised when people... undershoot my expectations?"

I didn't understand this pause at the time.  I'd intended it to suggest that if she was constantly disappointed by reality, then this was a downside of believing falsely.   But she seemed, instead, to be taken aback at the implications of not being surprised.

I now realize that the whole essence of her philosophy was her belief that she had deceived herself, and the possibility that her estimates of other people were actually accurate, threatened the Dark Side Epistemology that she had built around beliefs such as "I benefit from believing people are nicer than they actually are."

She has taken the old idol off its throne, and replaced it with an explicit worship of the Dark Side Epistemology that was once invented to defend the idol; she worships her own attempt at self-deception.  The attempt failed, but she is honestly unaware of this.

And so humanity's token guardians of sanity (motto: "pooping your deranged little party since Epicurus") must now fight the active worship of self-deception—the worship of the supposed benefits of faith, in place of God.

This actually explains a fact about myself that I didn't really understand earlier—the reason why I'm annoyed when people talk as if self-deception is easy, and why I write entire blog posts arguing that making a deliberate choice to believe the sky is green is harder to get away with than people seem to think.

It's because—while you can't just choose to believe the sky is green—if you don't realize this fact, then you actually can fool yourself into believing that you've successfully deceived yourself.

And since you then sincerely expect to receive the benefits that you think come from self-deception, you get the same sort of placebo benefit that would actually come from a successful self-deception.

So by going around explaining how hard self-deception is, I'm actually taking direct aim at the placebo benefits that people get from believing that they've deceived themselves, and targeting the new sort of religion that worships only the worship of God.

Will this battle, I wonder, generate a new list of reasons why, not belief, but belief in belief, is itself a good thing?  Why people derive great benefits from worshipping their worship?  Will we have to do this over again with belief in belief in belief and worship of worship of worship?  Or will intelligent theists finally just give up on that line of argument?

I wish I could believe that no one could possibly believe in belief in belief in belief, but the Zombie World argument in philosophy has gotten even more tangled than this and its proponents still haven't abandoned it.

I await the eager defenses of belief in belief in the comments, but I wonder if anyone would care to jump ahead of the game and defend belief in belief in belief?  Might as well go ahead and get it over with.

 

Part of the Against Doublethink subsequence of How To Actually Change Your Mind

Next post: "Moore's Paradox"

Previous post: "No, Really, I've Deceived Myself"

Comments (103)

Comment author: Tyrrell_McAllister 05 March 2009 10:29:00PM 29 points

I don't know how well you know this person, so my advice may be unnecessary. But your post gives me the impression that you need to be much more careful about speculating on how her mind works. I think that it's a red flag when you write first that

I can see the pattern in the words coming out of her lips, but I can't understand the mind behind them on an empathic level. I can imagine myself into the shoes of baby-eating aliens and the Lady 3rd Kiritsugu, but I cannot imagine what it is like to be her.

. . . and then proceed to make apparently confident declarations about how her mind works, such as

I now realize that the whole essence of her philosophy was her belief that she had deceived herself, and the possibility that her estimates of other people were actually accurate, threatened the Dark Side Epistemology that she had built around beliefs such as "I benefit from believing people are nicer than they actually are."

She has taken the old idol off its throne, and replaced it with an explicit worship of the Dark Side Epistemology that was once invented to defend the idol; she worships her own attempt at self-deception. The attempt failed, but she is honestly unaware of this.

As you yourself have observed, we largely understand other people by taking a portion of our own black-box mind, plugging in a few explicit settings (such as beliefs or experiences), letting the model run for a bit, and seeing what pops out. In particular, to understand how another person makes judgments, we collect their evinced beliefs, try to twiddle some dials until our model expresses the same beliefs, and then let it run for a bit. We then try to peer into the model as best we can, getting as good a picture of its inner workings as introspection allows us. We then take this picture as our hypothesis about how the other person thinks.

But the first quote above is strong evidence that your mind works differently from hers in some highly relevant respects. Therefore, you should be highly skeptical that what is going on in her mind resembles what it took to make the model of her in your own mind match her utterances. But you give me the impression that you haven't been sufficiently skeptical of the match between her mind and your model of it. I think that this has led you astray on several points.

For example, based on what you've written, I don't think that you're using the right model to understand what was going on in her mind when she said, "I believe that people are nicer than they really are." You were led to this confusion because she was not using the word "believe" in the way that you, and your model of her, do. You are using "belief" to mean a feature of a model of how the world is. But that, I expect, is not what she meant. Thus, your remarks here --

I tried to explain that if you say, "People are bad," that means you believe people are bad, and if you say, "I believe people are nice", that means you believe you believe people are nice. So saying "People are bad and I believe people are nice" means you believe people are bad but you believe you believe people are nice.

-- were irrelevant because they do not apply to the sense of the word "believe" that she was using.

For what it's worth, in my model of her, when she said "I believe that people are nicer than they really are," she meant, "When I reflect on my emotional attitude towards people, I see that this attitude is of the sort that, in the absence of its actual cause, could have been caused by a falsely high belief (in your sense) about peoples' niceness."

The actual cause for her emotional attitude is perhaps her "religion". Or perhaps it is something else. Perhaps she has no idea what the actual cause is, or perhaps she thinks she does, but she doesn't really. But none of this implies that she was attributing to herself the belief that people are nicer than she actually believes them to be (where, here, I'm using "belief" in your sense.)

Her utterance seems analogous to someone who walks out of an optometrist's office after having his pupils dilated and says, "Because of those drops the optometrist gave me, I believe the sun is brighter than it really is." If we heard this, we shouldn't conclude that he believes something contradictory, or that he has incorrect beliefs about his beliefs. His word "belief" in this case probably does not mean "best guess about how things really are." Rather, it's a clumsy way to say that some qualities of his experience of the world are as if he had a certain belief (in the sense normally understood). He does not mean to imply that he has any wrong beliefs (in the conventional sense). It would be a mistake to say that his subjective experience of the light is in any way erroneous. After all, it accurately reflects the fact that he had those drops put in his eyes.

Similarly, your interlocutor's statement that she "believes" that people are nicer than they really are referred to a particular quality of her emotional attitude towards them, not to a belief (in your sense) about how they are. In particular, it didn't imply any expectation about how they would behave. That, I expect, is why she was initially taken aback when you asked, "So, are you consistently surprised when people undershoot your expectations?" The problem wasn't, as you appear to think, that she had prevented her own mind from drawing obvious conclusions. The problem was that you (because of her confusing wording) were speaking of her so-called "belief" as though it were a belief in the normal sense, something that should lead to certain expectations about other peoples' actions. But I expect that it wasn't any such thing, notwithstanding her unfortunate choice of words.

Comment author: Eliezer_Yudkowsky 06 March 2009 12:02:26AM 3 points

For what it's worth, in my model of her, when she said "I believe that people are nicer than they really are," she meant, "When I reflect on my emotional attitude towards people, I see that this attitude is of the sort that, in the absence of its actual cause, could have been caused by a falsely high belief (in your sense) about peoples' niceness."

An interesting hypothesis, Tyrrell; but she explicitly explained to me about how, if you think people are nicer than they really are, then this makes you happier.

Comment author: Tyrrell_McAllister 06 March 2009 09:15:27AM 4 points

You're right to call it a mere hypothesis. I hope that I made its tentative nature clear.

But that explanation of hers seems to me to be consistent with my hypothesis. No surprise, because it was part of the data that I was trying to fit when I constructed it.

I would be curious to know more about how she responded when you asked her, "So, are you consistently surprised when people undershoot your expectations?" Did she have anything more to say after repeating the question?

Comment author: Yasser_Elassal 06 March 2009 07:52:47PM 13 points

My hypothesis is that she simply meant, "It makes me happy to pretend that people are nicer than they really are."

Comment author: billswift 06 March 2009 12:34:13AM 2 points

I think the first time Eliezer said he couldn't get into her mind was that he couldn't understand the psychological state she needed to be in to make that statement. The second time - where he was writing about what she believed - he was discussing her apparent epistemological state.

There are significant differences between the two for observers. I can almost never understand someone else's psychological state, but I can often figure out what they are talking about and how they got there epistemologically - that is, what could have caused their stated beliefs.

Comment author: DimitriK 16 November 2014 06:05:27PM 0 points

When I read "I believe people are nicer than they really are" I got the impression her meaning was along the lines of "people are nicer inside than their actions suggest". On reflection, this might be because that's what I believe. It ties in to the fundamental attribution error. People's actions are based so much on environment and circumstance that if you had a way to truly look into a person, I think you'd see a better person than you would have guessed if you only looked at their actions. Most people don't see themselves as evil. They do things we see as evil, but in their heads they are doing what they think is good.

I'd be interested in hearing what exactly she said that brought on your analysis, Eliezer. I realise it was a long time ago, and I'm not likely to get a reply anyway, but it seems likely to me her statement came from an intuitive belief in the fundamental attribution error. I know I held that belief long before I first encountered it in HPMoR, so it's possible for her too.

Comment author: Tyrrell_McAllister 17 November 2014 12:42:52AM 0 points

I think that there's a better chance that he'll see your comment if you reply directly to the post rather than to another comment. At least, I think that that's how it works.

Comment author: findis 29 December 2012 07:29:34PM 9 points

I await the eager defenses of belief in belief in the comments, but I wonder if anyone would care to jump ahead of the game and defend belief in belief in belief? Might as well go ahead and get it over with.

My boyfriend was once feeling a bit tired and unmotivated for a few months (probably mild depression), and he also wanted to stop eating dairy for ethical reasons. He felt that his illness was partly mentally generated. He decided that he was allergic to dairy, and that dairy was causing his illness. Then he stopped eating dairy and felt better!

He told me all this, and also told me that he usually believes he is actually allergic to dairy, and it is hard to remember that he is not. When someone asks how he knows he is allergic to dairy, he says something plausible and false ("The doctor ran blood tests") and believes it if he doesn't stop and think too much.

He believes he is not allergic to dairy, but he believes he believes he is allergic to dairy? Belief-in-belief. But he recognizes this and explained it to me -- so that's a belief-in-belief-in-belief? But it helped him get over his mental illness and stop eating dairy... that's winning.

In general I would say a belief-in-belief is useful if you decide some behaviors are desirable, but some false model of the world better motivates you to behave properly. Belief-in-belief-in-belief is useful if you know too much to think both "Z is true" and "I believe not-Z". Then you tell yourself you have a belief-in-belief.

Disclaimer: This is weird to me and I don't really understand how he pulls it off.

Comment author: kilobug 15 September 2011 01:24:12PM 9 points

"I believe that people are nicer than they really are." That part made me ponder. Because, actually, it's something I believe, too. So I froze for a while, and looked at that belief. Do I have Escher loops in my belief networks? Well, maybe, I'm far from being a perfect Bayesian, but I can't allow myself to stop here.

My first justification for that thought was: I don't refer to the same thing in the two parts of the sentence. A bit like how "sound" can refer to acoustic vibrations, or to a perception, and if you switch from one to the other in the same sentence, you can make a sentence that seems self-contradicting but is still valid.

People are a vast group. Nice or not is a characteristic of a person. So, to attribute "niceness" to people in general, you have to make an aggregate value. There are many ways to make an aggregate value, for example, mean and median. So that sentence could mean something like "I believe the median people to be nicer than the average people" (implying a minority of very un-nice people who drive the average down, but don't change the median).

But then I thought "Hey, stop. You're trying to find excuses here. That's not really what you meant with that sentence, or you would have said it clearly. Don't find yourself excuses, just face the fact you were doing knots with your believes."

So I tried to dig into where this knot could come from. And I think I found it, and it's linked to the first excuse, but not as simple. The problem comes from the fact that I use different algorithms to evaluate the niceness of a single person I'm interacting with (be it a friend or family member, or just a passerby asking me "what time is it?") and to evaluate the overall niceness of "people" in general.

When I evaluate the niceness of people in general, I think about the horrors of history, about the Milgram experiment, about the crimes the news love to report, the scary statistics about the number of husbands who hit their wives... And also about the "heroes", those who risked their lives to hide unknown Jews during WW2, those who ran into a burning house to save their neighbor. That gives me a mitigated image of people in general, neither very nice nor very un-nice, capable of the best and of the worst.

When I evaluate niceness in one single person interacting with me, I tend more to recall my own interactions with individual persons. And in those, I have a few bad memories (like being assaulted once for my money and cell phone) but mostly good ones; be it by luck or by selective memory, most of the interactions with others I can remember were mostly positive. So when I interact with a new individual person, I assume there is a huge chance of that person being "nice", even if I have a more mitigated view of humans in general.

That probably comes from deeper, evolutionary psychology reasons: the individuals you interact with are your tribe, they are friendly. People in general are other tribes, not so friendly. But I'm not well versed enough in evolutionary psychology to go further down that line. But anyway that's, I think, where the contradiction comes from. It may be partly justified by the fact that the median is higher than the average, if it really is (which I've no factual evidence of, only a vague feeling). But it mostly comes from just using two different algorithms, which should, in a well-calibrated brain, lead to the same result, but which for many reasons (all the biases, imperfect knowledge, ...) just don't.
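The median-versus-mean distinction above can be made concrete with a toy sketch (the niceness scores here are invented for illustration, not data): a small minority of very un-nice people drags the mean down while leaving the median untouched.

```python
# Hypothetical "niceness" scores for ten people: most are fairly nice,
# a small minority is very un-nice and drags the mean down.
scores = [7, 7, 7, 7, 8, 8, 8, 8, 0, 0]

# Mean: sensitive to the two outliers at 0.
mean = sum(scores) / len(scores)  # 6.0

# Median: the middle of the sorted scores, unaffected by the outliers.
ordered = sorted(scores)
n = len(ordered)
median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2  # 7.0

print(f"mean = {mean}, median = {median}")  # median exceeds the mean
```

So "the median people are nicer than the average people" is a coherent claim whenever the distribution of niceness is skewed downward by a few extreme cases.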

Comment author: Oscar_Cunningham 15 September 2011 02:10:05PM 0 points

"I believe the median people to be nicer than the average people"

But if you'd actually meant this you'd have just said "The median people are nicer than the average people". Saying "I believe the median people to be nicer than the average people" would indicate that you didn't believe it but did believe you believed it.

Comment author: wedrifid 15 September 2011 03:55:17PM 8 points

But if you'd actually meant this you'd have just said "The median people are nicer than the average people". Saying "I believe the median people to be nicer than the average people" would indicate that you didn't believe it but did believe you believed it.

I don't quite agree there. Saying "I believe the median people to be nicer than the average people" indicates that you believe that you believe it but it doesn't indicate that you don't actually believe it. You could say it is neutral with respect to whether or not you actually believe it but not that it indicates outright that you don't.

Comment author: Oscar_Cunningham 15 September 2011 04:30:29PM 0 points

Indeed, but it does hint that you don't actually believe it, otherwise you would have said the simpler thing.

Comment author: thomblake 15 September 2011 04:38:41PM 8 points

I disagree. In general, saying "I believe x" is evidence that you believe x, and therefore cannot be evidence that you do not believe x. I would be interested to see evidence that people usually use "I believe x" in such a way that it can be taken as evidence that one does not believe x.

I believe that people usually use "I believe x" instead of "x" in cases where they want to stress the possibility, however small, that they are wrong. Usual caveats for religious and "I believe in" statements, as well as unrelated senses of 'believe', apply.

Comment author: kilobug 15 September 2011 05:45:03PM 1 point

Yes, that distinction definitely applies to me. Usually when I say "X" it means "I believe X with almost certainty", and when I say "I believe X" it indicates that there is some doubt still, maybe a 90% confidence, but not a 99% confidence.

But in that specific case, as Misha said, I didn't need to actually believe it - it was a belief in belief in my chain of thoughts, an attempt to rationalize the initial mistake, which appeared, on further analysis, not to be its real cause. Having this as a real belief or not wouldn't change the reasoning.

Comment author: [deleted] 15 September 2011 02:58:36PM 1 point

And this is, in fact, part of kilobug's point.

But then I thought "Hey, stop. You're trying to find excuses here. That's not really what you meant with that sentence, or you would have said it clearly. Don't find yourself excuses, just face the fact you were doing knots with your believes."

(while we're on the subject, the plural of belief is "beliefs", contrary to all reason)

Comment author: pure-awesome 09 June 2012 11:10:44AM 8 points

"I wish I could believe that no one could possibly believe in belief in belief in belief..."

You wish you could believe, Eliezer? Is this a deliberate stroke of irony, or a subconscious hint at the fact that you do have an empathic understanding of the thought processes behind tailoring your own beliefs?

Comment author: hannahelisabeth 12 November 2012 09:49:56AM 4 points

I think the idea behind this is that he wishes reality played out in such a way that, to a rational observer, it would engender belief. It's a roundabout way of saying "I wish reality were such that..."

Comment author: Psy-Kosh 05 March 2009 10:28:46PM 7 points

Hrm... While on the one hand I can look at her position and basically react with a "your mind is entirely alien to me", on the other hand, I can actually imagine being in that state.

That does NOT mean, of course, that it is a reasonable state to be in, but it does seem to be the sort of state that my mind can support.

I guess the basic key is that human minds aren't necessarily naturally consistent. So we can end up in actual inconsistent states. Including states a bit confused about consistency itself.

A bit more of a personal example would be a state I sometimes recall having been in in the past, and have certainly seen in others: when one might say something like, oh, I dunno, "scientifically, the universe is about 13.7 billion years old and earth is about 4.5 billion years old, and of course, the world was created about 6000 years ago."

As near as I can tell, what happens is that we almost imagine the "scientific world" and the "religious world" as parallel universes that... are actually the same one, so mentally we keep track of it by keeping track of different things.

The way this works is that someone might manage to end up in a state where they completely fail to really face the question of "okay, but if you rewind time a bit, will you see the universe poofing into existence 6000 years ago, or can you go farther back, etc? ie, what ACTUALLY happened in ACTUAL REALITY?"

Then, when facing that question, all sorts of Escher mentality stuff starts forming as a defense. But what I think initially happens, at least in part, is sort of mentally tracking those as being about different subjects, rather than contradictory statements about the same thing. So that one will end up, with "science glasses", visualizing prehistoric humans doing stuff tens of thousands of years ago, while etc etc etc...

At least, that's my own, partly introspective model of what's going on here, of how people can end up in these states.

Comment author: Eliezer_Yudkowsky 06 March 2009 12:05:03AM 5 points

I think that people who had actual mental models of the world would notice a contradiction that large.

People who profess two different beliefs may not see a contradiction. It's just good to profess one, and also good to profess the other, for different reasons. They aren't visualizing a world that, at one time or another, needs to either poof or go on. They're visualizing that "science" and "religion" both seem like good groups to join.

Comment author: Psy-Kosh 06 March 2009 01:13:03AM 8 points

I think that may be part of it, but I'm also thinking back a bit to when I was more religious, and so on, and also thinking about how some people I know seem to talk, and as near as I can tell, there really does seem to be a bit of that.

I'm claiming they're visualizing a world that goes "poof, 'LET THERE BE LIGHT!'", AND visualizing a world that goes farther back, and somehow doing some form of funny doublethink, thinking of those as different worlds that are both in some sense true—treating them not as contradictory models, but almost as, well, different worlds. ie, two different "truths" ("but what is truth?" :))

That is, simply holding the contradiction in place, having two "models"—not along the lines of two competing models, but ones that (though they don't actually notice it) they're imagining more as parallel worlds, where, depending on circumstances, they'll consider either one or the other to be "this world".

They would (usually, see somewhat below) not ever actually say, or even notice, that they're thinking that way. In other words, I'd expect that if you asked such a person something like "do you believe in a set of parallel realities, one in which the world was spoken into existence ~6000 years ago, and another about 13.7 billion years old or at least certainly older than 6000 years", they'll probably give you funny looks. But I think, without them noticing, something like that is going on in how it's being stored.

And I can speak from personal experience about some of the REALLY weird stuff I used to think in terms of, so it's in part a "pay no attention to the contradiction behind the curtain" situation.

Heck, sometimes when I bring various contradictions up, I'll get responses like "this isn't a debate class" or "this isn't a court room and you're not a lawyer", and basically have it laughed off like that from some family members. (and, of course, the infamous "in your opinion" fully general retort to any position you don't like. :))

I'm not saying this is all of it, but it sure seems to me that something like this is going on in some cases. It may also be what underlies stuff like "I believe people are nicer than they are". That is, statements like that may partly cash out to "I have a couple different models of people, one of which says they're nicer than the other. I hold both of these at the same time, but I call one my belief, and one the actual situation"

At least, when I try to imagine being in a mental state that could provoke me to utter such a statement, ie, when I try to simulate that state on myself, that seems to be what the result "looks like."

Oh, that bit from earlier, well... sometimes it's made a bit explicit.

I've come across some bits of occult philosophy that basically talks about how there can be many histories that are "true" (no, not in the sense a physicist might talk about interference), and they'll explicitly say stuff like the "there's the actual historical history, but that's not the only 'true' one.."

But also just from introspection, well, it does feel to me that in the past I would be in such a state, have multiple models that I wasn't so much treating as competing so much as treating as, well, simply true, in different senses.

The Escher mental tangle can get REALLY strange. :)

Comment author: Swimmer963 22 February 2011 04:25:03PM 2 points [-]

From what I've seen, fundamentalist Christians (this is the only group I've had a chance to speak to) often see the contradiction, and are PROUD of their ability to believe on 'pure faith' despite it. As if it's some kind of accomplishment to say 'wow, god is so powerful that he can even overcome THAT'. I don't know how far they carry through in creating mental models of the world, but I know that their expectations of a world with God in it are VERY different from a world without god, i.e. the node is included in their models. This is a particular religious group where receiving "prophetic words" and visions is common, and the people I knew based their expectations on what "God" said to them in these visions. And were sometimes sorely disappointed, but their 'faith' never seemed to be affected. At the start, they seemed as alien to me as the woman you're describing seemed to you. After befriending some people in this group, I started to understand the geometry of their minds a little bit more. This was nearly a year ago, though, so I have trouble explaining the insights I've had because my mind has gone back to 'how could anybody be that STUPID?'

Comment author: [deleted] 13 August 2012 01:32:23AM 5 points [-]

I am so much a one-level person that my sense of social insincerity has atrophied.

Rational straight man syndrome. So much a truth-finder you forget how to not speak the truth.

Comment author: algekalipso 02 May 2013 07:45:55AM 1 point [-]

This seems to be associated with higher than average testosterone levels. If you inject testosterone into a random man he will be very prone to not lie and to be overly straightforward.

Comment author: [deleted] 03 May 2013 11:47:18AM 2 points [-]

Sources?

Maybe I should get a blood test.

Comment author: algekalipso 13 May 2013 03:14:47AM 3 points [-]
Comment author: tondwalkar 23 May 2013 02:23:18AM 1 point [-]

I suffer the same symptom. (and have an excessive amount of body hair, not that that's more than negligibly indicative of high testosterone levels)

What's the cheapest/easiest way to get tested? (more out of curiosity than anything else)

Comment author: [deleted] 24 May 2013 12:20:38AM 1 point [-]

If I understand it correctly, you can go to your physician and ask for it. The test itself is quick, requires a blood sample, and I don't think it is very expensive.

Comment author: RobinHanson 05 March 2009 04:29:59PM 13 points [-]

Consistent consciously intended self-deception may be hard. But our minds are designed to produce self-deceptions all the time without us noticing. Just don't look behind the curtain and "let it be", "go with the flow" etc. and you can be as self-deceived as most folks.

Comment author: Marcello 05 March 2009 06:23:36PM *  12 points [-]

If I had been talking to the person you were talking to, I might have said something like this:

<del>Why are you deceiving yourself into believing Orthodox Judaism as opposed to something else? If you, in fact, are deriving a benefit from deceiving yourself, while at the same time being aware that you are deceiving yourself, then why haven't you optimized your deceptions into something other than an off-the-shelf religion by now?</del> Have you ever really asked yourself the question: "What is the set of things that I would derive the most benefit from falsely believing?" Now if you really think you can make your life better by deceiving yourself, and you haven't really thought carefully about what the exact set of things about which you would be better off deceiving yourself is, then it would seem unlikely that you've actually got the optimal set of self-deceptions in your brain. In particular, this means that it's probably a bad idea to deceive yourself into thinking that your present set of self deceptions is optimal, so please don't do that.

OK, now do you agree that finding the optimal set of self deceptions is a good idea? OK, good, but I have to give you one very important warning. If you actually want to have the optimal set of self deceptions, you'd better not deceive yourself at all while you are constructing this set of self deceptions, or you'll probably get it wrong, because if, for example, you are currently sub-optimally deceiving yourself into believing that it is good to believe X, then you may end up deceiving yourself into actually believing X, even if that's a bad idea. So don't self deceive while you're trying to figure out what to deceive yourself of.

Therefore, to the extent that you are in control of your self deceptions, (which you do seem to be) the first step toward getting the best set of self deceptions is to disable them all and begin a process of sincere inquiry as to what beliefs it is a good idea to have.


And hopefully, at the end of the process of sincere inquiry, they discover the best set of self deceptions happens to be empty. And if they don't, if they actually thought it through with the highest epistemic standards, and even considered epistemic arguments such as honesty being one's last defence, slashed tires, and all that.... Well, I'd be pretty surprised, but if I were actually shown that argument, and it actually did conform to the highest epistemic standards.... Maybe, provided it's more likely that the argument was actually that good, as opposed to my just being deceived, I'd even concede.

Disclaimer: I don't actually expect this to work with high confidence, because this sort of person might not actually be able to do a sincere inquiry. Regardless, if this sort of thought got stuck in their head, it could at least increase their cognitive dissonance, which might be a step on the road to recovery.

Comment author: Eliezer_Yudkowsky 05 March 2009 06:54:33PM 5 points [-]

To be clear, she never did say, "I am deceiving myself" or "I falsely believe that there is a God".

Comment author: Marcello 06 March 2009 12:16:49AM 1 point [-]

I stand corrected. I hereby strike the first two sentences.

Comment deleted 05 March 2009 06:30:37PM [-]
Comment author: Marcello 06 March 2009 12:47:39AM *  4 points [-]

Well, exactly... If the person were thinking rationally enough to contemplate that argument, they really wouldn't need it.

My working model of this person was that the person has rehearsed emotional and argumentative defenses to protect their belief, or belief in belief, and that the person had the ability to be reasonably rational in other domains where they weren't trying to be irrational. It therefore seemed to me that one strategy (while still dicey) to attempt to unconvince such a person would be to come up with an argument which is both:

  • Solid (Fooling/manipulating them into thinking the truth is bad cognitive citizenship, and won't work anyway because their defenses will find the weakness in the argument.)

  • Not the same shape as the argument their defenses are expecting.

Roko: How is your working model of the person different from mine?

Comment author: Annoyance 05 March 2009 07:35:46PM 1 point [-]

"she just did this once, etc. How did she do it? "

By appealing to a non-rational or irrational argument that would lead the person to adopt rationality.

Arguing rationally with a person who isn't rational that they should take up the process is a waste of time. If it would work, it wouldn't be necessary. It's easy to say what course should be taken with a rational person, because rational thought is all alike. Irrational thought patterns can be nearly anything, so there's no way to specify an argument that will convince everyone. You'd need to construct an argument that each person is specifically vulnerable to.

Comment author: billswift 06 March 2009 12:49:40AM 2 points [-]

The problem is that you often don't know until you actually start arguing with them that they are irrational or just confused and misled.

George H Smith has a pretty good essay about arguing with people to convert them to rationality, "Atheism and the Virtue of Reasonableness". For example, he advocates the "Presumption of Rationality": you should always presume your adversary is rational until he demonstrates otherwise. I don't know if the essay is on-line or not; I read it as the second chapter of "Atheism, Ayn Rand, and Other Heresies."

Comment author: David_Gerard 22 February 2011 11:24:45AM -1 points [-]

Irrational thought patterns can be nearly anything, so there's no way to specify an argument that will convince everyone. You'd need to construct an argument that each person is specifically vulnerable to.

Irrational thought patterns can be nearly anything, but of course they strongly tend to form around standard human cognitive biases. This saves a great deal of time.

Comment author: pre 05 March 2009 08:00:09PM *  2 points [-]

I would expect a reply along the lines of: "It's precisely because I can't trust my own reasoning when deciding which false beliefs I should have that I accept those which are handed down. I pick Judaism because it's the oldest, and thus has shown through memetic competition that it's the strongest set of false beliefs one could have."

Or: "I pick Christianity because it's the most popular and has therefore proven itself memetically competitive."

I have a lot of friends who think "it's old therefore it must be good to have survived this long" about Tarot and eastern religions etc.

Personally I'd wanna eliminate the false beliefs even if it cost me my mojo, but that's a different set of priorities I guess.

Comment author: David_Gerard 22 February 2011 11:16:38AM -1 points [-]

I have a lot of friends who think "it's old therefore it must be good to have survived this long" about Tarot and eastern religions etc.

In fact, the argument from tradition is considered very strong in alternative medicine in particular and New Age culture in general, even if whatever it is was actually made up^W^Wrediscovered last week.

Comment author: pjeby 06 March 2009 04:55:13AM 17 points [-]

I'll go one step further and defend belief in belief, infinitely regressed. ;-) As you point out, the placebo effect here is simply the expectation of a positive result -- and it applies equally at any level of recursion here.

Humans only need a convincing argument for predicting a positive result, not a rational proof of that prediction! Once the positive result is expected, we get positive emotions activated every time we think of anything linked to that result, leading to self-fulfilling prophecies on every level.

This being the case, one might question whether it's rational to disbelieve in belief, if you have nothing equally beneficial to replace it with.

When it comes to external results, sure, it makes sense to have greater prediction accuracy. But for interior events -- like confidence, creativity, self-esteem, etc. -- biasing one's predictions positively is a significant advantage, as it stabilizes what would otherwise be an unstable system of runaway feedback loops.

People whose systems are negatively biased, on the other hand, can get seriously stuck. They basically hit one little setback and become paralyzed because of runaway negative self-fulfilling prophecy.

(I've been such a person myself, and I've worked with/on many of them. Indeed, it was noticing that other, far less "rational" and "intelligent" individuals were much more confident, calm, and successful than I was, that led me to start seriously investigating the nature of mind and beliefs in the first place, and to begin noting the distinctions between people I dubbed "naturally successful" and those I considered "naturally struggling".)

Comment author: cleonid 06 March 2009 12:21:51AM *  3 points [-]

Voltaire, using rationalist arguments, concluded that “if God did not exist, it would be necessary to invent him”. So could it be that adhering to facts in all situations is essentially an irrational position?

Consider the following statements:

1) Rational humans (unlike rational AI) should aim to be happy.

2) Rational humans should not believe fanciful notions unsupported by empirical evidence.

3) Empirical studies (e.g. http://www.lifesitenews.com/ldn/2008/mar/08031807.html) suggest that humans who believe in such notions are more likely to be happier.

The consequence of the above statements seems to be that a rational human should reject rationality.

Does anyone see flaws in this reasoning?

Comment author: MinibearRex 31 March 2011 10:51:33PM 2 points [-]

3) Empirical studies (e.g. http://www.lifesitenews.com/ldn/2008/mar/08031807.html) suggest that humans who believe in such notions are more likely to be happier.

Are they actually happier, or do they just believe that they're happier? ;)

Comment author: Zuckaschnegge 27 April 2012 07:31:36AM *  0 points [-]

What would the actual difference be? You have a subjective view of your emotions (and of anything else, anyway), so believing you are happy would be the same as being happy, as long as you are not aware of the fact that you are only believing in your happiness.

Comment author: MinibearRex 28 April 2012 10:08:46PM 1 point [-]

I suspect that there is a difference, but I'm not extremely confident of this. It seems to me that a noticeable fraction of the people I've encountered over my life are in decidedly sub-optimal situations, and could with relative simplicity change to a more optimal lifestyle, yet are convinced that their own lifestyle is the best thing ever.

Comment author: alex_zag_al 05 November 2017 11:13:59PM 0 points [-]

I think that someone who merely believed they were happy, and then experienced real happiness, would not want to go back.

Comment author: Ryan 05 March 2009 11:10:02PM 3 points [-]

I know some people who are like the woman you describe, my own folks might be like that to some extent. I became atheist pretty early on. So I'm not sure that adults who believe in belief are likely to be passing that along to their kids, if they even try. In my case, I put on a show for a while, but when I stopped it was no big deal.

If these people are able to agree with a scientific worldview and not be obstructionist on things like stem cell research, but simply want to add "and I believe there is a god" to the end of it, fine. Seems like a natural step towards the end of belief in god entirely.

Comment author: Emile 05 March 2009 04:09:22PM 3 points [-]

To further illustrate the point that self-deception isn't easy: if you believe you're shy, you can't just make yourself believe you're not shy.

Maybe you can make yourself believe that you believe that you're not shy, but I don't think you'll reap many benefits from the placebo effect - you'll still get nervous when you want to speak up or go talk to a girl you don't know or whatnot. You can't argue yourself logically into self-confidence.

Comment deleted 05 March 2009 06:00:49PM [-]
Comment author: PeteG 05 March 2009 06:55:40PM 2 points [-]

"You should have spent much more of your time in this debate convincing your tangled friend that, if she were to abandon her religious belief (or belief in belief, or whatever), she would still be able to feel good about herself and good about life; that life would still be a happy meaningful place to be."

I don't think Eliezer cared so much to correct someone's one wrong belief as much as he cared to correct the core that makes many such beliefs persist. Would he really have helped her if all his rational arguments failed, but his emotional one succeeded? My guess is that it wouldn't be a win for him or her.

Comment deleted 05 March 2009 07:06:15PM [-]
Comment author: Annoyance 05 March 2009 07:14:57PM 2 points [-]

What use is it to have correct beliefs if you don't know they're correct?

If the belief cannot be conveniently tested empirically, or it would be useless to do so, the only way we can know that our belief is correct is by being confident of the methodology through which we reached it.

Comment deleted 05 March 2009 07:45:54PM *  [-]
Comment author: Annoyance 05 March 2009 07:52:57PM 3 points [-]

"I naturally prefer to have a high level of confidence in my beliefs."

Doesn't that depend on how reliable those beliefs are?

If you're fleeing through the temple pursued by a boulder, you don't want to dither at an intersection, so whichever direction you think you should go at one moment should be constant. But there's no reason why your confidence should be high to avoid dithering; you need merely be stable.

"I'll take the path I believe leads to safety. This will turn out to be a wise choice"

If, and only if, your belief is correct. If your belief is wrong your choice is a disastrous one. Rationality isn't about being right or choosing the best course, it's about knowing that you're right and knowing which is the best course to choose.

Comment deleted 05 March 2009 07:58:09PM [-]
Comment author: Annoyance 05 March 2009 08:10:56PM 2 points [-]

Then I think I agree with you, mostly. If time or a similar limited resource makes rigorous justification too expensive, we shouldn't require it. But whatever we do accept should be minimally justified, even if it's just "I have no idea where to go so I'll pick at random".

I wouldn't look at the map if I were running from the boulder. But I would have looked at it before entering the temple, and you can bet I'd be trying very hard to retrace my steps on the way out, unless I thought I could identify a shortcut. Even then I might not take the gamble.

Comment author: thomblake 05 March 2009 03:33:44PM 4 points [-]

Why destroy placebo effects? According to some stuff Robin Hanson points to, it seems that most of medicine might consist of placebos. Aren't you fighting what wins in favor of the truth?

Comment author: ciphergoth 05 March 2009 10:13:15PM 5 points [-]

There is evidence that placebos work even if you know that they contain no active ingredients, so we may be spared this interesting dilemma!

Comment author: Arandur 01 August 2011 02:27:42AM 2 points [-]

It sounds to me that she simply is using a different definition of "to believe". If she says "I believe people are nicer than they are," I think she means something like, "I choose to act as if people are nicer than they really are, because it is consonant with my sense of morality to do so." It's choosing to give people the benefit of the doubt, knowing they probably don't deserve it.

Comment author: Zuckaschnegge 27 April 2012 07:19:49AM *  0 points [-]

I would much rather think of it the other way around. As far as I know, the average person is exactly as nice as the average person is. However, when she said she believed people are nicer than they actually are, I guess it is because her estimation of the average niceness of a person is biased, and she is actually falsely believing people are worse than they are. This might well be some kind of defense mechanism she developed. Of course, if you expect worse than average, the chances of you being positively surprised are way higher than the other way around.

Comment author: artsyhonker 22 February 2011 11:17:42AM 3 points [-]

As a theist, I don't believe in God because I perceive some positive benefit from that belief. My experiences and perceptions point to the existence of God. Of course those experiences and perceptions may be inaccurate and are subject to my own interpretations, so I can't claim that my beliefs are rational. I accept on an intellectual level that my belief could be wrong. This doesn't seem to enable me to stop believing.

However, I am involved in a religious community because there are positive benefits -- chiefly that of being able to compare notes with other people who share my irrational belief in God and my desire to do good work in the world. I can see that there might be positive benefits in religious communities for non-theists, though I don't really see the point.

Comment author: TheOtherDave 22 February 2011 03:06:39PM 2 points [-]

I know several non-theists, including atheists, who belong to religious communities because they value the benefits that such belonging provides. It helps, of course, that they belong to the kinds of religious communities that welcome people like them.

Comment author: taryneast 29 March 2011 03:07:23PM *  2 points [-]

There are also plenty of non-religious communities that one can belong to. These also provide the "benefits of belonging" without having to be the odd one out (ie the person that doesn't actually follow the one major point of the community itself). Therefore I agree with artsyhonker in not seeing the point. I'd only consider it the rational move if there were no such other communities nearby (or none that were attractive).

Comment author: TheOtherDave 29 March 2011 06:25:55PM 0 points [-]

Sure. That makes sense, and if it weren't for my actual experience with people who do seem to get benefits from that group membership that they consider worthwhile, despite also being members of other communities, I would agree with this wholeheartedly.

Of course, it's certainly possible that they're all merely confused and not actually getting benefits they value, or that they could be getting all the same benefits from their other groups and somehow don't realize it.

Comment author: taryneast 29 March 2011 07:18:10PM 3 points [-]

Ah - no - you miss what I was trying to say. They definitely get benefits - not at all confused. I'll try and give an example to explain what I mean - and I'll leave religion out of it for the moment.

Let's say that near to me are the local football club and the local wildlife-walks group. Both of them have thriving communities of welcoming and interesting people. Thus if I join either one I will be assured of the benefits of belonging to a community.

But let's say that I happen to have absolutely no passion for football, but really enjoy wildlife walks.

So the rational move for me would be to join the wildlife group over the football club: not because there are no benefits to the football club, but because I would get even more out of being in a group where I share the passions and interests of the majority of members.

This is kinda what I was driving at. There's nothing wrong with an atheist joining a local christian group to gain the benefits of community... but if there's another local group that has the same sense of community - but founded around a principle that the atheist actually shares... then they'll probably get even more out of it.

Comment author: TheOtherDave 29 March 2011 08:36:23PM 1 point [-]

If, in that situation, I observed you evaluating both groups and choosing to join the football club, that observation would increase my confidence that you are obtaining something of value from the football club that you aren't getting elsewhere, even if I have no clue what that might be.

Comment author: taryneast 30 March 2011 10:34:54AM 1 point [-]

Yup, no argument here. I would be curious to know what it was.

Comment author: TheOtherDave 30 March 2011 01:58:32PM 2 points [-]

(nods) Me too. The impression I've gotten from conversations with my non-theist friends who belong to religious communities is that they provide a more close-knit and mutually committed community than their secular equivalents. This is especially relevant for those with children.

Comment author: taryneast 31 March 2011 11:15:56AM 0 points [-]

Yes, I've found that most (but not all) hobby-based communities tend to be fairly loosely constructed. People are expected to hang around for a few years, perhaps, but not really to contribute more than just some passing time.

Exceptions I've found to this rule are: ethnic/expat groups, parenting support groups, and (strangely) some geeky groups: SF/F (in certain cities), and the SCA.

The latter was my biggest surprise, when I joined. There are third-generation SCAdians... some of whom have a fourth generation on the way.

Comment author: wedrifid 31 March 2011 01:41:50PM 0 points [-]

SCA?

Comment author: Swimmer963 22 February 2011 04:29:15PM 1 point [-]

I was one of those people for a while. I was accepted, I think, because the particular group I hung out with had an overwhelming need to convert people, and couldn't resist a juicy atheist/agnostic specimen like me.

I also sing in a church choir, which is kind of similar except that it's explicit I'm there for the musical education and not the religion.

Comment author: TheOtherDave 22 February 2011 04:54:38PM 2 points [-]

the particular group I hung out with had an overwhelming need to convert people, and couldn't resist a juicy atheist/agnostic specimen like me.

Ah, that's unfortunate.

As far as I can tell, the religious communities my atheist/agnostic church-going friends belong to consider them full-fledged members of the community no more in need of alteration than anybody else, which seems like a much more honest arrangement.

Though, of course, I have no way of knowing for sure.

Comment author: Swimmer963 22 February 2011 04:32:10PM 1 point [-]

I've found the same thing–if you want to actually accomplish good things in the world, it seems more rational to attach yourself to a religious community than not. I have my own reasons for believing that it's morally right to help others, but a lot of the non-religious/atheist people my age haven't really thought about this at all, and religious people my age tend to be VERY involved.

Comment author: prase 22 February 2011 12:59:32PM 1 point [-]

I accept on an intellectual level that my belief could be wrong. This doesn't seem to enable me to stop believing.

Of course it doesn't. To accept that your belief can be wrong isn't the same as accepting that it is wrong. The former is a complete triviality (if a person doesn't accept that his particular belief can be wrong, even in principle, either the belief is not a real belief, or the person is seriously irrational). The latter not only may enable you to stop believing, but should force you to do so.

Of course those experiences and perceptions may be inaccurate and are subject to my own interpretations, so I can't claim that my beliefs are rational.

As is true for any experiences of any person, and still, a lot of people strive to have rational beliefs, while your formulation seems to imply that you happily accept being irrational. Which leads me to ask: why? Is it because you think that rationality (however you define it) isn't always the best way to arrive at true beliefs? Or because you don't always mind having false beliefs? Or some other reason?

Comment author: CuSithBell 22 February 2011 04:08:23PM 0 points [-]

If it's okay with you, would you mind describing these experiences / perceptions and how they led to your particular beliefs? I'd be quite interested in hearing.

Comment author: [deleted] 13 June 2014 06:50:10AM -1 points [-]

Mundus vult decipi, ergo decipiatur. ("The world wants to be deceived, so let it be deceived.")

Comment author: haig 06 March 2009 12:45:30PM 2 points [-]

Placebo effects from 'belief in (false) beliefs' only work as long as self-deception is maintainable.

I think the point at which self-deception ceases to work is when you can consciously be aware of it breaking your causal models of the world. Highly intelligent people, or anyone for that matter, cannot continue to deceive themselves into believing in god or unregulated markets, or whatever complex concept you pick, if you explicitly show how it breaks a model they cannot disagree with. Controversial topics of the day like belief in god, public policy, etc. are not single data points under contention, but tangled balls of causation that must be dealt with in a somewhat parallel fashion--to see the big picture and say, wait a minute, that cannot fit unless this, and this, and this, and finally reach a dead end and have to relinquish the starting belief. The more abstract or the more complex a concept is, the easier it is to deceive yourself about it.

The limits of working memory play a role here, and if we are to truly be less wrong, we not only have to overcome biases, but we need to amplify our rational intelligence by using tools designed for these specific purposes. What if beliefs such as 'a personal god exists' were as hard to believe in as 'the sky is green'? What if it was explicitly laid out in front of someone that they absolutely could not hold a belief in something, because of all the cascading links it breaks in their world model that is confirmed to be 'reality'?

I want to work on such tools.

Comment author: nazgulnarsil 05 March 2009 09:25:14PM 2 points [-]

This is a perfect example of the web that builds itself around even one confusion of a value statement and a factual statement. I fear we all have these lurking.

Comment author: Olle 05 March 2009 09:11:13PM 1 point [-]

I believe the following five things.

(1) Barcelona will not win the Champions League.

(2) Manchester U will not win the Champions League.

(3) Chelsea will not win the Champions League.

(4) Liverpool will not win the Champions League.

(5) I falsely believe one of the statements (1), (2), (3) and (4).

This seems to me like a reasonable counterexample to Wittgenstein's doctrine.

Comment author: Eliezer_Yudkowsky 06 March 2009 12:07:51AM 8 points [-]

You need to work with probabilities, and then make statements about your expected Bayes-score instead of truth or falsity; then you'll be consistent. I have a post on this but I can't remember what it's called.
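For concreteness, here is a minimal sketch (my own, not from the post Eliezer mentions) of why stating your actual probability maximizes your expected log-score, the "Bayes-score" framing; the brute-force scan over candidate reports is just for illustration:

```python
import math

def expected_log_score(reported, true_p):
    # Expected log-score for asserting probability `reported`
    # of an event that is actually true with probability `true_p`.
    return true_p * math.log(reported) + (1 - true_p) * math.log(1 - reported)

# Suppose, like Olle, you hold each statement at probability 0.8.
# Scanning over possible reported probabilities, honest reporting wins:
true_p = 0.8
best_report = max(range(1, 100),
                  key=lambda r: expected_log_score(r / 100, true_p)) / 100
print(best_report)  # 0.8
```

Because the log-score is a strictly proper scoring rule, overclaiming certainty (reporting 1.0 on an 80% belief) is penalized infinitely hard in the 20% of worlds where the belief is false, so the consistent thing to assert is the probability itself.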

Comment author: Z_M_Davis 06 March 2009 12:57:01AM 8 points [-]
Comment author: Olle 06 March 2009 07:30:41AM 2 points [-]

topynate: It was only for reasons of space that I listed five events with probability 0.8 each, rather than 1000 events with probability 0.999 each; the modification is obvious.

Eliezer: Point taken.

Comment author: thomblake 05 March 2009 09:15:29PM 1 point [-]

I think Wittgenstein's point was that you're using 'believe' in a strange way. I have no idea what you meant by the above comment; you're effectively claiming to believe and not believe the same statement simultaneously.

If you're using paraconsistent logic, you should really specify that before making a point, so the rest of us can more efficiently disregard it.

Comment author: Olle 05 March 2009 09:28:51PM 2 points [-]

I judge each of the four teams to have probability 0.2 of winning the Champions League. Their victories are mutually exclusive. Hence I judge each of statements (1)-(5) to have probability 0.8.
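That arithmetic can be checked explicitly. In the following sketch (the outcome enumeration and the use of exact fractions are my own framing), each of the five statements comes out true with probability 0.8, yet exactly one of them is false in every possible outcome:

```python
from fractions import Fraction

# Five equally likely outcomes: one of the four named teams wins,
# or some other team does.
outcomes = ["Barcelona", "ManU", "Chelsea", "Liverpool", "other"]
p = {o: Fraction(1, 5) for o in outcomes}

def statements(winner):
    s1_to_4 = [winner != team for team in outcomes[:4]]  # statements (1)-(4)
    s5 = not all(s1_to_4)  # (5): one of (1)-(4) is false
    return s1_to_4 + [s5]

# Each of the five statements is true with probability 4/5 = 0.8 ...
probs = [sum(p[w] for w in outcomes if statements(w)[i]) for i in range(5)]
print(probs == [Fraction(4, 5)] * 5)  # True

# ... yet in every outcome exactly one of the five is false.
print(all(sum(not s for s in statements(w)) == 1 for w in outcomes))  # True
```

So holding all five beliefs at once guarantees exactly one false belief, which is consistent with each individual belief being 80% likely.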

Comment author: topynate 06 March 2009 12:29:06AM 5 points [-]

Hm. Wittgenstein requires that the meaning be "indicative". In English the indicative mood is used to express statements of fact, or which are very probable. They don't necessarily have to be true or probable, of course, but they express beliefs of that nature. You say "I believe X" when you assign a probability of at least 0.8 to X; 0.8 is probable, but not very probable. Would you state baldly "Barcelona will not win the Champions League", given your probabilities? I doubt it. When you say instead "I believe Barcelona will not win the Champions League", you could equally say "Barcelona will probably not win the Champions League." But this isn't in the indicative mood, but rather in something called the potential/tentative mood, which has no special form in English, but does in some other languages, e.g. darou in Japanese (which has quite a complex system for expressing probability). It's better to just state your degree of belief as a numeric probability.

Comment author: Peterdjones 19 May 2011 03:35:06PM *  1 point [-]

He is illustrating that "belief" has more than one meaning, for all that he hasn't clarified the meanings.

A candidate theory would be belief-as-cold-hard-fact versus beliefs-as-hope-and-commitment.

Consider a politician fighting an election. Even if the polls are strongly against them, they can't admit that they are going to lose as a matter of fact, because that will make the situation worse. They invariably refuse to admit defeat. That is irrational if you treat belief as a solipsistic, passive registration of facts, but makes perfect sense if you recognise that beliefs do things in the world and influence other people. If one person commits to something, others can, and that can lead to it becoming a fact.

Treating people as nicer than they are might make them nicer than they were.

Comment author: Peterdjones 12 November 2012 10:51:16AM 0 points [-]

Of course, if "belief" does have these two meanings, the argument against dark side epistemology largely unravels...

Comment author: raptortech97 19 April 2012 09:04:08PM *  1 point [-]

I benefit from believing people are nicer than they actually are.

I empathize with her here. I believe that it is in my advantage to act towards people the way I would act if they were nicer than they actually are. I'll try to parse that out. Let's say Alice is talking to Bob. Cindy, at a different time, also talks to Bob. Bob is a jerk; we assume he is not nice.

  • Alice honestly expects that Bob is nicer than he actually is, and accordingly she is nice to Bob.
  • Cindy honestly expects that Bob is exactly as nice as he actually is, and accordingly she is dismissive of Bob.

I expect that Bob will be nicer towards Alice than towards Cindy. (Warning: This is starting to feel like a belief, suggesting that it is actually a belief in belief.) My theory is that I should act like Alice. Of course, there are alternatives, like simply being nice to people.

I hope this comment made sense to you. I know I'm pretty confused about it myself now.

Comment author: hannahelisabeth 12 November 2012 09:57:42AM 0 points [-]

I think when you parse this out you realize that there are a lot of other factors at play here; it's not just a "belief in belief" thing.

Treating someone nicely has an influence on how they subsequently treat you and others. So it's not so much that you're believing someone is nice when they're not, it's that you're believing that they do not have a fixed property state of "niceness", that it is variable dependent on conditions, which you can then manipulate to promote niceness, for the benefit of yourself and others.

None of this is belief in belief. When you look closer you see that you are comparing two different things: how nice Bob has been in the past and how nice Bob will be in the present/future, dependent on what type of environment he is in, and you are thus modifying your behavior on the assumption that your contribution to the environment can make it such that Bob will be nice, or at least nicer. And there is evidence to support this assumption, so it's not irrational to expect Bob to be(come) nice when you treat him nicely.

It's just misleading to phrase it as "I benefit from believing people are nicer than they are," because what you mean by the first "are" (will be) is not the same as what you mean by the second "are" (have been).

Comment author: Peterdjones 12 November 2012 10:29:18AM 0 points [-]

I don't think that would mislead most people, since most people can handle context and don't expect ordinary English phraseology to conform to logical rigour.

Comment author: hannahelisabeth 13 November 2012 03:52:23PM 0 points [-]

My point was that it's misleading to those trying to interpret it directly into a logical statement, which is what Eliezer seemed to be trying to do. I'm sure there are lots of people who could read that sentiment and understand the meaning behind it (or at least a meaning; some people interpret it differently than others). It's certainly possible to comprehend (obviously, otherwise I wouldn't have been able to explain it), but the meaning is nevertheless in an ambiguous form, and it did confuse at least some people.

Comment author: Zuckaschnegge 27 April 2012 07:48:03AM *  0 points [-]

(One of the other few moments that gave her pause—I mention this, in case you have occasion to use it—is when she was talking about how it's good to believe that someone cares whether you do right or wrong—not, of course, talking about how there actually is a God who cares whether you do right or wrong, this proposition is not part of her religion—

And I said, "But I care whether you do right or wrong. So what you're saying is that this isn't enough, and you also need to believe in something above humanity that cares whether you do right or wrong." So that stopped her, for a bit, because of course she'd never thought of it in those terms before. Just a standard application of the nonstandard toolbox.)

What I think about here is that, whether or not you care whether she does right or wrong, to her you are an outsider: one who does not know everything she knows, who has no insight into what she thinks about the things she does, no insight into what she actually intends to do. So in other words you have no real way of judging her actions to be right or wrong. The only way for her to think of someone as overseeing her actions is to actually believe in an omniscient god. I'm an atheist, but I still believe there are good things and bad things for me to do (it might not be a rational thought, but I think of it as a necessary one). In other words, my conscience is the being overseeing my doings.

So my guess here would be that she might give her conscience a name and form it in a way that fits in with other people's consciences (in other words, any religious group whatsoever). To her, god might well be her conscience with a name atheists don't like to hear.

Comment author: alexpunct 01 November 2012 05:07:47PM -2 points [-]

I believe this will be the next form of religion.