I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

"If you know it's double-think...

...how can you still believe it?" I helplessly want to say.

Or:

I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

Shouldn't the gut-level realization, "Oh, wait, the sky really isn't green" follow from the realization "My map that says 'the sky is green' has no reason to be correlated with the territory"?

Well... apparently not.

One part of this puzzle may be my explanation of Moore's Paradox ("It's raining, but I don't believe it is")—that people introspectively mistake positive affect attached to a quoted belief, for actual credulity.

But another part of it may just be that—contrary to the indignation I initially wanted to put forward—it's actually quite easy not to make the jump from "The map that reflects the territory would say 'X'" to actually believing "X".  It takes some work to explain the ideas of minds as map-territory correspondence builders, and even then, it may take more work to get the implications on a gut level.

I realize now that when I wrote "You cannot make yourself believe the sky is green by an act of will", I wasn't just a dispassionate reporter of the existing facts.  I was also trying to instill a self-fulfilling prophecy.

It may be wise to go around deliberately repeating "I can't get away with double-thinking!  Deep down, I'll know it's not true!  If I know my map has no reason to be correlated with the territory, that means I don't believe it!"

Because that way—if you're ever tempted to try—the thoughts "But I know this isn't really true!" and "I can't fool myself!" will always rise readily to mind; and that way, you will indeed be less likely to fool yourself successfully.  You're more likely to get, on a gut level, that telling yourself X doesn't make X true: and therefore, really truly not-X.

If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.

If you keep telling yourself that deep down you'll know—

If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—

If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—

Then when push comes to shove—you may, indeed, fail.

When it comes to deliberate self-deception, you must believe in your own inability!

Tell yourself the effort is doomed—and it will be!

Is that the power of positive thinking, or the power of negative thinking?  Either way, it seems like a wise precaution.

72 comments
kurige (330)

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I realize that my views do not agree with the large majority of those who frequent LW and OB - but I'd just like to take a moment to recognize that it's a testament to this community that:

A) There have been very few purely emotional or irrational responses.

B) Of those that fall into (A) all have been heavily voted down.

I hope that Kurige comes back to verify this, but I'll bet that when he said

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

he did not mean, "My belief isn't correlated with reality". Rather, I'll bet, he meant exactly what you meant when you said

telling yourself X doesn't make X true

By saying that his choice had no effect on reality, I expect that he meant that his control over his belief did not entail control over the subject of that belief, i.e., the fact of the matter.

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

kurige (7)
From the original comment: I don't have the original text handy, but a quick search on Wikipedia brings up this quote from the book defining the concept: The first sentence, and the first sentence alone, is the definition I had in my mind when I wrote the comment. It has been quite a while since I last read 1984, and I had forgotten the connotation that to "double-think" is to "deny the existence of objective reality." This was not my intention at all, although, upon reflection, it should have been obvious. This was bad homework on my part; I should have looked the quote up before writing the comment. Instead of focusing on the example of morality that I used in the original comment, I'm going to try to step back a bit and clarify my original point... Instead of blind faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science. If one does not agree with the other, then my understanding of one or the other is flawed.
Tyrrell_McAllister (4)
Okay, so, when you say that you engage in "doublethink", do you mean that you simultaneously hold two beliefs that are currently "unreconciled", and which you don't yet know how to reconcile, but which you believe can yet be reconciled? If that's right, then I would be curious to know more about this "unreconciled" relation. Can you give other example of pairs of "unreconciled" beliefs that you hold?
HughRistik (3)
I'm also having trouble seeing kurige's "doublethink." As you observe, the beliefs are not contradictory. There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question. Yet they are internally consistent, so there is no contradiction or double-think. I think that this is the closest to a contradiction you have displayed. It doesn't seem like your form of religion excludes the claims of science, but your version of science may exclude the claims of religion. If, in your view, science requires the use of Occam's Razor, and you think belief in God violates Occam's Razor (as I do), yet you continue to believe in God, then I think you would be engaging in double-think. Yet if you don't think that Occam's Razor is valid, or you don't think that belief in God violates it, then I wouldn't claim that you were engaging in double-think without additional information.
Annoyance (1)
"There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question." If the purported reconciliation isn't reasonable, it's not a reconciliation, just as an asserted solution to a mathematical problem that doesn't match the requirements isn't an actual solution. If I hit you in the head with a bat, would you accept that God was responsible because your injury wouldn't have occurred if (we presume) the universe had not been set into motion?
HughRistik (4)
Annoyance said: First, I'm not sure what you are trying to show by your analogy to a mathematical problem, or by your question. When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable. If someone believes X and Y, and they do not contradict each other, then their beliefs are reconciled and internally consistent, even if Y is false or unreasonable. (Unless they hold another belief, Z, which implies that Y is false.) Being wrong or unreasonable is not necessarily double-think. Do you not agree? If we take someone who has seemingly internally consistent, but certain demonstrably false or unreasonable beliefs, then we might wonder if we could dig up a contradiction in their beliefs if we dug hard enough. Take, for instance, a theist who turns out to believe Occam's Razor. In this case, the internal consistency of their beliefs falls apart. Yet even then, this still isn't necessarily double-think. Orwell's definition requires "holding two contradictory beliefs in one's mind simultaneously." If our theist never even thought about their beliefs in God and how they measured up to Occam's Razor, then this would not be double-thinking, it would be lack-of-thinking.
Annoyance (1)
"When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable." They might not be true, and they might not be reasonable *in regard to a framing system of beliefs and knowledge, but they DO have to be reasonable relative to each other. Saying that God is responsible for the existence of creation does not imply that everything that happens (including evolutionary processes) was designed by God. Evolutionary development as a concept is incompatible with the concept of intentional design. The two beliefs are not compatible with each other.
tlhonmey (1)
So that raises an interesting question...  Because that's exactly the same as suggesting that, when a programmer uses a code generator algorithm instead of writing every line carefully himself, that he somehow ceases to be the "designer" of the system. And yet, he wrote the code generator, and he gave it the parameters and tweaked them until he got a result that was within his tolerances... It occurs to me that it's really not possible for us to determine whether or not life on this planet was the result of an intelligently-guided design process just by looking at the results.  We'd also have to know what said, hypothetical intelligence's design goals were.   While we're definitely not built the way we would choose to build ourselves given an opportunity -- to hold that up as proof that there was no intelligence involved at all is a pretty arrogant assertion that all "intelligent" beings must think just like humans and share our preferences...
Amanojack (1)
In other words, it seems you meant "doublethink" in the collective sense based on traditional sentiment, rather than in the actual sense of a logical contradiction between any one specific religious tenet A and any one specific scientific theory B. If there are no actual contradictions, "doublethink" was just an (unfortunate) turn of phrase and there is nothing to be reconciled.

"When it comes to deliberate self-deception, you must believe in your own inability!"

That is both contrary to facts, and a pretty effective way to ensure that we won't search for and find examples where we've been deceiving ourselves. Without that search, self-correction is impossible.

"Tell yourself the effort is doomed - and it will be!"

Tell yourself that victory is assured, and failure becomes certain.

Science Dogood (2)
I'm surprised no one responded to this in 14 years [edit: I think the Hanson and Eliezer thread below addresses it well]. I think I agree with the post that explicit self-deception doesn't work, but automatic self-deception via default selfish attention rationing happens all the time. Similarly, people can choose to be biased even if they can't directly choose beliefs, because it is necessary to have simplifying algorithms to think at all. A common example would be that all the logical razors people use are also biases, and you can explicitly choose not to rely on the razor and keep thinking. I think this is the sort of thing one can find if one goes looking for cases of accidental self-deception, and not doing so can leave people in a mental trap where they think their beliefs are rational to an unjustified degree.

One question here obviously concerns doxastic voluntarism (DV). You ask:

"If you know your belief isn't correlated to reality, how can you still believe it?"

Is this a rhetorical question aiming to assert that if you know your belief isn't correlated to reality, you can't still believe it?

If so, then it just isn't clear that you're right. One possibility is that DV is true (there are, of course, many reasons to believe that it is). And, if DV is true, it's likely that different people have different degrees and kinds of control over their beliefs. After all, people differ with regard to all other known cognitive skills. Some irrational folks simply might have a kind of control over their beliefs that others don't have. That's an empirical question. (Though we normally think that folks who are more rational have greater control over their beliefs.)

You might, however, mean: if you know your belief isn't correlated to reality, you shouldn't still believe it.

That's a normative claim, not an empirical, psychological one. If that's what you mean, then you're in effect expressing surprise that anyone can be that irrational. If so, I guess I'm a little surprised at your surprise. It is a fairly pure case, but it seems to me that it's not that unusual to hear things like this.

[anonymous] (100)

When it comes to deliberate self-deception, you must believe in your own inability!

Tell yourself the effort is doomed - and it will be!

Is that the power of positive thinking, or the power of negative thinking? Either way, it seems like a wise precaution.

The positive power of negative thinking. There is a book waiting to happen. Scratch that, Google tells me the title is already taken. Either way, the idea is fascinating.

Just what is the difference between deceiving yourself and 'positive thinking'? It is clear that Eliezer advocates telling yourself things that may not actually be true. You may tell yourself "I cannot believe what I know is not true". In some cases you may know yourself well enough to estimate that there is only a 40% chance that the claim could ever reasonably qualify as true no matter how diligent your pep-talking may be, yet it may still be worth a try. On first glance that seems like it is 60% self-deception. Yet there is some sort of difference.

When we go about affirming to ourselves that "I am charming, assertive, have an overwhelming instinct to maintain reflective consistency and am irresistible to the opposite sex" we are not so muc...

Tyrrell_McAllister (8)
I don't think so. He is advocating telling yourself something on the condition that telling it to yourself causes it to be true. It's not equivalent to telling yourself "I'm attractive to the opposite sex." Say that you doubted this prior to uttering it. Then, yes, after uttering it, you might have reason to think that it is marginally more likely to be true. But you almost certainly wouldn't be justified in believing it with high confidence. That is, you still shouldn't believe the statement, so telling it to yourself is dishonest. In contrast, Eliezer is suggesting that perhaps regularly uttering the statement does alter you so as to make itself true. If that's right, then, conditioned on your having uttered it, you are justified in believing what you uttered, so you are not being dishonest. It's not a matter of being outside of reality. The utterance is part of reality. That's precisely why it may have the power to cause itself to be true. Of course, it may be that this particular statement just doesn't have that power. If the probability of that were above a certain threshold, I expect that Eliezer wouldn't advocate saying it unless it's true already.
HalFinney (4)
What evidence is there that yelling at yourself like this is going to make a difference? Let us imagine two kinds of people: those who cannot fall into Moore's paradox (believing the map but not the territory) and those who can. People in the first class, who are immune to the problem, will gain no benefit from reciting these mantras. People in the second class, for whom there is a real risk of making these kinds of errors, are supposed to vigorously tell themselves that there is no such risk! They are supposed to lie to themselves in the hope that the lie will become true. But why should they believe it? And how different is this lie, really, from the wannabe god-worshiper who similarly insists to himself that he believes that god exists, even though it is not true? I can't help wondering whether this posting is meant to be ironic. It comes perilously close to outright self-contradiction.
AnnaSalamon (5)
Hal, perhaps Eliezer's view is that there are "suggestible" portions of one's mind that it is okay to suggest things to, but there is some other, reason-capable faculty that one can and should use to form true, un-self-deceived, evidence-before-bottom-line beliefs. Whether or not that's Eliezer's view, the above view seems right to me. It would be silly not to suggest useful frames, emotional stances, energy levels, etc. to the less rational parts of myself -- that would leave me freezing in particular, arbitrary/chance/un-useful starting states. But for the part of myself that can do full cost-benefit analyses, and math, and can assemble my best guess about the world -- misleading that part of myself would be terrifying, like putting my eyes out. (I mean, I deceive the reason-capable part of myself all the time, like most humans. But it's terrifying that I do, and I really really want to do otherwise... including by suggestibility tricks, if they turn out to help.)
Eliezer Yudkowsky (4)
Tyrrell and Anna have stated my views better than I'd previously gone so far as verbalizing. There are large sectors of the mind in which belief tends to become reality, including important things like "I am the sort of person who continues even in the face of adversity" and "I do have the willpower to pass up that cookie." But - given that you aren't actually trying to fool yourself - there's a chicken-and-egg aspect that depends on your having enough potential in this area that you can legitimately believe the statement will become true if you believe it. At that point, you can believe it and then it will be true. There's an interesting analogy here to Löb's Theorem which I haven't yet categorized as legitimate or fake. To look at it another way, this sort of thing is useful for taking simultaneous steps of self-confidence and actual capability in cases where the two move in lockstep. Or, in the case of anti-competencies like doublethink, the reverse.
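For readers who want the formal statement behind the analogy: Löb's Theorem, reading the box as "is provable," says that a system which proves "if P is provable, then P" must prove P outright. The belief-reading in the comments below is an informal gloss of mine, not part of the theorem or of the comment above.

```latex
% Löb's Theorem: if a theory proves that provability of P implies P,
% then it proves P itself.
\vdash \Box(\Box P \to P) \to \Box P
% Informal gloss: if you can see that believing P would make P true,
% then you may legitimately come to believe P.
```

The disanalogy is that in Löb's Theorem the conclusion follows automatically, whereas the "chicken-and-egg" step described above still requires actual underlying capability, not just the meta-level observation.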
abigailgem (6)
"I have the potential to be the sort of person who continues even in the face of adversity", or "it is more in my interests to pass up that cookie", or "I really do have a choice whether or not to pass up that cookie". That is what I would recommend. bill, below, has mentioned "Act as if": "I choose to Act as If I can continue even in the face of adversity, and I intend in this precise moment to continue acting, even if I may just fall down again in two minutes' time". These have the advantages of being more likely to be true. Rambling on a little, to be the sort of person who continues in the face of adversity is Difficult, and requires practice, and that practice is very worthwhile. Stating that it is True might make you fail to do the practice, and instead beat yourself up when it appears not to be true.
pjeby (1)
Dishonest or not, convincing yourself that you're attractive to the opposite sex is more likely to produce a positive result. And a rationalist should win. ;-)
zaph (3)
Sorry for the pedantry, but I believe that's Philip K. Dick's quote. To the "sky is green" idea, I'd counter that the verification path might not work for converting people to atheism. Mormons, for instance, suggest to people that they will feel a burning in their heart when they read the Book of Mormon, which proves the book's veracity. You need to logically piece together that any such physical sensation wouldn't be sufficient to objectively verify anything. There isn't an easy falsification of religious/magical thinking, just following chains of inference from observation. Non-believers just make a commitment to the minimal contortion of facts to fit their paradigm. As obvious as the Silence seems to be, some people don't seem to hear it.

I was under the impression that Doublethink involved contradictory ideas; Kurige seems to be talking about descriptions that are not inherently contradictory.

On the subject of not being able to update, I know of an anorexic who claims that even if she were rail-thin, she would be a fat person in a thin body. The knowledge of thinness does not affect the internal concept of self-fatness. (probably formed during childhood)

http://lesswrong.com/lw/r/no_really_ive_deceived_myself/gl I don't think I'd call my situation self-deception. I am not making myself "...

theotetia (4)
Wow. I love the flat-vs.-round elaboration of the map metaphor. I had never thought about it that way. My thoughts just got way more interesting. Thanks.

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

To be fair, he didn't say that the actual existence of God has absolutely zero effect on his decision to believe in the existence of God.

His acknowledgement that the map has no effect on the territory is actually a step in the right direction, even though he has many more steps to go.

mamert (1)
My thoughts exactly. Seeing that statement, I must absolutely AGREE with the second part, and only politely point out that he should rephrase the first part, working "probability" and "working hypothesis" into it.

It seems to me you are trying to deceive yourself into thinking that you cannot comfortably self-deceive. Your effort may indeed make it harder to self-deceive, but I doubt it changes your situation all that much. Admit it, you are human, and within the usual human range of capabilities and tendencies for self-deception.

Eliezer Yudkowsky (4)
Thus did I carefully write, "cannot deliberately self-deceive", not, "cannot self-deceive".
RobinHanson (9)
We have a continuum of degrees of deliberation to our actions. Even if I agree that you cannot self-deceive at the strongest degree of deliberation, that isn't in practice much of a restriction on your ability to self-deceive.
Eliezer Yudkowsky (4)
Might seem that way to you because you don't actually go around all day saying, "And now I shall doublethink myself into believing X!" Deliberate self-deception is a subset of self-deception well worth slicing off the carcass. E.g. Utilitarian from OB. Just because the boundary of deliberate self-deception is fuzzy, doesn't mean the boundary is not worth drawing. The more so in this particular case, as if you wonder "Is this a deliberate self-deception that I can't get away with, or a non-deliberate one that I might still be able to pull off?" it has already reached the point of being deliberate. (Repeating this to yourself will make it even more true.)
Roko (2)
It may be the case that you can easily self deceive if and only if you think you can self deceive, in which case robin's comment is an attempt to cause Eliezer serious brain damage...
Roko (60)

" Kurige: One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think."

  • I defy the data! Have you considered the possibility that kurige is a troll? This is an exceptionally weird statement even for a Christian...

It sounds like you don't really believe that double-think is impossible; you just have belief in belief in the impossibility of double-think, because you think that belief would be a useful one to have.

As soon as you start "trying to instill a self-fulfilling prophecy", you're going down the same road as the people who say "I believe in God because I think it's useful to have a belief in God."

To be clear, if you're trying to make it impossible for yourself to double-think by planting that thought in your head, that may be a rational st...

Maybe we need to split this into two words: "belief" for when an idea is not supported by fact, or is even against the evidence. I mean, I've never heard anybody say, "I believe in gravity". Maybe use the phrase "I accept" for supported ideas, as in "I accept quantum mechanics" or "I accept that god does not exist". "Accept" also seems to have less affect than "believe", which may make it easier to change your mind if the evidence changes.

bill (7)
"Act as if" might work. For example, I act as if people are nicer than they are (because it gets me better outcomes than other possible strategies I've tried). This also has the benefit of clearly separating action (what we can do) from information (what we know) and preferences (what we want).
Eliezer Yudkowsky (1)
"I accept that..." sounds like it could be useful in a lot of cases. Consider the more swiftly apparent incoherence: "I accept that people are nicer than they are." Maybe this is the word we should've been using all along!
Document (0)
Currently wondering if synonyms for belief in different contexts should be a page on the wiki.
Document (0)
Other substitutes: "it's clear to me that" and "I recognize".
thomblake (1)
No, in ordinary English, 'believe' means believe - but it also means 'accept' or 'endorse' or various other sorts of things. If we're going to be entrusted with eradicating a common usage (ha) then I say let 'believe' only mean believe. Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X".
Annoyance (1)
"Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X"." We can believe something without asserting it to be true. "I assert X to be true", likewise, doesn't require that we believe X to be true. All sorts of arguments involve assertions of truth that we don't necessarily extend beyond the argument. It's something like the empty set: when the null symbol is bracketed, the result doesn't mean "the empty set". Empty brackets, or the null by itself, means that.
thomblake (2)
Asserting something one does not believe is lying. By the principle of charity we should assume our fellows are not lying, in which case "X" implies "I believe X". Obviously, that's only halfway to equivalence. If I were to say, "I believe that the president is John McCain", and you responded by disputing my claim that the president is John McCain, I would be out of line to respond that I had never asserted that the president is John McCain. Similarly for the exchange "I believe that Annoyance is Caledonian" "But I'm not Caledonian" "I didn't say you were". And so they are equivalent, unless you deny the principle of charity or have a counterexample for my second point.

If I am capable of deliberate self deception, I want to believe that I am capable of deliberate self deception.
If I am not capable of deliberate self deception, I want to believe that I am not capable of deliberate self deception.

A real-world instance of Moore’s Paradox (“It’s raining, but I don’t believe it is”) occurs several times annually at Autzen Stadium in Eugene, Oregon —

https://en.m.wikipedia.org/wiki/Autzen_Stadium

 [quote:]

Since 1990, Don Essig, the stadium's PA announcer since 1968, has declared that "It never rains at Autzen Stadium" before each home game as the crowd chants along in unison. He often prefaces it with the local weather forecast, which quite often includes some chance of showers, but reminds fans that "we know the real forecast..." or "let's tell our f

...

I find it amusing that in this article, you are advocating the use of deliberate self-deception in order to ward yourself against later deliberate self-deception.

That said, I feel the urge to contribute despite the large time-gap, and I suspect that even if later posts revisit this concept, the relevance to my contribution will be lower.

"I believe X" is a statement of self-identity - the map of the territory of your mind. But as maps and territories go, self-identity is pretty special, as it is a map written using the territory, and changes in th... (read more)

I chose to believe in the model of science—deliberately and consciously. This decision, however, has absolutely zero effect on the actual scientific method. I choose to believe science not because I can show it to be likely true, but simply because it is useful for making accurate predictions. I choose to reject, at least in so far as my actions, my internal beliefs about how the world works when they conflict with the ways science says the world works. I reject my intuition and all my firsthand experience that velocity is additive because relativity says ...

hairyfigment (1)
Not to put too fine a point on it, but you sound like you already expect science's predictions for velocity to come true before you "choose to reject old beliefs". If someone asked you beforehand to bet on whether your intuitions or science would pan out here (in those words), you'd bet on science. I sometimes feel (less often now) that if I 'follow the rules' nothing really bad can happen to me. I try to fight this feeling because even my own sheltered life suggests its predictions would fail eventually. ETA: alief.

It seems to me that you are confused.

There are two kinds of belief being discussed here: abstract/declarative and concrete/imperative.

We don't have direct control over our imperative beliefs, but can change them through clever self-manipulation. We DO have direct control over our declarative beliefs, and we can think whatever the heck we want in them. They just won't necessarily make any difference to how we BEHAVE, since they're part of the "far" or "social" thinking mechanism.

You seem to be implying that there's only one kind of bel...

abigailgem (1)
"The monster will get me if I make a mistake" can be a deep concrete belief, one looks at it rationally, and thinks, that is ridiculous- but getting rid of it can be hard work.

If you know your belief isn't correlated to reality, how can you still believe it?

Interestingly, physics models (maps) are wrong (inaccurate), and people know it, but they still use them all the time because they are good enough with respect to some goal.

Less accurate models can even be favored over more accurate ones to save on computing power or reduce complexity.

As long as the benefits outweigh the drawbacks, the correlation to reality is irrelevant.

Not sure how cleanly this maps to beliefs since one would have to be able to go from one belief to anothe... (read more)

I'm going to go off the assumption that this post is deliberate satire, and say it's brilliant.

"Even if it's not true, I'm going to decide to believe that people can't sincerely self-deceive."

[anonymous]10

All people have a marked preference to believe what they want to believe, especially when there are no direct costs associated with the false belief. The majority therefore prefers belief in a charitable higher power to an uncaring universe guided solely by the laws of physics.

The fact that a minority made up of self-declared rationalists can get by without this belief may have less to do with their rationalism than with the warm feeling of superiority they feel toward the rest of mankind. This can at least in part console them for giving up religion. Personally, I get my consolation from feeling superior to both groups.

[anonymous]10

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

A good question. Perhaps it could be distanced a little more from the quote that precedes it? That quote by itself seems rational. (The irrational basis of the deliberate and conscious choice in question is nearly guaranteed, but at least out of context the quote stands on its own.)

[anonymous]00

What you're saying has supercharged my cognitive flexibility. I never even thought to check whether my self-reported beliefs correlate with thoughts that I have positive affect towards and examine the implications!

Reminds me of Journeyman's comment on my EA article:

I don’t think EAs do a very good job of distinguishing their moral intuitions from good philosophical arguments; see the interest of many EAs in open borders and animal rights. I do not see a large understanding in EA of what altruism is and how it can become pathological. Pathological altruis

... (read more)

"'I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.'

If you know your belief isn't correlated to reality, how can you still believe it?"

It's the difference between someone who's afraid of heights standing twenty feet from a cliff and standing two inches from the cliff. The former knows what will happen if he moves over and looks down; the latter is looking down and feeling the fear.

If you tell yourself you believe in a wall, then you're less likely to worry about what's on the other side.

If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.

If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able

... (read more)

I thought about believing that people are nicer than they really are before reading this and the previous article and I was worried I did that thing where I believed I succeeded in deceiving myself. Then I unpacked it to be "it is beneficial to act like you expect the next person you meet to be nice because if you believe that they are likely to turn out mean then you will start acting as if you expect them to be a jerk, which is more likely to make them act like a jerk; therefore just act as if you already think they're nice but be prepared to approp... (read more)

0Qiaochu_Yuan
Are you suggesting a strategy different from "default to acting nice to people"? You can justify this strategy without phrasing it in terms of acting as if you have a belief you don't have.

Probably, but as someone who reads LW, you hopefully recognize that you can just do those things anyway without making any statements about your beliefs.
0jooyous
Oops, sorry! I am suggesting the strategy "continue meeting strangers and being nice to them" for the problem of finding nice people. As opposed to "after meeting 5 jerks in a row, conclude that everyone is a jerk and hide from humans forever."

Exactly! And I think I phrased it more or less this way when I computed it for personal use. But I keep encountering people who try to argue that there's no point in meeting the next person because the past 5 people they've talked to turned out to be jerks. And I think arguing with those people turned my argument into "Well you shouldn't BELIEVE the next person is going to be a jerk because that's probably skewing your data." Which isn't quite what I meant; it just got stuck in my head in that flawed form. I wasn't trying to get them to believe their data away; I was trying to get them to act nice in spite of it. =P

Yeah, I was trying to sorta-direct the comment at the person mentioned in the body of the post, who is probably long gone by now. I was wondering if there was a "useful action" component in their desire to keep a God node around in their head that they consciously keep from melting away.

See... beliefs are emotional statements rooted heavily in cultural heritage and instinct. Overcoming them is difficult. So, for example, no matter how hard I stand in the cockpit screaming at myself that I'm doing something stupid, I still react with a fear response to frightening images shown on a movie screen.

Though I guess the problem here is a definitional one. You define belief a bit more narrowly than I do, so I'm quibbling. I feel the need to bring this up (for your consideration), but I'm not going to pursue it. I'm probably being stupid even bringing it up.

Plenty of people, including myself, seem to understand that they are risk-averse, and yet fail to seek risk-neutrality.

       Tell yourself the effort is doomed - and it will be!

@Eliezer: People are going to misinterpret this far too frequently. Add an addendum to the post to clarify it.