Less Wrong is a community blog devoted to refining the art of human rationality.

Don't Believe You'll Self-Deceive

Post author: Eliezer_Yudkowsky 09 March 2009 08:03AM

Followup to: Moore's Paradox, Doublethink

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

"If you know it's double-think...

...how can you still believe it?" I helplessly want to say.

Or:

I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

Shouldn't the gut-level realization, "Oh, wait, the sky really isn't green" follow from the realization "My map that says 'the sky is green' has no reason to be correlated with the territory"?

Well... apparently not.

One part of this puzzle may be my explanation of Moore's Paradox ("It's raining, but I don't believe it is")—that people introspectively mistake positive affect attached to a quoted belief, for actual credulity.

But another part of it may just be that—contrary to the indignation I initially wanted to put forward—it's actually quite easy not to make the jump from "The map that reflects the territory would say 'X'" to actually believing "X".  It takes some work to explain the idea of minds as map-territory correspondence builders, and even then, it may take more work to get the implications on a gut level.

I realize now that when I wrote "You cannot make yourself believe the sky is green by an act of will", I wasn't just a dispassionate reporter of the existing facts.  I was also trying to instill a self-fulfilling prophecy.

It may be wise to go around deliberately repeating "I can't get away with double-thinking!  Deep down, I'll know it's not true!  If I know my map has no reason to be correlated with the territory, that means I don't believe it!"

Because that way—if you're ever tempted to try—the thoughts "But I know this isn't really true!" and "I can't fool myself!" will always rise readily to mind; and that way, you will indeed be less likely to fool yourself successfully.  You're more likely to get, on a gut level, that telling yourself X doesn't make X true: and therefore, really truly not-X.

If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.

If you keep telling yourself that deep down you'll know—

If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—

If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—

Then when push comes to shove—you may, indeed, fail.

When it comes to deliberate self-deception, you must believe in your own inability!

Tell yourself the effort is doomed—and it will be!

Is that the power of positive thinking, or the power of negative thinking?  Either way, it seems like a wise precaution.

 

Part of the Against Doublethink subsequence of How To Actually Change Your Mind

Next post: "The Proper Use of Humility" (in next subsequence)

Previous post: "Moore's Paradox"

Comments (54)

Comment deleted 09 March 2009 09:53:04AM
Comment author: zaph 09 March 2009 04:56:19PM 2 points

Sorry for the pedantry, but I believe that's Philip K. Dick's quote.

To the "sky is green" idea, I'd counter that the verification path might not work for converting people to atheism. Mormons, for instance, suggest to people that they will feel a burning in their heart when they read the Book of Mormon, which proves the book's veracity. You need to logically piece together that any such physical sensation wouldn't be sufficient to objectively verify anything. There isn't an easy falsification of religious/magical thinking, just following chains of inference from observation. Non-believers just make a commitment to the minimal contortion of facts to fit their paradigm. As obvious as the Silence seems to be, some people don't seem to hear it.

Comment author: Tyrrell_McAllister 09 March 2009 06:17:37PM 8 points

It is clear that Eliezer advocates telling yourself things that may not actually be true.

I don't think so. He is advocating telling yourself something on the condition that telling it to yourself causes it to be true.

It's not equivalent to telling yourself "I'm attractive to the opposite sex." Say that you doubted this prior to uttering it. Then, yes, after uttering it, you might have reason to think that it is marginally more likely to be true. But you almost certainly wouldn't be justified in believing it with high confidence. That is, you still shouldn't believe the statement, so telling it to yourself is dishonest.

In contrast, Eliezer is suggesting that perhaps regularly uttering the statement

I can't get away with double-thinking! Deep down, I'll know it's not true! If I know my map has no reason to be correlated with the territory, that means I don't believe it!

does alter you so as to make itself true. If that's right, then, conditioned on your having uttered it, you are justified in believing what you uttered, so you are not being dishonest.

It's not a matter of being outside of reality. The utterance is part of reality. That's precisely why it may have the power to cause itself to be true.

Of course, it may be that this particular statement just doesn't have that power. If the probability of that were above a certain threshold, I expect that Eliezer wouldn't advocate saying it unless it's true already.

Comment author: pjeby 09 March 2009 07:45:38PM -1 points

Dishonest or not, convincing yourself that you're attractive to the opposite sex is more likely to produce a positive result. And a rationalist should win. ;-)

Comment author: HalFinney 09 March 2009 10:26:41PM 3 points

What evidence is there that yelling at yourself like this is going to make a difference? Let us imagine two kinds of people: those who cannot fall into Moore's paradox (believing the map but not the territory) and those who can. People in the first class, who are immune to the problem, will gain no benefit from reciting these mantras. People in the second class, for whom there is a real risk of making these kinds of errors, are supposed to vigorously tell themselves that there is no such risk! They are supposed to lie to themselves in the hope that the lie will become true. But why should they believe it?

And how different is this lie, really, from the wannabe god-worshiper who similarly insists to himself that he believes that god exists, even though it is not true?

I can't help wondering whether this posting is meant to be ironic. It comes perilously close to outright self-contradiction.

Comment author: AnnaSalamon 09 March 2009 10:47:56PM 4 points

Hal, perhaps Eliezer's view is that there are "suggestible" portions of one's mind that it is okay to suggest things to, but there is some other, reason-capable faculty that one can and should use to form true, un-self-deceived, evidence-before-bottom-line beliefs.

Whether or not that's Eliezer's view, the above view seems right to me. It would be silly not to suggest useful frames, emotional stances, energy levels, etc. to the less rational parts of myself -- that would leave me freezing in particular, arbitrary/chance/un-useful starting states. But for the part of myself that can do full cost-benefit analyses, and math, and can assemble my best guess about the world -- misleading that part of myself would be terrifying, like putting my eyes out. (I mean, I deceive the reason-capable part of myself all the time, like most humans. But it's terrifying that I do, and I really really want to do otherwise... including by suggestibility tricks, if they turn out to help.)

Comment author: Eliezer_Yudkowsky 09 March 2009 11:16:52PM 4 points

Tyrrell and Anna have stated my views better than I'd previously gone so far as verbalizing.

There are large sectors of the mind in which belief tends to become reality, including important things like "I am the sort of person who continues even in the face of adversity" and "I do have the willpower to pass up that cookie."

But - given that you aren't actually trying to fool yourself - there's a chicken-and-egg aspect that depends on your having enough potential in this area that you can legitimately believe the statement will become true if you believe it. At that point, you can believe it and then it will be true.

There's an interesting analogy here to Löb's Theorem which I haven't yet categorized as legitimate or fake.

To look at it another way, this sort of thing is useful for taking simultaneous steps of self-confidence and actual capability in cases where the two move in lockstep. Or, in the case of anti-competencies like doublethink, the reverse.

Comment author: abigailgem 10 March 2009 09:43:44AM 5 points

"I have the potential to be the sort of person who continues even in the face of adversity", or "it is more in my interests to pass up that cookie", or "I really do have a choice whether or not to pass up that cookie". That is what I would recommend.

bill, below, has mentioned "Act as if": "I choose to Act as If I can continue even in the face of adversity, and I intend in this precise moment to continue acting, even if I may just fall down again in two minutes' time".

These have the advantages of being more likely to be true.

Rambling on a little, to be the sort of person who continues in the face of adversity is Difficult, and requires practice, and that practice is very worthwhile. Stating that it is True might make you fail to do the practice, and instead beat yourself up when it appears not to be true.

Comment deleted 09 March 2009 12:49:01PM
Comment author: billswift 09 March 2009 12:56:07PM 5 points

Maybe we need to split this into two words: "belief" for when an idea is not supported by fact, or is even against the evidence. I mean, I've never heard anybody say, "I believe in gravity". Maybe use the phrase "I accept" for supported ideas, as in "I accept quantum mechanics" or "I accept that god does not exist". "Accept" also seems to have less affect than "believe", which may make it easier to change your mind if the evidence changes.

Comment author: bill 09 March 2009 02:51:30PM 7 points

"Act as if" might work.

For example, I act as if people are nicer than they are (because it gets me better outcomes than other possible strategies I've tried).

This also has the benefit of clearly separating action (what we can do) from information (what we know) and preferences (what we want).

Comment author: thomblake 09 March 2009 03:14:05PM 1 point

No, in ordinary English, 'believe' means believe - but it also means 'accept' or 'endorse' or various other sorts of things. If we're going to be entrusted with eradicating a common usage (ha) then I say let 'believe' only mean believe. Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X".

Comment author: Annoyance 09 March 2009 04:03:41PM 0 points

"Thus, here, the assertion "I believe X" should be taken to be equivalent to the assertion "X"."

We can believe something without asserting it to be true. "I assert X to be true", likewise, doesn't require that we believe X to be true. All sorts of arguments involve assertions of truth that we don't necessarily extend beyond the argument.

It's something like the empty set: when the null symbol is bracketed, the result doesn't mean "the empty set". Empty brackets, or the null by itself, means that.

Comment author: thomblake 10 March 2009 04:05:48PM 1 point

Asserting something one does not believe is lying. By the principle of charity we should assume our fellows are not lying, in which case "X" implies "I believe X". Obviously, that's only halfway to equivalence.

If I were to say, "I believe that the president is John McCain", and you responded by disputing my claim that the president is John McCain, I would be out of line to respond that I had never asserted that the president is John McCain. Similarly for the exchange "I believe that Annoyance is Caledonian" "But I'm not Caledonian" "I didn't say you were".

And so they are equivalent, unless you deny the principle of charity or have a counterexample for my second point.

Comment author: Eliezer_Yudkowsky 09 March 2009 05:02:27PM 1 point

"I accept that..." sounds like it could be useful in a lot of cases.

Consider the more swiftly apparent incoherence:

"I accept that people are nicer than they are."

Maybe this is the word we should've been using all along!

Comment author: Document 09 October 2010 05:58:27AM 0 points

Other substitutes: "it's clear to me that" and "I recognize".

Comment author: Document 10 October 2010 12:32:41AM 0 points

Currently wondering if synonyms for belief in different contexts should be a page on the wiki.

Comment author: WKnorpp 09 March 2009 01:52:36PM 8 points

One question here obviously concerns doxastic voluntarism (DV). You ask:

"If you know your belief isn't correlated to reality, how can you still believe it?"

Is this a rhetorical question aiming to assert that, if you know your belief isn't correlated to reality, you can't still believe it?

If so, then it just isn't clear that you're right. One possibility is that DV is true (there are, of course, many reasons to believe that it is). And, if DV is true, it's likely that different people have different degrees and kinds of control over their beliefs. After all, people differ with regard to all other known cognitive skills. Some irrational folks simply might have a kind of control over their beliefs that others don't have. That's an empirical question. (Though we normally think that folks who are more rational have greater control over their beliefs.)

You might, however, mean: if you know your belief isn't correlated to reality, you shouldn't still believe it.

That's a normative claim, not an empirical, psychological one. If that's what you mean, then you're in effect expressing surprise that anyone can be that irrational. If so, I guess I'm a little surprised at your surprise. It is a fairly pure case, but it seems to me that it's not that unusual to hear things like this.

Comment author: RobinHanson 09 March 2009 02:44:05PM 5 points

It seems to me you are trying to deceive yourself into thinking that you cannot comfortably self-deceive. Your effort may indeed make it harder to self-deceive, but I doubt it changes your situation all that much. Admit it, you are human, and within the usual human range of capabilities and tendencies for self-deception.

Comment author: Eliezer_Yudkowsky 09 March 2009 04:59:15PM 1 point

Thus did I carefully write, "cannot deliberately self-deceive", not, "cannot self-deceive".

Comment author: RobinHanson 09 March 2009 05:21:21PM 7 points

We have a continuum of degrees of deliberation to our actions. Even if I agree that you cannot self-deceive at the strongest degree of deliberation, that isn't in practice much of a restriction on your ability to self-deceive.

Comment author: Eliezer_Yudkowsky 09 March 2009 06:26:48PM 4 points

It might seem that way to you because you don't actually go around all day saying, "And now I shall doublethink myself into believing X!" Deliberate self-deception is a subset of self-deception well worth slicing off the carcass. E.g. Utilitarian from OB.

Just because the boundary of deliberate self-deception is fuzzy doesn't mean the boundary is not worth drawing. All the more so in this particular case: if you wonder, "Is this a deliberate self-deception that I can't get away with, or a non-deliberate one that I might still be able to pull off?", it has already reached the point of being deliberate. (Repeating this to yourself will make it even more true.)

Comment author: Yasser_Elassal 09 March 2009 03:36:26PM 8 points

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

If you know your belief isn't correlated to reality, how can you still believe it?

To be fair, he didn't say that the actual existence of God has absolutely zero effect on his decision to believe in the existence of God.

His acknowledgement that the map has no effect on the territory is actually a step in the right direction, even though he has many more steps to go.

Comment author: Annoyance 09 March 2009 04:06:04PM 7 points

"When it comes to deliberate self-deception, you must believe in your own inability!"

That is both contrary to facts, and a pretty effective way to ensure that we won't search for and find examples where we've been deceiving ourselves. Without that search, self-correction is impossible.

"Tell yourself the effort is doomed - and it will be!"

Tell yourself that victory is assured, and failure becomes certain.

Comment author: cleonid 09 March 2009 04:57:35PM 0 points

All people have a marked preference to believe what they want to believe, especially when there are no direct costs associated with the false belief. The majority therefore prefers belief in a charitable higher power to an uncaring universe guided solely by the laws of physics.

The fact that a minority of self-declared rationalists can get by without this belief may have less to do with their rationalism than with the warm feeling of superiority they feel towards the rest of mankind. This can at least in part console them for giving up religion. Personally, I get my consolation from feeling superior to both groups.

Comment author: Tyrrell_McAllister 09 March 2009 06:31:42PM 16 points

I hope that Kurige comes back to verify this, but I'll bet that when he said

I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.

he did not mean, "My belief isn't correlated with reality". Rather, I'll bet, he meant exactly what you meant when you said

telling yourself X doesn't make X true

By saying that his choice had no effect on reality, I expect that he meant that his control over his belief did not entail control over the subject of that belief, i.e., the fact of the matter.

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

Comment author: kurige 10 March 2009 09:15:45AM 5 points

His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.

From the original comment:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I don't have the original text handy, but a quick search on wikipedia brings up this quote from the book defining the concept:

The power of holding two contradictory beliefs in one's mind simultaneously, and accepting both of them. … To tell deliberate lies while genuinely believing in them, to forget any fact that has become inconvenient, and then, when it becomes necessary again, to draw it back from oblivion for just so long as it is needed, to deny the existence of objective reality and all the while to take account of the reality which one denies.

The first sentence and the first sentence alone is the definition I had in my mind when I wrote the comment. It has been quite a while since I last read 1984 and I had forgotten the connotation that to "double-think" is to "deny the existence of objective reality." This was not my intention at all, although, upon reflection, it should have been obvious.

This was bad homework on my part; I should have looked the quote up before writing the comment. Instead of focusing on the example of morality that I used in the original comment, I'm going to try to step back a bit to clarify my original point... Instead of blind-faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

If one does not agree with the other, then my understanding of one or the other is flawed.

Comment author: Tyrrell_McAllister 10 March 2009 04:48:24PM 3 points

Okay, so, when you say that you engage in "doublethink", do you mean that you simultaneously hold two beliefs that are currently "unreconciled", and which you don't yet know how to reconcile, but which you believe can yet be reconciled?

If that's right, then I would be curious to know more about this "unreconciled" relation. Can you give other examples of pairs of "unreconciled" beliefs that you hold?

Comment author: HughRistik 10 March 2009 09:17:36PM 3 points

I'm also having trouble seeing kurige's "doublethink."

The double-think comes into play when you're faced with non-axiomatic concepts such as morality. I believe that there is a God - and that He has instilled a sense of right and wrong in us by which we are able to evaluate the world around us. I also believe a sense of morality has been evolutionarily programmed into us - a sense of morality that is most likely a result of the formation of meta-political coalitions in Bonobo communities a very, very long time ago.

These two beliefs are not contradictory, but the complexity lies in reconciling the two.

As you observe, the beliefs are not contradictory. There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question. Yet they are internally consistent, so there is no contradiction or double-think.

Instead of blind-faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

I think that this is the closest to a contradiction you have displayed. It doesn't seem like your form of religion excludes the claims of science, but your version of science may exclude the claims of religion.

If, in your view, science requires the use of Occam's Razor, and you think belief in God violates Occam's Razor (as I do), yet you continue to believe in God, then I think you would be engaging in double-think. Yet if you don't think that Occam's Razor is valid, or you don't think that belief in God violates it, then I wouldn't claim that you were engaging in double-think without additional information.

Comment author: Annoyance 10 March 2009 09:25:27PM 1 point

"There are various creative ways of reconciling them, such as deism (e.g. "God started the Big Bang"). Whether these reconciliations are true, or reasonable, is another question."

If the purported reconciliation isn't reasonable, it's not a reconciliation, just as an asserted solution to a mathematical problem that doesn't match the requirements isn't an actual solution.

If I hit you in the head with a bat, would you accept that God was responsible because your injury wouldn't have occurred if (we presume) the universe had not been set into motion?

Comment author: HughRistik 10 March 2009 10:58:56PM 4 points

Annoyance said:

If the purported reconciliation isn't reasonable, it's not a reconciliation, just as an asserted solution to a mathematical problem that doesn't match the requirements isn't an actual solution.

First, I'm not sure what you are trying to show by your analogy to a mathematical problem, or by your question.

When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable.

If someone believes X and Y, and they do not contradict each other, then their beliefs are reconciled and internally consistent, even if Y is false or unreasonable. (Unless they hold another belief, Z, which implies that Y is false.)

Being wrong or unreasonable is not necessarily double-think. Do you not agree?

If we take someone whose beliefs seem internally consistent, but some of which are demonstrably false or unreasonable, then we might wonder if we could dig up a contradiction in their beliefs if we dug hard enough. Take, for instance, a theist who turns out to believe in Occam's Razor. In this case, the internal consistency of their beliefs falls apart.

Yet even then, this still isn't necessarily double-think. Orwell's definition requires "holding two contradictory beliefs in one's mind simultaneously." If our theist never even thought about their beliefs in God and how they measured up to Occam's Razor, then this would not be double-thinking, it would be lack-of-thinking.

Comment author: Annoyance 11 March 2009 05:32:39PM 1 point

"When I say that beliefs are reconciled, I am talking about internal consistency. Belief systems can be internally consistent without being true or reasonable."

They might not be true, and they might not be reasonable in regard to a framing system of beliefs and knowledge, but they DO have to be reasonable relative to each other.

Saying that God is responsible for the existence of creation does not imply that everything that happens (including evolutionary processes) was designed by God. Evolutionary development as a concept is incompatible with the concept of intentional design. The two beliefs are not compatible with each other.

Comment author: Amanojack 14 March 2010 05:43:53AM 1 point

Instead of blind-faith in religious tenets, my world-view currently accommodates two traditionally exclusive systems of belief: religion and science.

In other words, it seems you meant "doublethink" in the collective sense based on traditional sentiment, rather than in the actual sense of a logical contradiction between any one specific religious tenet A and any one specific scientific theory B. If there are no actual contradictions, "doublethink" was just an (unfortunate) turn of phrase and there is nothing to be reconciled.

Comment author: pjeby 09 March 2009 07:54:23PM 2 points

It seems to me that you are confused.

There are two kinds of belief being discussed here: abstract/declarative and concrete/imperative.

We don't have direct control over our imperative beliefs, but can change them through clever self-manipulation. We DO have direct control over our declarative beliefs, and we can think whatever the heck we want in them. They just won't necessarily make any difference to how we BEHAVE, since they're part of the "far" or "social" thinking mechanism.

You seem to be implying that there's only one kind of belief, and that it should be subject to some sort of consistency checking. However, NEITHER kind of belief has any global or automatic consistency checking. We can stop intellectually believing that we're dumb or incompetent, for example, and still go on believing it emotionally, because although the abstract memory involved has been updated, the concrete memory hasn't.

It isn't even necessary to DO anything in order to have contradictory beliefs; it merely suffices to neglect the cross-checking, plus perhaps a bit of effort to avoid thinking about the connection when somebody tries to show it to you.

And that avoidance can take place automatically, if you have a strong enough emotional reason for wanting to maintain the intellectual belief. Even among my clients who WANT to change some belief or fix some problem in their heads, the first step for me is always getting them to stop abstracting themselves away from actually looking at what they believe on the concrete/emotional level, as opposed to what they'd prefer to believe on the abstract/intellectual level.

Imagine how much harder it must be for someone who isn't TRYING to change their beliefs!

Comment author: abigailgem 10 March 2009 09:48:43AM 1 point

"The monster will get me if I make a mistake" can be a deep concrete belief; one looks at it rationally and thinks, "that is ridiculous" - but getting rid of it can be hard work.

Comment author: kurige 10 March 2009 12:31:46AM 19 points

I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:

One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.

I realize that my views do not agree with the large majority of those who frequent LW and OB - but I'd just like to take a moment to recognize that it's a testament to this community that:

A) There have been very few purely emotional or irrational responses.

B) Of those that fall into (A) all have been heavily voted down.

Comment author: MichaelBishop 10 March 2009 05:28:01AM 0 points

Tell yourself the effort is doomed - and it will be!

@Eliezer: People are going to misinterpret this far too frequently. Add an addendum to the post to clarify it.

Comment author: JamesAndrix 11 March 2009 04:40:26PM 7 points

I was under the impression that Doublethink involved contradictory ideas; Kurige seems to be talking about descriptions that are not inherently contradictory.

On the subject of not being able to update, I know of an anorexic who claims that even if she were rail-thin, she would be a fat person in a thin body. The knowledge of thinness does not affect the internal concept of self-fatness (probably formed during childhood).

From http://lesswrong.com/lw/r/no_really_ive_deceived_myself/gl:

I don't think I'd call my situation self deception. I am not making myself "believe the sky is green by an act of will." Rather, something in me says the sky is green, and is not dependent on observations of the sky at all.

No matter how much you're committed to updating your map, you'll face a conundrum when you realize you should have made your map round, and that's not something that you can trivially change about your map. You can understand and minimize the distortions, and use different projections in different situations, but you might always be stuck with a flat map. Knowing the territory is round doesn't change the experience you have of looking at a flat map.

Comment author: theotetia 13 March 2009 04:43:40AM 3 points

Wow. I love the flat-vs.-round elaboration of the map metaphor. I had never thought about it that way. My thoughts just got way more interesting. Thanks.

Comment author: EmbraceUnity 21 June 2009 11:05:54PM 0 points

Plenty of people, including myself, seem to understand that they are risk-averse, and yet fail to seek risk-neutrality.

Comment author: BlindDancer 03 April 2011 02:13:00PM 0 points

See... beliefs are emotional statements rooted heavily in cultural heritage and instinct. Overcoming them is difficult. So, for example, no matter how hard I stand in the cockpit screaming at myself that I'm doing something stupid, I still react with a fear response to frightening images shown on a movie screen.

Though I guess the problem here is a definitional one. You define belief a bit more narrowly than I do, so I'm quibbling. I feel the need to bring this up (for your consideration), but I'm not going to pursue it. I'm probably being stupid even bringing it up.

Comment author: jooyous 10 January 2013 04:59:37AM 0 points [-]

I thought about believing that people are nicer than they really are before reading this and the previous article, and I was worried I had done that thing where I believed I succeeded in deceiving myself. Then I unpacked it as: "It is beneficial to act like you expect the next person you meet to be nice, because if you believe they are likely to turn out mean, you will start acting as if you expect them to be a jerk, which makes it more likely that they will act like a jerk; therefore act as if you already think they're nice, but be prepared to react appropriately to evidence that they're a jerk if they present it." Which I think is straightforward and not contradictory, right? Because it doesn't tell me to believe anything that conflicts with reality, it just tells me how to act.

I'm curious if this maps at all onto the existence of God. Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Comment author: Qiaochu_Yuan 10 January 2013 05:07:58AM *  0 points [-]

it is beneficial to act like you expect the next person you meet to be nice

Are you suggesting a strategy different from "default to acting nice to people"? You can justify this strategy without phrasing it in terms of acting as if you have a belief you don't have.

Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Probably, but as someone who reads LW, you hopefully recognize that you can just do those things anyway without making any statements about your beliefs.

Comment author: jooyous 10 January 2013 05:15:36AM *  0 points [-]

Are you suggesting a strategy different from "default to acting nice to people"?

Oops, sorry! I am suggesting the strategy "continue meeting strangers and being nice to them" for the problem of finding nice people. As opposed to "after meeting 5 jerks in a row, conclude that everyone is a jerk and hide from humans forever."

You can justify this strategy without phrasing it in terms of acting as if you have a belief you don't have.

Exactly! And I think I phrased it more or less this way when I computed it for personal use. But I keep encountering people who try to argue that there's no point in meeting the next person because the past 5 people they've talked to turned out to be jerks. And I think arguing with those people turned my argument into "Well you shouldn't BELIEVE the next person is going to be a jerk, because that's probably skewing your data." Which isn't quite what I meant; it just got stuck in my head in that flawed form. I wasn't trying to get them to believe their data away; I was trying to get them to act nice in spite of it. =P
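The "five jerks in a row" inference can actually be checked with a toy Bayesian update — Laplace's rule of succession, which is my framing here, not something from the comment itself. Even five bad samples in a row don't justify certainty about the sixth:

```python
def prob_next_nice(nice_count, total_met):
    """Laplace's rule of succession: starting from a uniform prior over
    the fraction of nice people, the posterior probability that the
    next stranger is nice is (nice + 1) / (total + 2)."""
    return (nice_count + 1) / (total_met + 2)

# After meeting 5 jerks and 0 nice people:
p = prob_next_nice(0, 5)
print(f"P(next person is nice) = {p:.3f}")  # 1/7, about 0.143 -- low, but far from zero
```

So the data can lower your estimate without ever licensing "everyone is a jerk, hide from humans forever" — which is compatible with acting nice to the next stranger anyway.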

Does acting like you believe God exists cause you to do certain good things that you wouldn't do otherwise?

Yeah, I was trying to sorta-direct the comment at the person mentioned in the body of the post who is probably long gone by now. I was wondering if there was a "useful action" component in their desire to keep a God node around in their head that they consciously keep from melting away.

Comment author: Indon 03 April 2013 08:04:26PM *  1 point [-]

I find it amusing that in this article, you are advocating the use of deliberate self-deception in order to ward yourself against later deliberate self-deception.

That said, I feel the urge to contribute despite the large time-gap, and I suspect that even if later posts revisit this concept, the relevance to my contribution will be lower.

"I believe X" is a statement of self-identity - the map of the territory of your mind. But as maps and territories go, self-identity is pretty special, as it is a map written using the territory, and changes in the map can affect the territory as a result - though not necessarily in the exactly intended fashion. So even if deliberate self-deception isn't possible, then some approximation of it probably is.

Moreover, I'd like to question the definition of 'belief' in this context. If we define a belief with emphasis on its being something that affects one's actions, then there is such a thing as a false belief that someone holds: that is to say, an assumption someone intentionally makes, regardless of its truth or falsehood, that they use to guide their behavior for external reasons.

That is to say, acting, or role-playing.

I'm rather a believer in cognitive minimalism - that our brains are not very complex. So I would assert that the same system we use to model others' behavior - or to play others' roles - is the one we use for our own self-identity. So when you say, "I believe X", you're effectively saying, "I act as if X is true". And if we use the same system to act like ourselves, to model our own behavior, as we do to model or act like anyone else, then that's most of what the practical impact of a belief is.

What I'm trying to say is that the only difference between acting a certain way and believing a certain thing is that you only do the acting under certain practical conditions - the belief, insofar as a belief is different from an act, is acting in a certain way all the time, for any reason.

Replace "I believe X because..." with "I act as if X is true because..." and I don't think it's confusing anymore. Self-identity modification as a tool is pretty important to human cognition, not just for trying to convince yourself that what you don't think is true, is.

Edit: Actually, I want to amend that last part now that I think on it. I would assert that there is no difference whatsoever; that all reasonable beliefs are contingent. In fact, a big part of acting rationally is about making your beliefs contingent on the truth or falsehood of the object of the belief. Beliefs that aren't based on accuracy are still contingent, just on things like, "This is beneficial to me in some way." And really, a rational belief is similar, it just goes, "I believe X because it is accurate," with the implied addition, "and accuracy is good to have in a belief," so that boils down to a practical reason as well.

Comment author: Decius 10 August 2013 03:46:59AM 1 point [-]

If I am capable of deliberate self-deception, I want to believe that I am capable of deliberate self-deception.
If I am not capable of deliberate self-deception, I want to believe that I am not capable of deliberate self-deception.

Comment author: christopherj 30 September 2013 04:56:08PM 1 point [-]

I chose to believe in the model of science—deliberately and consciously. This decision, however, has absolutely zero effect on the actual scientific method. I choose to believe science not because I can show it to be likely true, but simply because it is useful for making accurate predictions. I choose to reject, at least in so far as my actions, my internal beliefs about how the world works when they conflict with the ways science says the world works. I reject my intuition and all my firsthand experience that velocity is additive because relativity says it is not. I reject my intuition and firsthand experience that smaller and smaller particles act like proportionally smaller grains of sand because quantum theory says they behave like waves. I choose to fight every bias I possess as I become aware of it, though I clearly believe and act as if that bias were true when I am not fighting it.

If I cannot choose to reject old beliefs and accept beliefs I do not currently possess, how can I choose to overcome bias or become less wrong?

Comment author: hairyfigment 30 September 2013 05:10:17PM *  0 points [-]

Not to put too fine a point on it, but you sound like you already expect science's predictions for velocity to come true before you "choose to reject old beliefs". If someone asked you beforehand to bet on whether your intuitions or science would pan out here (in those words), you'd bet on science.

I sometimes feel (less often now) that if I 'follow the rules' nothing really bad can happen to me. I try to fight this feeling because even my own sheltered life suggests its predictions would fail eventually.

ETA: alief.

Comment author: Yosarian2 21 January 2014 01:06:04AM *  3 points [-]

It sounds like you don't really believe that double-think is impossible; you just have belief in belief in the impossibility of double-think, because you think that belief would be a useful one to have.

As soon as you start "trying to instill a self-fulfilling prophecy", you're going down the same road as the people who say "I believe in God because I think it's useful to have a belief in God."

To be clear, if you're trying to make it impossible for yourself to double-think by planting that thought in your head, that may be a rational strategy. But don't try to convince yourself that it's impossible for other people to double-think just because you wish that were the case; reality is what it is, not what we would like it to be.