I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
I realize that my views do not agree with the large majority of those who frequent LW and OB - but I'd just like to take a moment to recognize that it's a testament to this community that:
A) There have been very few purely emotional or irrational responses.
B) Of those that fall into (A), all have been heavily voted down.
I hope that Kurige comes back to verify this, but I'll bet that when he said
I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.
he did not mean, "My belief isn't correlated with reality". Rather, I'll bet, he meant exactly what you meant when you said
telling yourself X doesn't make X true
By saying that his choice had no effect on reality, I expect that he meant that his control over his belief did not entail control over the subject of that belief, i.e., the fact of the matter.
His attribution of Orwellian doublethink to himself is far more confusing. I have no idea what to make of that. Maybe your advice in this post is on point there. But the "absolutely zero effect" quote seems unobjectionable.
"When it comes to deliberate self-deception, you must believe in your own inability!"
That is both contrary to fact and a pretty effective way to ensure that we won't search for and find examples where we've been deceiving ourselves. Without that search, self-correction is impossible.
"Tell yourself the effort is doomed - and it will be!"
Tell yourself that victory is assured, and failure becomes certain.
One question here obviously concerns doxastic voluntarism (DV). You ask:
"If you know your belief isn't correlated to reality, how can you still believe it?"
Is this a rhetorical question aiming to assert that, if you know your belief isn't correlated to reality, you can't still believe it?
If so, then it just isn't clear that you're right. One possibility is that DV is true (there are, of course, many reasons to believe that it is). And, if DV is true, it's likely that different people have different degrees and kinds of control over their beliefs. After all, people differ with regard to all other known cognitive skills. Some irrational folks simply might have a kind of control over their beliefs that others don't have. That's an empirical question. (Though we normally think that folks who are more rational have greater control over their beliefs.)
You might, however, mean: if you know your belief isn't correlated to reality, you shouldn't still believe it.
That's a normative claim, not an empirical, psychological one. If that's what you mean, then you're in effect expressing surprise that anyone can be that irrational. If so, I guess I'm a little surprised at your surprise. It is a fairly pure case, but it seems to me that it's not that unusual to hear things like this.
When it comes to deliberate self-deception, you must believe in your own inability!
Tell yourself the effort is doomed - and it will be!
Is that the power of positive thinking, or the power of negative thinking? Either way, it seems like a wise precaution.
The positive power of negative thinking. There's a book waiting to happen. Scratch that: Google tells me the title is already taken. Either way, the idea is fascinating.
Just what is the difference between deceiving yourself and 'positive thinking'? It is clear that Eliezer advocates telling yourself things that may not actually be true. You may tell yourself "I cannot believe what I know is not true". In some cases you may know yourself well enough to estimate that there is only a 40% chance the claim could ever reasonably qualify as true, no matter how diligent your pep-talking may be, yet it may still be worth a try. At first glance that seems like 60% self-deception. Yet there is some sort of difference.
When we go about affirming to ourselves that "I am charming, assertive, have an overwhelming instinct to maintain reflective consistency, and am irresistible to the opposite sex", we are not so muc...
I was under the impression that doublethink involved contradictory ideas; Kurige seems to be talking about descriptions that are not inherently contradictory.
On the subject of not being able to update, I know of an anorexic who claims that even if she were rail-thin, she would be a fat person in a thin body. The knowledge of thinness does not affect the internal concept of self-fatness (probably formed during childhood).
http://lesswrong.com/lw/r/no_really_ive_deceived_myself/gl I don't think I'd call my situation self-deception. I am not making myself "...
I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.
If you know your belief isn't correlated to reality, how can you still believe it?
To be fair, he didn't say that the actual existence of God has absolutely zero effect on his decision to believe in the existence of God.
His acknowledgement that the map has no effect on the territory is actually a step in the right direction, even though he has many more steps to go.
It seems to me you are trying to deceive yourself into thinking that you cannot comfortably self-deceive. Your effort may indeed make it harder to self-deceive, but I doubt it changes your situation all that much. Admit it, you are human, and within the usual human range of capabilities and tendencies for self-deception.
" Kurige: One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think."
It sounds like you don't really believe that double-think is impossible; you just have belief in belief in the impossibility of double-think, because you think that belief would be a useful one to have.
As soon as you start "trying to instill a self-fulfilling prophecy", you're going down the same road as the people who say "I believe in God because I think it's useful to have a belief in God."
To be clear, if you're trying to make it impossible for yourself to double-think by planting that thought in your head, that may be a rational st...
Maybe we need to split this into two words: "belief" for when an idea is not supported by fact, or is even against the evidence. I mean, I've never heard anybody say, "I believe in gravity". Maybe use the phrase "I accept" for supported ideas, as in "I accept quantum mechanics" or "I accept that god does not exist". "Accept" also seems to carry less affect than "believe", which may make it easier to change your mind if the evidence changes.
If I am capable of deliberate self deception, I want to believe that I am capable of deliberate self deception.
If I am not capable of deliberate self deception, I want to believe that I am not capable of deliberate self deception.
A real-world instance of Moore’s Paradox (“It’s raining, but I don’t believe it is”) occurs several times annually at Autzen Stadium in Eugene, Oregon —
https://en.m.wikipedia.org/wiki/Autzen_Stadium
[quote:]
Since 1990, Don Essig, the stadium's PA announcer since 1968, has declared that "It never rains at Autzen Stadium" before each home game as the crowd chants along in unison. He often prefaces it with the local weather forecast, which quite often includes some chance of showers, but reminds fans that "we know the real forecast..." or "let's tell our f
...I find it amusing that in this article, you are advocating the use of deliberate self-deception in order to ward yourself against later deliberate self-deception.
That said, I feel the urge to contribute despite the large time gap, and I suspect that even if later posts revisit this concept, the relevance of my contribution will be lower.
"I believe X" is a statement of self-identity - the map of the territory of your mind. But as maps and territories go, self-identity is pretty special, as it is a map written using the territory, and changes in th...
I chose to believe in the model of science—deliberately and consciously. This decision, however, has absolutely zero effect on the actual scientific method. I choose to believe science not because I can show it to be likely true, but simply because it is useful for making accurate predictions. I choose to reject, at least insofar as my actions are concerned, my internal beliefs about how the world works when they conflict with the way science says the world works. I reject my intuition and all my firsthand experience that velocity is additive because relativity says ...
It seems to me that you are confused.
There are two kinds of belief being discussed here: abstract/declarative and concrete/imperative.
We don't have direct control over our imperative beliefs, but can change them through clever self-manipulation. We DO have direct control over our declarative beliefs, and we can think whatever the heck we want in them. They just won't necessarily make any difference to how we BEHAVE, since they're part of the "far" or "social" thinking mechanism.
You seem to be implying that there's only one kind of bel...
If you know your belief isn't correlated to reality, how can you still believe it?
Interestingly, physics models (maps) are wrong (inaccurate), and people know that, but they still use them all the time because they are good enough with respect to some goal.
Less accurate models can even be favored over more accurate ones to save on computing power or reduce complexity.
As long as the benefits outweigh the drawbacks, the correlation to reality is irrelevant (see the sketch after this comment).
Not sure how cleanly this maps to beliefs since one would have to be able to go from one belief to anothe...
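As a rough illustration of that trade-off (my own sketch, with arbitrary example speeds, not something from the comment above): compare the cheap Galilean rule for adding velocities with the more accurate special-relativistic rule mentioned a few comments up.

C = 299_792_458.0  # speed of light, m/s

def galilean_add(v1, v2):
    """Less accurate model: velocities simply add."""
    return v1 + v2

def relativistic_add(v1, v2):
    """More accurate model: special-relativistic velocity addition."""
    return (v1 + v2) / (1.0 + (v1 * v2) / C**2)

for v1, v2 in [(30.0, 25.0),          # two cars on a highway, m/s
               (0.6 * C, 0.7 * C)]:   # two fast particles
    simple = galilean_add(v1, v2)
    accurate = relativistic_add(v1, v2)
    error = abs(simple - accurate) / accurate
    print(f"v1={v1:.3g}  v2={v2:.3g}  simple={simple:.6g}  "
          f"accurate={accurate:.6g}  relative error={error:.1e}")

At highway speeds the relative error of the simple rule is on the order of 1e-14, so the "wrong" map costs essentially nothing in accuracy while being far easier to reason with; at 0.6c and 0.7c it is off by roughly 40%, and the extra complexity of the better model earns its keep.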
I'm going to go off the assumption that this post is deliberate satire, and say it's brilliant.
"Even if it's not true, I'm going to decide to believe that people can't sincerely self-deceive."
All people have a marked preference to believe what they want to believe, especially when there are no direct costs associated with the false belief. The majority therefore prefers belief in a charitable higher power to an uncaring universe guided solely by the laws of physics.
The fact that a minority made up of self-declared rationalists can get by without this belief may have less to do with their rationalism than with the warm feeling of superiority they feel towards the rest of mankind. This can at least in part console them for giving up religion. Personally, I get my consolation from feeling superior to both groups.
I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.
If you know your belief isn't correlated to reality, how can you still believe it?
A good question. Perhaps it could be distanced a little more from the quote that precedes it? That quote by itself seems rational. (An irrational basis for the deliberate and conscious choice in question is nearly guaranteed, but at least it isn't apparent out of context.)
What you're saying has supercharged my cognitive flexibility. I never even thought to check whether my self-reported beliefs correlate with thoughts that I have positive affect towards, and to examine the implications!
Reminds me of Journeyman's comment on my EA article:
...I don’t think EAs do a very good job of distinguishing their moral intuitions from good philosophical arguments; see the interest of many EAs in open borders and animal rights. I do not see a large understanding in EA of what altruism is and how it can become pathological. Pathological altruis
"'I chose to believe in the existence of God—deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.'
If you know your belief isn't correlated to reality, how can you still believe it?"
It's the difference between someone who's afraid of heights standing twenty feet from a cliff and standing two inches from the cliff. The former knows what will happen if he moves over and looks down; the latter is looking down and feeling the fear.
If you tell yourself you believe in a wall, then you're less likely to worry about what's on the other side.
...If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.
If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able
Before reading this and the previous article, I had thought about believing that people are nicer than they really are, and I was worried I had done that thing where I believe I've succeeded in deceiving myself. Then I unpacked it as "it is beneficial to act like you expect the next person you meet to be nice, because if you believe that they are likely to turn out mean then you will start acting as if you expect them to be a jerk, which is more likely to make them act like a jerk; therefore just act as if you already think they're nice, but be prepared to approp...
See... beliefs are emotional statements rooted heavily in cultural heritage and instinct. Overcoming them is difficult. So, for example, no matter how hard I stand in the cockpit screaming at myself that I'm doing something stupid, I still react with a fear response to frightening images shown on a movie screen.
Though I guess the problem here is a definitional one. You define belief a bit more narrowly than I do, so I'm quibbling. I feel the need to bring this up (for your consideration), but I'm not going to pursue it. I'm probably being stupid even bringing it up.
Plenty of people, including myself, seem to understand that they are risk-averse, and yet fail to seek risk-neutrality.
Tell yourself the effort is doomed - and it will be!
@Eliezer: People are going to misinterpret this far too frequently. Add an addendum to the post to clarify it.
I don't mean to seem like I'm picking on Kurige, but I think you have to expect a certain amount of questioning if you show up on Less Wrong and say:
One thing I've come to realize that helps to explain the disparity I feel when I talk with most other Christians is the fact that somewhere along the way my world-view took a major shift away from blind faith and landed somewhere in the vicinity of Orwellian double-think.
"If you know it's double-think...
...how can you still believe it?" I helplessly want to say.
Or:
I chose to believe in the existence of God - deliberately and consciously. This decision, however, has absolutely zero effect on the actual existence of God.
If you know your belief isn't correlated to reality, how can you still believe it?
Shouldn't the gut-level realization, "Oh, wait, the sky really isn't green" follow from the realization "My map that says 'the sky is green' has no reason to be correlated with the territory"?
Well... apparently not.
One part of this puzzle may be my explanation of Moore's Paradox ("It's raining, but I don't believe it is")—that people introspectively mistake positive affect attached to a quoted belief, for actual credulity.
But another part of it may just be that—contrary to the indignation I initially wanted to put forward—it's actually quite easy not to make the jump from "The map that reflects the territory would say 'X'" to actually believing "X". It takes some work to explain the ideas of minds as map-territory correspondence builders, and even then, it may take more work to get the implications on a gut level.
I realize now that when I wrote "You cannot make yourself believe the sky is green by an act of will", I wasn't just a dispassionate reporter of the existing facts. I was also trying to instill a self-fulfilling prophecy.
It may be wise to go around deliberately repeating "I can't get away with double-thinking! Deep down, I'll know it's not true! If I know my map has no reason to be correlated with the territory, that means I don't believe it!"
Because that way—if you're ever tempted to try—the thoughts "But I know this isn't really true!" and "I can't fool myself!" will always rise readily to mind; and that way, you will indeed be less likely to fool yourself successfully. You're more likely to get, on a gut level, that telling yourself X doesn't make X true: and therefore, really truly not-X.
If you keep telling yourself that you can't just deliberately choose to believe the sky is green—then you're less likely to succeed in fooling yourself on one level or another; either in the sense of really believing it, or of falling into Moore's Paradox, belief in belief, or belief in self-deception.
If you keep telling yourself that deep down you'll know—
If you keep telling yourself that you'd just look at your elaborately constructed false map, and just know that it was a false map without any expected correlation to the territory, and therefore, despite all its elaborate construction, you wouldn't be able to invest any credulity in it—
If you keep telling yourself that reflective consistency will take over and make you stop believing on the object level, once you come to the meta-level realization that the map is not reflecting—
Then when push comes to shove—you may, indeed, fail.
When it comes to deliberate self-deception, you must believe in your own inability!
Tell yourself the effort is doomed—and it will be!
Is that the power of positive thinking, or the power of negative thinking? Either way, it seems like a wise precaution.