I think your examples are terrible, and in part it's because they're political - but for a somewhat different reason than the one elaborated in Politics is the Mind-killer.
First, there's the mismatch between the problem you're addressing and the problem your examples illustrate. The problem you're addressing is how to make sure your behavior changes to match your updated beliefs. In this problem, your beliefs have already updated due to the weight of the evidence, but for some reason (and your list of plausible reasons is compelling) your habitual behavior fails to reflect this change in your beliefs. However, neither of your examples is about that at all - they're about beliefs not changing in the face of the evidence. Josh Stieber's fellow soldiers did not change their minds about whether they should be in Iraq. Your example actually appears to argue that they should have, if they behaved rationally - but whether or not that's true, it has no relevance to the problem your post addresses. At one point, you're doing a sleight of hand of sorts (unintentionally, I'm sure):
...One of Josh's commanders wound up coming around to Josh's point of view to the extent of being able to agree to disagree...
the story of people failing to account for compelling evidence is by itself a familiar, ubiquitous, low-status specimen of political propaganda.
In fact, one of the most frequent arguments you encounter as you read political discussions is the argument that the other side are ignoring obvious facts, and so failing to behave rationally, because they're blinded by their ideology. To a first approximation, everyone believes that about everyone else.
It seems to me that many of the arguments made on this site based on or referring to the Politics is the Mind-Killer article are extrapolations from a single well-known, highly polarized, (essentially) two-party system, i.e. the USA.
I am from a country with many political parties. No party ever gets more than 50% of the votes; in fact, it is rare for any party to get over 20% of the votes. The parties are always forced to form a coalition to make a majority government. This system is not without its flaws, and far be it from me to argue that it is superior to the American system.
Nevertheless, it seems to me that many of the failure modes of 'politics', as often described on this site, are actually failure modes of pres...
There is a second human bias that causes you to cache an unrealistically high summary statistic for how often you change your mind: you think you change your mind, in general, pretty often, but unless you are an expert, highly-practiced rationalist, odds are that you do not. As evidence, try thinking of the last time you changed your mind about something and force yourself to specify what you believed beforehand and what you believed afterward. Me personally, I haven't changed my mind about anything that I can remember since about November 10th, 2010, and I'm sure I've expressed thousands of opinions since then. The odds are long.
It is interesting to hear you say that. I would not go so far as to contradict you, but I would be equally unsurprised to find out that I changed my mind more than I thought I did. This too is a human bias that crops up all the time, albeit in different circumstances. People are quite capable of completely changing their beliefs to a new belief that they sincerely believe they had all along.
This is a miscalibration that can go either way depending on which way the ego is pulling at the time.
I found the use of political examples grating, and wish we could enforce the "no politics" guideline more consistently.
Nice.
I would add to your list: choose an appropriate community.
If I wanted to stop/start eating animals, I think the single most effective thing I could do would be to start hanging out in a community of vegetarians/omnivores. (Especially if I considered it the moral/prudent thing to do, though it would work about as well either way.)
Similarly, my social circle is at this point largely polyamorous. My own relationship is not, essentially because neither I nor my husband have any particular interest in inviting a third person into it -- we barely manage to...
Off-topic halachic minutiae:
I remember at one point a religious camp counselor caught me using a glowstick on the Sabbath, and advised me to throw the glowstick away, on the theory that kindling a new light on the Sabbath violated the applicable religious laws.
It sounds to me like your camp counselor was ignorant of the actual halachah, but had some vague idea of how the relevant halachot worked and tried to construct his own rationale for them. A glowstick does not produce significant quantities of heat, so a glowstick is probably at most Rabbinically proh...
I don't think I can even begin to comprehend the kind of bizarre law-fetishism that could lead to this runaway ridiculous situation - where the answer to "can I move this candle" is "it's complicated".
I can't speak for anyone else, but I was raised an Orthodox Jew and I basically took to treating it as "normal" in the same sense that any set of arbitrary social rules is "normal." It was no weirder than the rules governing, say, when it was OK to wear a T-shirt and sneakers vs. when it wasn't, or when it was OK to eat the last piece of cake, or whatever.
And I still basically think that. It's not that there's some default state where there aren't any arbitrary rules to follow, against which I can compare the rules of Orthodox Judaism. There are just different cultures, each with its own set of rules.
I suspect that, again as with any set of social norms, the key distinction is between people who are raised with only one such set of norms, compared to people who are raised having to navigate among several. The former group can treat their culture's rules as invisible and default and "common sensical"; the latter group can't get away with that so easily.
Anyone interested in pointing Less Wrong out to Josh Stieber, from the linked Slate article? I'll contact the author.
This paragraph:
What is your reference class for predicting your own behavior?
and this one:
I think it probably does help, though, to be a bit of a drama queen.
crossed the line from good to awesome for me. Thanks for the post!
There is a third human bias that causes you to tell yourself that you have successfully changed your mind when you have not really done so. The adherent of the Reformed Church of Dragon leaves the garage door open, and cheerfully admits to anyone who asks that there is probably no such thing as an invisible dragon, yet she is unaccountably cautious about actually parking her car in the garage. Thus it is worth knowing not just how to change your mind, but how to change your habits in response to new information.
Related: The Mystery of the Haunted Rationalist.
What, if anything, would convince you to stop (or start) eating animals? Not merely to admit, verbally, that it is an acceptable thing for others to do, or even the moral or prudent thing for you to do, but to actually start trying to do it?
In my case: adequate alternatives. I tried to become a vegetarian once before I succeeded. However, this was before the day I spontaneously woke up one morning with a taste for vegetables (it happened, it was weird), so I ate grilled cheese every day for a few days and then gave up. Later, when I a) liked vegetable...
If you gauge the dosage correctly, the propaganda might nudge your opinion just enough to make you actually adopt the new action that you felt would adequately reflect your new beliefs, but not enough to drive you over the cliff into madness.
This sounds difficult enough to do reliably that I have to question whether it's actually a good tactic.
One thing that I think may be helpful, and that I've noticed some people here seem to practice: if someone says something that makes you think about revising your opinion, tell them so. You'll have forced yourself...
One of the best articles here lately. The first two pieces of advice are very good, even if probably not new, but you have formulated the point very persuasively. I would also not worry about the political example: in spite of the mind-killing abilities of politics, the way you have stated your examples is unlikely to incite a flame war in this community (if it does, I will be afraid that our level of rationality is not much higher than that of average folk, despite our aspirations).
I have a little problem with the third piece of advice, though. I suspect it would not w...
The same troops in the same town confronted with the same evidence that their presence was unwelcome all continued to blame and kill the locals.
Generally, when occupying another country or supporting its government with your troops, you only care what the frak the locals think in a very limited, well-defined, and, may I say, small sense.
If you have decided that, all things considered, you want your troops in a location, that decision generally already takes into account the locals not wanting you there. The most one can say to local opposition is "noted".
...The theory is t
rationalism
This prompted me to post this article, where I write:
I feel that the term "rationalism", as opposed to "rationality", or "study of rationality", has undesirable connotations.
(Discuss there, not here.)
Off-topic: Meatless (and pattyless) sandwiches are surprisingly good if you load them up with most of the vegetables. I go to Subway a few times a month but haven't had a meat sub there in years.
I think the examples used here are absolutely terrible, and I think they indicate a fundamental flaw underlying this theory. Basically, what you call "irrational" in this context, I'd call "rational but dishonest about its motives."
The purpose of having US troops in an area is not to make the locals happier. I don't see much of a reason military leadership should care about local opinion except insofar as it advances their actual objectives. This is true in both the sense that a mugger shouldn't care about his target's feelings, and a p...
On the symbolic action point, you can try making the symbolic action into a public commitment. Research suggests this will increase the strength of the effect you're talking about. Of course, this could also make you overcommit, so this strategy should be used carefully.
"Out of curiosity, do people who grow up under this sort of regime end up thinking it's normal, similarly to the way people raised in Christianity end up desensitized to the absurd-sounding nature of the beliefs about virgin birth and so on? Does it cause them to e.g. be more accepting of government regulation than average?"
Why not look at relatively more secular Western Europe versus the relatively more religious US and see which population is more accepting of government regulations? That is to say, either you have it precisely backwards or there is no observable correlation.
There's significant ambiguity about what counts as "changing" a belief. If you look at belief in the only way that's rational—that is, as coming in degrees—then you "change" your belief whenever you alter subjective probability. Your examples suggest that you're defining belief change as binary. I think people's subjective probabilities change all the time, but you rarely see a complete flip-flop, for good reason: significant beliefs often rest on vast evidence, which one new piece of evidence, no matter how striking, won't be apt to r
1) Specify a quitting point in advance.
Along the same line, I try to always keep my beliefs and actions under the banner of a more general ideal or goal. For instance, if I wanted to help decrease existential risk and decided that the best way was to move to San Francisco to be closer to SIAI, then instead of simply caching the goal 'Move to SF' in my mind, I would try to cache 'Reduce existential risk by moving to SF'.
This takes extra memory, but it serves to remind you to question the validity of your subgoals in the context of your supergoals. I al...
Over-correct your opinion by reading propaganda
You could also try creating your own propaganda (also useful for Akrasia). You should have a good idea of the types of things that motivate you, so you can use that knowledge to make very focused adverts (e.g. basic posters) for yourself.
There's more on this kind of thing, advertising to yourself, over at http://www.takebackyourbrain.com/ - but it looks like it hasn't been updated in a while.
4) Admit that you're wrong to other people, whether it's publicly or to close friends who are in a position to catch you not having updated your behavior. This adds social pressure to continue the change, and more people to notice when you mess up. (This could go under one or two, though.)
I don't have much of substance to add, but I want to say: this is an excellent post, and I think it deserves front page status.
Related to: Branches of Rationality, Rationality Workbook
Changing your behavior to match new evidence could be harder than simply updating your beliefs and then mustering your willpower, because (a) we are in denial about how often we change our minds, (b) cognitive dissonance is tolerable in the medium term, and (c) verifying that your actions, and not just your beliefs, have changed requires additional monitoring, which makes it easier to pretend that your actions already suit your new reality. It might help to (1) specify a quitting point in advance, (2) demonstrate your new opinion with symbolic action, or (3) activate your emotions by reading non-rational propaganda. Additional solutions are eagerly solicited.
Disclaimer:
This post contains examples drawn from politics and current events. I do not hope to change anyone's mind about any specific political belief, I know that Politics is the Mind-killer, I have tried to use non-inflammatory language, and I have a good faith belief that this post contains actual content on rationalism sufficient to justify its potentially controversial examples. Equally powerful but less controversial examples will be cheerfully substituted if anyone can bring them to my attention.
Review:
As has been amply discussed in the sequences, a key tool for overcoming the human tendency to irrationally defend prior beliefs simply because they are comfortable is to ask what, if anything, would cause you to abandon those beliefs. For example, in the “invisible dragon in the garage” parable, it quickly becomes clear to neutral observers that there is no potential evidence that could convince an invisible-dragon-fundamentalist that the dragon is fictional. If you test for breathing noises, it turns out that the dragon is inaudible. If you test for ecological impact, it turns out that the dragon lives off of invisible hamsters, etc. Thus we say that the belief in the dragon is unfalsifiable; there is no way to falsify your hypothesis that there is a dragon in your garage, and so your belief in the dragon does not pay rent in anticipated experiences.
There is a second human bias that causes you to cache an unrealistically high summary statistic for how often you change your mind: you think you change your mind, in general, pretty often, but unless you are an expert, highly-practiced rationalist, odds are that you do not. As evidence, try thinking of the last time you changed your mind about something and force yourself to specify what you believed beforehand and what you believed afterward. Me personally, I haven't changed my mind about anything that I can remember since about November 10th, 2010, and I'm sure I've expressed thousands of opinions since then. The odds are long.
The Problem:
There is a third human bias that causes you to tell yourself that you have successfully changed your mind when you have not really done so. The adherent of the Reformed Church of Dragon leaves the garage door open, and cheerfully admits to anyone who asks that there is probably no such thing as an invisible dragon, yet she is unaccountably cautious about actually parking her car in the garage. Thus it is worth knowing not just how to change your mind, but how to change your habits in response to new information. This is a distinct skill from simply knowing how to fight akrasia, i.e., how to muster the willpower to change your habits in general.
One example of this failure mode, recently reported by Slate.com, involves American troops in Iraq: there are at least some regions in Iraq where many people strongly prefer not to have American troops around, and yet American troops persist in residing and operating there. In one such region, according to a former American soldier who was there, the people greeted the incoming foreigners with a large, peaceful protest, politely asking the Yankees to go home. When the request was ignored, locals began attacking the Americans with snipers and roadside bombs. According to the ex-soldier, Josh Stieber, the Americans responded not by leaving the region, but by ordering troops to shoot whoever happened to be around when a bomb went off, as a sort of reprisal killing. At that point, cognitive dissonance finally kicked in for Josh, who had volunteered for the military out of a sense of idealism, and he changed his mind about whether he should be in Iraq: he stopped following orders, went home, and sought conscientious objector status.
The interesting thing is that his comrades didn't, even after seeing his example. The same troops in the same town confronted with the same evidence that their presence was unwelcome all continued to blame and kill the locals. One of Josh's commanders wound up coming around to Josh's point of view to the extent of being able to agree to disagree and give Josh a hug, but still kept ordering people to kill the locals. One wonders: what would it take to get the commander to change not just his mind, but his actions? What evidence would someone in his position have to observe before he would stop killing Iraqis? The theory is that American military presence in Iraq is good for Iraqis because it helps them build democracy, or security, or their economy, or some combination. It's moderately challenging to concede that the theory could be flawed. But, assuming you have the rationalist chops to admit your doubt, how do you go about changing your actions to reflect that doubt? The answer isn't to sit at home and do nothing; there are probably wars, or at the very least nonviolent humanitarian interventions, that are worth sending people abroad for (or going yourself, if you're not busy). But if you can't change your behavior once you arrive on the scene, your doubt is practically worthless -- we could replace you with an unthinking, unquestioning patriot and get the same results.
Another example was reported by Bill McKibben, author of Deep Economy, who says he happened to be in the organic farming region of Gorasin, Bangladesh the day an international food expert arrived to talk about genetically engineered "golden rice," which, unlike ordinary rice, is rich in Vitamin A and can prevent certain nutritional deficiency syndromes. "The villagers listened for a few minutes, and then they started muttering. Unlike most of us in the West who worried about eating genetically modified organisms, they weren't much concerned about 'frankenfood.' Instead, they instantly realized that the new rice would require fertilizer and pesticide, meaning both illness and debt. More to the point, they kept saying, they had no need of golden rice because the leafy vegetables they could now grow in their organic fields provided all the nutrition they needed. 'When we cook the green vegetables, we are aware not to throw out the water,' said one woman. 'Yes,' said another. 'And we don't like to eat rice only. It tastes better with green vegetables.'"
Bill doesn't say how the story ended, but one can imagine that there are many places like Gorasin where the villagers ended up with GMOs anyway. The November/December 2010 issue of Foreign Affairs has a pretty good article (partial paywall) about how international food donors have persisted in shipping grain -- sometimes right past huts full of soon-to-rot local stockpiles -- when what is really needed are roads, warehouses, and loans. One could argue that the continued funding of food aid at 100 times the level of food-infrastructure aid, or the continued encouragement of miracle mono-crops in the face of local disinterest, simply reflects the financial incentives of large agricultural corporations. Considering how popular farmers are and how unpopular foreign aid is, though, there are doubtless easier ways for Monsanto and ConAgra to get their government subsidies. At least some of the political support for these initiatives has to come from well-intentioned leaders who have reason to know that their policies are counterproductive but who are unable or unwilling to change their behavior to reflect that knowledge.
It sounds stupid when people act this stubbornly on the global stage, but it is surprisingly difficult not to be stubborn. What, if anything, would convince you to stop (or start) eating animals? Not merely to admit, verbally, that it is an acceptable thing for others to do, or even the moral or prudent thing for you to do, but to actually start trying to do it? What, if anything, would convince you to stop (or start) expecting monogamy in your romantic relationships? To save (or borrow) significant amounts of money? To drop one hobby and pick up another? To move across the country?
And, here's the real sore spot: how do you know? Suppose you said that you would save $1,000 a year if the real interest rate were above 15%. Would you really? What is your reference class for predicting your own behavior? Have you made a change like that before in your life? How did the strength of the evidence you thought it would take to change your behavior compare to the evidence it actually took to change your behavior? How often do you make comparably drastic changes? How often do you try to make such changes? Which are you more likely to remember -- the successful changes, or the failed and quickly aborted attempts?
Solutions:
Having just recently become explicitly aware of this problem, I'm hardly an expert on how to solve it. However, for whatever it is worth, here are some potential coping mechanisms. Additional solutions are strongly encouraged in the comments section!
1) Specify a quitting point in advance. If you know ahead of time what sort of evidence, E, would convince you that your conduct is counterproductive or strictly dominated by some other course of conduct, then switching to that other course of conduct when you observe that evidence will feel like part of your strategy. Instead of seeing yourself as adopting strategy A and then being forced to switch to strategy B because strategy A failed, you can see yourself as adopting the conditional strategy C, which calls for strategy A in circumstance ~E and for strategy B in circumstance E. That way your success is not dependent on your commitment, which should help reduce your commitment toward an optimal level.
Without a pre-determined quitting point, you run the risk of making excuses for an endless series of marginal increases in the strength of the evidence required to make a change of action appropriate. Sunk costs may be an economic fallacy, but they're a psychological reality.
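For readers who like to see the structure spelled out, here is a minimal sketch (mine, not the post's; the threshold, the probability estimates, and the strategy labels are purely illustrative placeholders) of what pre-registering a conditional strategy might look like in code:

```python
# Toy illustration of "specify a quitting point in advance".
# QUIT_THRESHOLD, the probability estimates, and the strategy labels are
# hypothetical; the only point is that the switching condition E is written
# down before any evidence arrives, so switching to strategy B later feels
# like part of the plan rather than an admission of defeat.

QUIT_THRESHOLD = 0.7  # pre-registered: switch if P(plan is counterproductive) exceeds this


def conditional_strategy_c(p_counterproductive: float) -> str:
    """Return which strategy to follow, given the current estimated probability
    that strategy A (the original plan) is counterproductive."""
    if p_counterproductive > QUIT_THRESHOLD:  # circumstance E: quitting evidence observed
        return "strategy B: switch to the alternative course of action"
    return "strategy A: continue the original course of action"  # circumstance ~E


if __name__ == "__main__":
    # Re-evaluate as evidence accumulates; the decision rule itself never moves.
    for estimate in (0.2, 0.5, 0.8):
        print(f"P(counterproductive) = {estimate}: {conditional_strategy_c(estimate)}")
```

The only load-bearing feature of the sketch is that the threshold is fixed before any evidence comes in, which is exactly what blocks the goalpost-shifting described above.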
2) Demonstrate your new opinion with symbolic action. Have you decided to move to San Francisco, even though your parents and significant other swear they'll never visit you there? Great! We have nice weather here; look forward to seeing you as soon as you can land a job. Meanwhile, buy a great big map of our beautiful city and put it on your bedroom wall. The map, in and of itself, doesn't get you a damn bit closer to moving here. It doesn't purport to influence your incentives the way a commitment contract would. What it does do is help you internalize your conscious decision so the decision is more broadly endorsed by the various aspects of your psyche.
I remember at one point a religious camp counselor caught me using a glowstick on the Sabbath, and advised me to throw the glowstick away, on the theory that kindling a new light on the Sabbath violated the applicable religious laws. I asked him what good throwing away the light would do, seeing as it had already been kindled and would keep on burning its fixed supply of fuel no matter where I put it. He said that even though throwing away the light wouldn't stop the light from having been kindled (there were limits to his magical thinking, despite his religious convictions), it would highlight (har har) my agreement with the principle that kindling lights is wrong and make it easier not to do it again next time... The very sense that it is strange to throw away a lit glowstick helps put cognitive dissonance to work for changing your mind instead of against it: if you didn't strongly believe in the importance of not kindling glowsticks, why on earth would you have thrown it away? But you did throw it away, and so you must believe, and so on. Also, not reaping the benefits of the wrongly kindled light makes kindling lights seem to provide fewer benefits, and makes it easier to resist kindling one the next time -- if you know, in the moment of temptation, that even if you kindle the glowstick you might repent and not let yourself enjoy its light, you'll factor that into your utility function and be more likely to decide that the no-longer-certain future benefit of the light isn't worth the immediate guilt.
Anyway, this is a fairly weird example; I certainly don't care whether people light glowsticks, on a particular tribe's Sabbath or otherwise. I think it probably does help, though, to be a bit of a drama queen. If you buy a cake while you're dieting, don't just resolve not to eat it; physically throw it off the second-story balcony. If you've just admitted to yourself that your erstwhile political enemies actually have some pretty good points, write your favorite ex-evil candidate a letter of support or a $5 check and physically put it in the mail. As much as possible, bring your whole self into the process of changing your actions.
3) Over-correct your opinion by reading propaganda. Propaganda is dangerous when you read it in order to help you form an opinion, and a deontological evil when you publish it to hack into other peoples' minds (which, depending on circumstances and your philosophy, may or may not be justified by the good consequences that you expect will follow). But when you've already carefully considered the rational evidence for and against a proposition, and you feel like you've changed your mind, and yet you're still acting as if you hadn't changed your mind, propaganda might be just what you need. Read an essay that forcefully argues for a position even more extreme than the one you've just adopted, even if the essay is full of logical cul-de-sacs. In this limited context alone, gleefully give yourself temporary permission to ignore the fact that reading the essay makes you notice that you are confused. Bask in the rightness of the essay and the guilt/shame/foolishness/low-status that people who disagree with it should feel. If you gauge the dosage correctly, the propaganda might nudge your opinion just enough to make you actually adopt the new action that you felt would adequately reflect your new beliefs, but not enough to drive you over the cliff into madness.
As an example, I recently became convinced that eating industrially raised animals while living in San Francisco before the apocalypse can't ever be morally justified, and, yet, lo and behold, I still ate turkey sandwiches at Subway 5 times a week. Obviously I could have just used some of the tactics from the Akrasia Review to make eating less factory-meat a conscious goal...but I'm busy using those tools for other goals, and I think that there are probably at least some contexts in which willpower is limited, or at least a variable-sum game. So I read Peter Singer's book on Animal Liberation, and blamed all the woes of the world on steak for a few hours while slowly eating a particularly foul-tasting beef stew that was ruined by some Thai hole-in-the-wall, to reinforce the message. I'm doing a little bit better...I'm more likely to cross the street and eat vegetarian or pay for the free-range stuff, and I'm down to about 3 Subway footlongs a week, without any noticeable decrease in my willpower reserves.
Your mileage may vary; please use this tactic carefully.
4) Your suggestions.
Seriously; most of my point in posting here is to gather more suggestions. If I thought of the three best solutions in two hours with no training, I'll eat my shirt. And I will, too -- it'll help me repent.