I think the most important argument is missing. If you repeat an affirmation every night, or even just think about it, the corresponding brain areas get activated (see also http://www.overcomingbias.com/2007/10/priming-and-con.html). Once you do this, your brain will automatically keep thinking about it and generate ideas to advance toward your goal. You will keep an eye out for good opportunities and take advantage of them.
You have to contrast this with people who are just going with the flow and don't have a fixed target in mind.
I agree with this. The primary reason I would try affirmations is simply to keep myself focused on what I consider to be my long-term goals. Of course, I could also do this by repeating this set of questions every night:
We can think of this as self-signaling. Earnestly repeating an aspiration many times sends a credible signal to your subconscious that this goal is very important to you, and so worth devoting more resources to.
Elsewhere I write:
If you practise Buddhism, doing traditional meditations and, separately, thinking about what you learn from them, the thing that stands out is the craptastic way that thoughts swirl round and round inside your head. Before you become a Buddhist you think it is easy to dodge developing bad habits, because you need only avoid doing things too often. Afterwards you realise that your life is shaped by your mind, and that you can screw yourself over by developing a bad habit with very few physical repetitions, simply by repeatedly thinking the same thoughts.
If you find this hypothesis worthy of consideration, it greatly complicates the question of the effectiveness of affirmations. You have repeated thoughts that you haven't chosen and aren't much aware of. Your affirmation replaces some of these but not others. If you try to analyse the effect of an affirmation purely in terms of its own positive content, without considering the negative thoughts it displaces, things could get very confusing.
When Scott Adams says to himself "I will become a syndicated cartoonist", is that instead of saying "I will become a doctor and cure cancer", or instead of saying "Only fine artists can become cartoonists"?
When Scott Adams says to himself "I will become a syndicated cartoonist", is that instead of saying "I will become a doctor and cure cancer", or instead of saying "Only fine artists can become cartoonists"?
I'm sorry - what?
Translation:
Scenario 1: "I will become a syndicated cartoonist" replaces the thought "I will become a doctor and cure cancer"
Scenario 2: "I will become a syndicated cartoonist" replaces the thought "Only fine artists can become cartoonists"
Which of these is occurring?
A few additions. Adams points out that affirmations may work via selection bias: if you actually have the self-discipline to use them consistently, that may simply mean you also have the self-discipline to achieve the goal involved.
Other factors that play into affirmations have to do with whether you consciously reject the affirmation's content; many books mention the importance of noting your automatic responses (like "yeah, right!") to an affirmation and addressing whatever conflicting belief is involved. If you don't, the process can actually reinforce the conflicting belief, thanks to retrieval practice! (This would explain, by the way, a lot of the individual variation in success with affirmations.)
Finally:
He suggests using imagination and self-deception to trick the subconscious mind into adopting the necessary role.
Actors are not practicing self-deception, any more than a child is practicing self-deception when he or she says, "I'm a fireman!" Congruent role play in both cases consists merely of adopting the role without signaling pretense!
Notice that children have to be taught to say, "I'm pretending to be a fireman", and that only "ham" actors attempt to draw attention to their acting. Unskilled actors are likewise too conscious of the fact they are "acting", and thus "act", instead of simply assuming the role.
Another way of putting it might be that "acting as if" (of which affirmations are actually just one manifestation) is like crossing a girder between skyscrapers -- you can do it confidently as long as you can avoid thinking too much about what you're doing. Children do this easily, because they haven't learned to second-guess themselves yet. Adults have it a little bit harder, but they can still learn to ignore their second guesses or find something that draws their focus in so they can't pay attention to the second-guessing.
Notice, too, that placebos, affirmations, and priming are all able to work as long as there is no conflicting information present to cause active disbelief or rejection of the suggestion. This strongly suggests that they are all operating via the exact same mechanism, despite our differing names for the phenomena.
I feel obligated to point out that Boltzmann brains aren't a hypothesis themselves, but a counterargument to the hypothesis that the current low-entropy state of the universe is a random low-entropy fluctuation in a steady-state, high-entropy universe.
The idea is that a brain randomly forming out of a maximum-entropy soup is a lot more likely than enough negentropy arising to comprise our entire universe, so if you believe the steady-state hypothesis, you have to conclude that you are far more likely to be a Boltzmann brain than not.
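A rough way to quantify "a lot more likely" (my gloss, not part of the original comment): the standard fluctuation estimate is that the probability of a random thermal fluctuation that lowers entropy by $\Delta S$ scales as

$$P \propto e^{-\Delta S / k_B},$$

so a fluctuation just large enough to assemble a single brain (a comparatively tiny $\Delta S$) is exponentially more probable than one large enough to produce an entire low-entropy observable universe (a vastly larger $\Delta S$).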
"So I was very surprised to find Adams was a believer in and evangelist of something that sounded a lot like pseudoscience."
Yep. The Dilbert Future isn't online so you can't see the nonsense directly, but to get a feeling for what Adams was like before he started backpedaling recently:
(http://www.reall.org/newsletter/v05/n12/scott-adams-responds.html)
An unflattering but (as I recall) accurate description of The Dilbert Future here:
Glad someone mentioned that there is good reason Scott Adams is not considered a paradigm rationalist.
From memory, that description of The Dilbert Future sounds accurate, but I think it misses the fact that the book was not meant to be taken seriously. Given the extent to which Dilbert relies on absurdity, I do not find it particularly likely that an intelligent and relatively skeptical person like Scott Adams meant for it to be taken as truth.
I would rank explanations for what he said in the following order of likelihood:
It's a joke
He meant for readers to ponder absurd ideas for the sake of mind-expansion,
He actually believes it.
I happen to have a copy of The Dilbert Future. You're right that Scott Adams writes mainly for comedy. However, the end section of The Dilbert Future is more serious. Adams actually writes, "I'm turning the humor mode off for this chapter because what you're going to read is so strange that you'd be waiting for the punch line instead of following the point." And without re-reading the whole thing, as I recall his tone is about as serious as he promises. The serious chapter includes some quantum physics speculation, but the main idea Adams advocates is affirmations, which he ties into part of his life story.
A recent study on affirmations:
"Self-Affirmation Can Break Cycle of Negative Thoughts"
http://www.aaas.org/news/releases/2009/0416sp_affirmation.shtml
I'm not sure why anyone would call affirmations "pseudoscientific" - that sounds insulting - and affirmations deserve better treatment.
The pseudoscientific part is where some people believe affirmations can have magical/reverse-causality type effects rather than simply being a motivational/subconscious thing. Adams advocated using affirmations for these types of effects in The Dilbert Future, though I don't know if he still does.
The difficulty for me is that this technique is at war with having an accurate self-concept, and may conflict with good epistemic hygiene generally. For the program to work, one must seemingly learn to suppress one's critical faculties for selected cases of wishful thinking. This runs against trying to be just the right amount critical when faced with propositions in general. How can someone who is just the right amount critical affirm things that are probably not true?
The difficulty for me is that this technique is at war with having an accurate self-concept, and may conflict with good epistemic hygiene generally.
Generally, I see no conflict here, assuming that the thing you're priming yourself with is not something that might displace your core rationalist foundations.
If you're riding a horse, it is epistemically rational to incorporate knowledge about the horse into your model of the world (to be aware of how it will react to a pack of wolves, or to an attractive mare during mating season), and it is instrumentally rational to be able to steer the horse where you want it to carry you.
Same with your mind -- if you're riding an evolutionary kludge, it is epistemically rational to incorporate knowledge about the kludge into your map of reality, and it is instrumentally rational to be able to steer it where you want it to be.
What matters is where you draw the line between the agent and the environment.
The difficulty for me is that this technique is at war with having an accurate self-concept, and may conflict with good epistemic hygiene generally.
Is an actor practicing poor epistemic hygiene when they play a role?
Refraining from dispute is not the same thing as believing. Not discussing religion with your theist friends is not the same as becoming one yourself.
On a side note, Adams has also suggested exercise and diet as simple yet important components of beating akrasia. For this specific goal, I think they are more important than affirmations.
For overall performance though, I'm not so sure.
I would ask whether affirmations result in any change to the details of the person's behavior with regard to the goal. (E.g., do they result in even one more resume submission?) It seems likely to me that they do.
Then, remember that this question is usually posed in the context of some economic game like real estate investing, dating, job searches, career advancement, etc. The principle of each person acting in their own best interest makes such games tend towards a certain (imperfect) balance (e.g. your rent is probably on the same order of magnitude as what you can afford for rent).
Balances are sensitive to tipping. Any behavior (trying harder, not giving up as easily, making one more call, walking a little faster, putting in an extra hour) that tips the balance in your favor is probably doing you a favor. Affirmations seem to me just a way to prime yourself for such behavior.
But AFAIK no one's ever done a study on Adams-variety simple personal affirmations in all of their counter-intuitive weirdness, probably because it sounds so silly, and I think that's a shame.
I totally agree with this. Do you have any ideas on how to turn this into a controlled experiment?
The simple form would be to get a group of people, have them state a major (minor?) life goal, assign them randomly to two groups, one of which does affirmations, and one of which does not. Time period and measure of success would depend on what exactly your theory was, but say 1 year and count all quantifiable steps in the target direction through a self-reporting format, with experimenter followup. (Alternatively, if you have funds for it, keep the experiment running until a given percentage reach their stated goal, and compare success rates; however, multiyear studies are much more expensive and difficult -- and have more participant dropouts).
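For concreteness, here is a minimal sketch of how the simple one-year version might be randomized and analyzed, assuming each participant ends up with a count of quantifiable steps toward their stated goal. The group sizes, example data, and the choice of Welch's t-test are my own assumptions, not part of the proposal above:

```python
import random
from math import sqrt
from statistics import mean

def assign_groups(participants):
    """Randomly split participants into an affirmation group and a control group."""
    shuffled = participants[:]
    random.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

def welch_t(steps_a, steps_b):
    """Welch's t statistic comparing mean 'steps toward goal' across two groups."""
    m_a, m_b = mean(steps_a), mean(steps_b)
    n_a, n_b = len(steps_a), len(steps_b)
    var_a = sum((x - m_a) ** 2 for x in steps_a) / (n_a - 1)
    var_b = sum((x - m_b) ** 2 for x in steps_b) / (n_b - 1)
    return (m_a - m_b) / sqrt(var_a / n_a + var_b / n_b)

# Hypothetical recruitment and random assignment.
participants = [f"participant_{i}" for i in range(40)]
affirmation_group, control_group = assign_groups(participants)

# Hypothetical year-end data: self-reported quantifiable steps toward each goal.
affirmation_steps = [4, 7, 2, 5, 6, 3, 8, 5]
control_steps = [3, 2, 4, 1, 5, 2, 3, 4]
print("Welch's t =", round(welch_t(affirmation_steps, control_steps), 2))
```

A real study would of course need pre-registration, a much larger sample, and a p-value from the t distribution rather than a bare statistic; this is only meant to make the comparison concrete.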
While I'm sympathetic to the idea that "you are likely to begin believing things you say to yourself frequently, and furthermore you are likely to act in ways to make these beliefs come true," I'm not sure I'd use Scott Adams as the prototypical purveyor of this idea. He is known to have beliefs difficult to characterize as "rational," notably a belief in intelligent design creationism. Wikipedia also cites him as being a hypnotist and a vegan; these are not inherently irrational, but I don't have further sources to analyze Adams's specific perspective on these issues.
The intelligent design issue is complex, but he's said outright that he doesn't believe in it. I think his position is something like "Most people who believe evolution are not smart enough to understand it, and would be better off believing intelligent design since it makes more sense on a naive level. Most believers in evolution who are not biologists are making the 'science as belief-attire' type mistake." It's been a while since I read about that particular flame war, so I might be mistaken, but I do remember he specifically said that with extremely high probability ID was wrong.
Hypnotism has been shown to work in studies by the AMA, BMJ, and every other group of medical experts who have investigated the question, and he's a vegetarian, not a vegan - and so am I, so you're going to have trouble convincing me that's a strike against him. Though if you want to write a post about it, I'd be interested in hearing your arguments against.
As far as I can remember, Scott recognizes the overwhelming evidence that supports evolution, but thinks that our current understanding of how it acts is incomplete. I think he actually made a wager that in the next few years, we'd discover new evidence that rewrites a lot of what we know about evolution - but don't quote me on that.
His reasoning is that evolution currently sets off his 'bullshit meter', another way he uses his subconscious. It doesn't sound like the right explanation to him.
Of course, Scott Adams is about as qualified to discuss the merits of evolution as my cat, so I'm not sure I'd rely on his subconscious for guidance on this issue.
His vegetarianism, as far as I know, is simply because he gets terrible stomachaches whenever he eats meat.
I wish this was separated into two comments, since I wanted to downvote the first paragraph, and upvote the second.
Some of the smartest and most epistemically rational people I know are vegan. They simply do not want to support what they consider the unnecessary cruelty to animals involved in modern food production.
I think the title is misleading, because this is not about convincing others but about changing yourself. A better title would be "What I tell myself every night becomes true" or something entirely different.
Well, (a) it's an excellent literary reference, and (b) let "I" be the conscious mind and "you" be the subconscious, and there you are.
I'm familiar with his ideas about affirmations. I actually read about them in a book of his a long time ago and tried them out. I was struggling in math class in high school and used affirmations to improve my grades. I actually got higher grades soon thereafter. Adams believes that affirmations work because of "Chaos Theory", which is of course nonsense. I think a lot of gimmicks like affirmations actually work, but that it's mostly due to the placebo effect.
"The human brain evidently operates on some variation of the famous principle enunciated in 'The Hunting of the Snark': 'What I tell you three times is true.'"
-- Norbert Wiener, from Cybernetics
Ask for the name of a high-profile rationalist, and you'll hear about Richard Dawkins or James Randi or maybe Peter Thiel. Not a lot of people would immediately name Scott Adams, creator of Dilbert. But as readers of his blog know, he's got a deep interest in rationality, and sometimes it shows up in his comics: for example, this one from last week. How many people can expose several million people to the phrase "Boltzmann brain hypothesis" and have them enjoy it?
So I was very surprised to find Adams was a believer in and evangelist of something that sounded a lot like pseudoscience. "Affirmations" are positive statements made with the belief that saying the statement loud enough and long enough will help it come true. For example, you might say "I will become a syndicated cartoonist" fifteen times before bed every night, thinking that this will in fact make you a syndicated cartoonist. Adams partially credits his success as a cartoonist to doing exactly this.
He admits "it sounds as if I believe in some sort of voodoo or magic", and acknowledges that "skeptics have suggested, and reasonably so, that this is a classic case of selective memory" but still swears that it works. He also has "received thousands of e-mails from people recounting their own experiences with affirmations. Most people seem to be amazed at how well they worked."
None of this should be taken too seriously without a controlled scientific study investigating it, of course. But is it worth the effort of a study, or should it be filed under "so stupid that it's not worth anyone's time to investigate further"?
I think there's a good case to be made from within a rationalist/scientific worldview that affirmations may in fact be effective for certain goals. Not miraculously effective, but not totally useless either.
To build this case, I want to provide evidence for two propositions. First, that whether we subconsciously believe we can succeed affects whether or not we succeed. Second, that repeating a statement verbally can make the subconscious believe it.
The link between belief in success and success has progressed beyond the motivational speaker stage and into the scientific evidence stage. The best-known of these links is the placebo effect. For certain diseases, believing that you will get better does increase your probability of getting better. This works not only subjectively (i.e. you feel less pain) but objectively (i.e. ulcers heal more quickly, inflammation decreases).
The placebo effect applies in some stranger cases outside simple curative drugs. A placebo stop-smoking pill does increase your chance of successfully quitting tobacco. Placebo strength pills enable you to run faster and lift more weight. Placebo alcohol makes you more gregarious and less inhibited.[1] Placebo therapies for phobia can desensitize you to otherwise terrifying stimuli.
There are some great studies about the effect of belief in school settings. Pick a student at random and tell the teacher that she's especially smart, and by the end of the year she will be doing exceptionally well; tell the teacher that she is exceptionally stupid, and by the end of the year she'll be doing exceptionally poorly. The experimenters theorized that the teacher's belief about the student's intelligence was subconsciously detected by the student, and that the student somehow adjusted her performance to fit that belief. In a similar study, minority students were found to do worse on tests when reminded of stereotypes that minorities are stupid, and better when tested in contexts that downplayed their minority status, suggesting that the students' belief that they would fail was enough to make them fail.
Belief can also translate to success when mediated by signals of dominance and confidence. We've already discussed how hard-to-fake signals of confidence can help someone pick up women.[2] Although I don't know of any studies proving that confidence/dominance signals help a businessperson get promoted or a politician get elected, common sense suggests they do. For example, height does have a proven effect in this area, suggesting that our ancestral algorithms for assessing dominance play a major role.
MBlume has already discussed how one cannot simply choose to consciously project dominance signals. The expressions and postures involved are too complicated and too far from the normal domain of conscious control. He suggests using imagination and self-deception to trick the subconscious mind into adopting the necessary role.
So I hope it is not too controversial when I say that subconscious beliefs can significantly affect disease, willpower, physical strength, intelligence, and romantic and financial success.
The second part of my case is that repeating something makes the brain believe it on a subconscious level.
Say Anna Salamon and Steve Rayhawk: "Any random thing you say or do in the absence of obvious outside pressure, can hijack your self-concept for the medium- to long-term future." That's from their excellent post Cached Selves, where they explain that once you say something, even if you don't really mean it, it affects all your beliefs and behaviors afterwards. If you haven't read it yet, read it now: it is one of Less Wrong's growing number of classics.
There's also this study which someone linked me to on Overcoming Bias and to which I keep returning. It demonstrates pretty clearly that we don't have a lot of access to our own beliefs, and tend to make them up based on our behavior. So if I am repeating "I will become a syndicated cartoonist", and my subconscious is not subtle enough to realize I am doing it as part of a complex plot, it might very well assume I am doing it because, well, I think I will become a syndicated cartoonist. And the subconscious quite likes to keep beliefs consistent, so once it "discovers" I have that belief, it may edit whatever it needs to edit to become more consistent with it.
There have been a few studies vaguely related to affirmations. One that came out just a few weeks ago found that minorities who wrote 'value affirmation' essays did significantly better in school (although the same effect did not apply to Caucasians). Another found that some similar sort of 'value affirmation' decreased stress as measured in cortisol and other physiological measures. But AFAIK no one's ever done a study on Adams-variety simple personal affirmations in all of their counter-intuitive weirdness, probably because it sounds so silly, and I think that's a shame. If they work, they're a useful self-help technique and akrasia-buster. If they don't work, that blocks off a few theories about how the mind works and helps us start looking for alternatives.
Footnotes
1: A story I like: in one of the studies that discovered the placebo effect for alcohol, one of the male participants who falsely believed he was drunk tried to cop a feel of a female researcher's breasts. That must have been the most awkward debriefing ever.
2: Here I'm not just making my usual mistake and being accidentally sexist; I really mean "pick up women". There is less research suggesting the same thing works on men. See Chapter 6 of The Adapted Mind, "The Evolution of Sexual Attraction: Evaluative Mechanisms in Women".