In response to The Mistake Script
Comment author: PaulG 09 March 2009 06:36:49PM 1 point

I don't think I agree with step 3 in the second script (step 4 in the third script). I think that would create a bias against understanding the intricacies of arguments that you agree with, which I'm not comfortable with. Maybe you could just restate it as "If you aren't sure that you agree with the statement, continue reading" or something to that effect.

Comment author: billswift 09 March 2009 01:30:47AM *  5 points

I don't think it is so much magical thinking or superstition as it is that emotions tend to get aligned with beliefs, but it takes time. And to realign your emotions you need to "deprogram" the now-unconscious memories causing the fear. Just how often do you get the opportunity to face the fear of ghosts that you caught from ghost stories and horror movies when you were younger?

I watched too much Twilight Zone, Outer Limits, and so on myself - despite being more at home in the dark than anyone else I have ever met, I still get a visceral feeling that something is about to jump out at me whenever I'm out at night.

Of course, this reinforces (or is reinforced by, depending on how you approach it) the evolutionary fear of the dark, mostly inspired by night-hunting predators.

And this still leaves the problem of the popularity of horror movies.

Comment author: PaulG 09 March 2009 02:38:00AM 7 points

I have to second the idea that it takes time to realign your emotions. I have overcome a number of irrational fears in my life, and they don't usually go away as soon as I realize that they are irrational. For example, after I stopped believing in god, I still felt uncomfortable blaspheming. After I decided that it was OK to eat meat, it took me months before I was actually willing to eat any. And there are countless other situations where I decided, "This is a safe/acceptable activity," and yet still felt a visceral uneasiness about doing it while acclimating to the idea.

Comment author: simpleton 08 March 2009 07:21:47PM 23 points

There's a heuristic at work here which isn't completely unreasonable.

I buy $15 items on a daily basis. If I form a habit of ignoring a $5 savings on such purchases, I'll be wasting a significant fraction of my income. I buy $125 items rarely enough that I can give myself permission to splurge and avoid the drive across town.

The percentage does matter -- it's a proxy for the rate at which the savings add up.

It's also a proxy for the importance of the savings relative to other considerations, which are often proportional to the value of what you're buying. If you were about to sign the papers on a $20000 car purchase, would you walk away at the last minute if you found out that an identical car was available from another dealer for $19995? Would you try to explicitly weigh the $5 against intangibles such as your level of trust in the first dealer compared to the second, or would you be right to regard the $5 as a distraction and ignore it?

Comment author: PaulG 08 March 2009 09:39:55PM 2 points

It seems to me like it shouldn't matter how often you buy the $15 items, technically. Even if you always bought $125 items and never bought $15 items, your heuristic still wouldn't be completely irrational. If you only buy $125 items, you'll only be able to buy 4% more stuff with your income, as compared to 33% more stuff if you always buy $15 items.
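The arithmetic behind this exchange can be made concrete. A minimal sketch, using only the figures from the two comments above (`fraction_saved` is an illustrative helper, not anything from the original discussion):

```python
# The $5 saving as a fraction of the sticker price - roughly how much
# more stuff the same income buys if every such purchase is discounted.

def fraction_saved(price: float, saving: float) -> float:
    """Saving expressed as a fraction of the purchase price."""
    return saving / price

small = fraction_saved(15, 5)    # 5/15  -> about 33%
large = fraction_saved(125, 5)   # 5/125 -> 4%

print(f"$15 items:  {small:.0%} more stuff")
print(f"$125 items: {large:.0%} more stuff")
```

This reproduces the 33% vs. 4% figures in the comment, which is why the percentage works as a proxy for how fast the savings accumulate.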

Comment author: Cameron_Taylor 08 March 2009 12:00:29PM *  1 point

Concussions are a clear all-round negative for brain function. The drugs in question appear to provide some clearly demonstrated benefits. I do not agree that Annoyance's analogy is appropriate.

Comment author: PaulG 08 March 2009 08:59:55PM 3 points

I still think it's the same basic framework. "Benefits" is a highly subjective term. You are still making the same essential decision: is it worth risk X for new experience Y? I agree with you in the sense that I think very few people would actually decide to take a concussion just to experience an altered brain, but that doesn't mean it's not the same type of decision.

And to be fair to your point, although I think his analogy is apt, it is rhetorically misleading in that it implies that since you wouldn't want the concussion, you shouldn't want the drugs. In fact, I think that the asymmetry between his concussion analogy and the drug case serves to demonstrate that the question isn't black-and-white, and that it would be hasty to jump to the conclusion that everyone interested in brain function should rationally take mushrooms.

Comment author: mtraven 07 March 2009 09:46:49PM 1 point

No.

The analogy with a trip to India is not a bad one. You can read all you like about India, but it won't be the same as actually going to Mumbai and experiencing it first-hand. Presumably nobody would claim to be an expert on India without visiting it: the trip isn't that hard to make, and while it is not without risks, the experience is worth it.

Comment author: PaulG 08 March 2009 03:05:05AM 2 points

I disagree here. I think that Annoyance's analogy was apt in that it is the same sort of decision, but with a different cost/benefit analysis. Clearly in both cases (and in the India case) you "should" take the action (get a concussion, take some drugs) if you think that the cost of taking the action is less than the potential benefit.

I do agree with you, however, in the sense that I imagine that most people consider the net benefit of taking drugs at least once to be more in line with a trip to India than with a damaged brain.

Comment author: PaulG 08 March 2009 12:53:37AM 4 points

I wonder if there's some selection bias inherent in the studies presented here. Assuming that it has been established that older scientists are more willing to accept new controversial hypotheses than younger scientists, has it also been established that they differentially accept good new controversial hypotheses? What I see here is that they tended to embrace the big paradigm shifts relatively early, but that says nothing about older scientists' tendency to embrace controversial hypotheses that were later discredited. In particular, Linus Pauling's obsession with vitamin C megadosing later in life springs to mind.

Comment author: Eliezer_Yudkowsky 07 March 2009 11:51:29PM 7 points

Excellent post - it makes me wish that the system gave out a limited number of super-votes, like 1 for every 20 karma, so that I could vote this up twice.

I hope you don't mind, but I did a quick edit to insert "a choice of" before "two drugs to test", because that wasn't clear on my first reading. (Feel free to revert if you prefer your original wording.) Also edited the self-deception tag to self_deception per previous standard.

Comment author: PaulG 08 March 2009 12:32:03AM *  3 points

The idea of super-votes sounds similar to the system they have at everything2, where users are awarded a certain number of "upvotes" and a certain number of "cools" every day, depending on their level. An upvote/downvote adds/subtracts one point from their equivalent of karma for the post, while a cool gives the author a certain number of points, displays the post as "Cooled", and promotes it to the main page.

(I reposted this as a reply because I was unfamiliar with the posting system when I first wrote it.)
