gwern, it looks like you've been missing a particular point.
The article reveals that reward is why the teenagers underestimate risk. The article reveals that teens' perception of reward motivates their impulsiveness.
Incentives matter, which is what I've been saying all along.
Indeed, that's a point I agree with and have held right from my very first rebuttal in this discussion. The incentives provide motivation for underestimating risk.
Wow. So your explanation for a clear-cut reward link is... they get distracted and can't estimate risk as accurately.
That's one way of summarising what the article is proposing.
The article reveals that reward is why the teenagers underestimate risk. The article reveals that teens' perception of reward motivates their impulsiveness.
No. The value of a decision is gain minus cost; if the cost remains the same but the gain increases, then that can swing the value of a decision from negative to positive. Thus, they can be more impulsive while maintaining the same beliefs about risk.
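The gain-minus-cost point can be made concrete with a small sketch. This is illustrative only: the numbers and the `decision_value` function are hypothetical, chosen just to show how a bigger perceived reward can flip a decision from negative to positive value while the risk estimate stays exactly the same.

```python
# Illustrative sketch (hypothetical numbers): a simple expected-value
# decision rule. The risk estimate stays fixed; only the perceived
# reward changes.

def decision_value(p_harm, cost, gain):
    """Expected value of acting: expected gain minus expected cost."""
    return (1 - p_harm) * gain - p_harm * cost

P_HARM = 0.3   # the teen's (unchanged) estimate of the risk
COST = 100     # the cost if things go wrong

# With a modest perceived reward, acting has negative value...
print(decision_value(P_HARM, COST, gain=20))   # 0.7*20 - 0.3*100 = -16
# ...but a larger perceived reward flips the sign, under the same risk belief.
print(decision_value(P_HARM, COST, gain=60))   # 0.7*60 - 0.3*100 = +12
```

The same beliefs about risk (`p_harm` never changes) thus yield different choices purely because the reward term grew: impulsiveness without any change in risk perception.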
When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than the other—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.
But we change our minds less often—much less often—than we think.
I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.
1. Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.