Instead, I mean that when the risk is death of oneself or others, the risk is so high as to outweigh most, if not all, rewards.
And how does one judge when the reward is outweighed? Hm, that wouldn't be subjective, would it...?
If one gets better at balancing those things during development, i.e. growing up, that reveals that one previously lacked skill in balancing them, which is a matter of judgement - and poor judgement is one form of overconfidence.
Never reason from a price change. Are teenagers overconfident about how much daytime sleep they need, and their circadian rhythm shifts due to them getting "better at balancing impulse, desire, goals, self-interest, rules, ethics, and even altruism, generating behavior that is more complex and, sometimes at least, more sensible"?
Dobbs continues: "If offered an extra reward, however, teens showed they could push those executive regions to work harder, improving their scores."
This draws out a reason why they underestimate risk: if the reward isn't "extra", or isn't perceived as higher by the teen, then they likely won't push those executive regions to work harder, as is revealed in the article's previous paragraph, which I haven't quoted.
This practically demonstrates the opposite: that the reward is the important part! What rewards do experimenters usually offer? Pretty lousy ones. Why is it surprising that teens might not work as hard as conscientious saps - I mean, mature adults? (I am reminded of how money rewards can improve IQ test performance by half a standard deviation or so.) This is like the usual criticism of the PISA test scores: of course Americans will underperform, since not a single one of the test-takers cares about what score they get. Incentives matter, which is what I've been saying all along.
This can actually be interpreted either way. Steinberg chooses a higher reward, whilst one can just as reasonably choose an underestimation of risk. A rationale for the latter is alluded to in the article: social feelings and thoughts are more salient for teenagers, so the brain is more focussed on social cognition than on risk estimation (as the risk-estimation processes require more effort) – fewer brain cycles spent on risk estimation, hence risk underestimation.
Wow. So your explanation for a clear-cut reward link is... they get distracted and can't estimate risk as accurately.
When I first read the words above—on August 1st, 2003, at around 3 o’clock in the afternoon—it changed the way I thought. I realized that once I could guess what my answer would be—once I could assign a higher probability to deciding one way than the other—then I had, in all probability, already decided. We change our minds less often than we think. And most of the time we become able to guess what our answer will be within half a second of hearing the question.
How swiftly that unnoticed moment passes, when we can’t yet guess what our answer will be; the tiny window of opportunity for intelligence to act. In questions of choice, as in questions of fact.
The principle of the bottom line is that only the actual causes of your beliefs determine your effectiveness as a rationalist. Once your belief is fixed, no amount of argument will alter the truth-value; once your decision is fixed, no amount of argument will alter the consequences.
You might think that you could arrive at a belief, or a decision, by non-rational means, and then try to justify it, and if you found you couldn’t justify it, reject it.
But we change our minds less often—much less often—than we think.
I’m sure that you can think of at least one occasion in your life when you’ve changed your mind. We all can. How about all the occasions in your life when you didn’t change your mind? Are they as available, in your heuristic estimate of your competence?
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera, et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it’s probably going to stay there.
1. Dale Griffin and Amos Tversky, “The Weighing of Evidence and the Determinants of Confidence,” Cognitive Psychology 24, no. 3 (1992): 411–435.