Nobel Prizes normally can't be awarded posthumously. Since there is often a substantial time lapse between the discovery and the prize, scientists who make prize-worthy discoveries late in life may never receive one. This could produce a bias towards younger people receiving Nobels.
Take a recent Nobel in Chemistry: the research behind graphene dates back to the '60s or so. Don't such big lags imply that only old people will receive Nobels? Seems so to me.
I think your wording is off. You must mean 'a bias towards people who were younger (at the time of the discovery) receiving Nobels'.
I've often observed that my ability to think creatively disappears after spending enough time having the noncreative normal way ground into me.
For example, I remember that on my first day on a job I noticed a bunch of ways the company was doing things inefficiently that could be better. After doing it the company's way for a year or two, the company's system seemed so natural that it didn't seem like there was anything wrong with it. But when I remembered some of the things I'd told people that first day, they still seemed like good ideas, even though I was no longer able to spontaneously generate them.
Likewise, I think being in a field for a long time etches the paradigm into your brain so deeply that it inhibits your ability to think outside of it.
I think this probably works alongside any changes that might happen simply due to age. I'd like to see a study comparing the creativity of old people who are just joining a new field, versus relatively young people who have been in the field their whole lives.
A lot of people seem to be creative in less technical fields later in life after switching into them from a more technical field in which their creativity had somewhat dried up.
If you're talking about the creativity that works within a paradigm, then chronological age doesn't matter - it's only the amount of time that you've spent studying the field that matters. A person who enters a field at 50 shows a career trajectory similar to that of a person who enters it at 20.
If you're talking about paradigm-busting creativity, then I'm not aware of other studies that would have made the inside/outside-paradigm distinction. (Which isn't to say that they might not exist, of course.)
Why should we combat this at all? Why must everyone at all times be involved in developing paradigm-busting innovations? Why can't the task be delegated to those most capable of doing the task?
What if there's a paradigm-busting innovation that requires a lot of experience to see? Newton only had to invent first-year undergraduate stuff, as the saying goes.
Furthermore, our present society does very poorly at exploiting this resource of youth, so if we're going to explicitly rely on delegation to youth, we had better make more effort to stuff the young full of useful knowledge as fast as possible, without wasting any time, so they can start their useful research lifespans at age 18 rather than being sentenced to slow university until age 30.
It seems to me that the limiting factor is usually the demand for such innovations, not their supply. There are usually lots of proposals for paradigm-busting innovations relative to the capacity to explore or test them. Until we can expand this capacity, it is hard to get very worked up over our not having an even larger supply of such proposals.
The result isn't that claimed / attempted paradigm-busting is limited to youth, but that successful paradigm-busting seems to start when young. I don't think we have an oversupply of paradigm-busters that are as good as they could get.
(If there's somewhere you go to get a reliable supply of these things, I have a large order to place with respect to certain areas of mathematical logic...)
*blinks, stops* I had never thought of it that way.
However, if we aren't only talking about doing science but also about trying to become more rational, then it seems that the same process that makes us less likely to come up with paradigm-busting innovations would also strengthen the effect of any biases we have. That includes not only biases directly related to science, but also things distorting our evaluation of e.g. moral/ethical questions. While in science we can just let people specialize, in moral questions we'd like as many people as possible to be capable of thinking straight. Every single person who has a distorted view of ethical questions can do harm in their daily life, while the amount of harm done by a single misguided scientist is smaller.
Also, populations in most countries are currently growing older, which implies that if we want to keep progressing as fast as we have been, we may need to counteract the effect by getting even older scientists to produce more paradigm-busting innovations.
Can't we think of the youthful lack of organizing mental structures as a bias that distorts their thoughts? Until we know the optimal point along this spectrum, we can't tell which side is biased on net.
Thinking new thoughts (as opposed to cached thoughts) is risky behavior (it might, e.g., make you a crank), but I don't think it can properly be called a bias.
Most thoughts are cached thoughts, or put together from other cached thoughts like Tinkertoys; most new ideas are heard from others rather than invented. Genuinely new thoughts are rare, even if they're less rare in the young than the old. To my mind their rarity increases their value: the ability to invent new thoughts is precious.
In writing fiction I've practiced techniques that reliably induce creativity: brainstorming, freewriting, random association, and so on. These are non-methodical in character; they're not processes you can use to produce a result, but processes that put you in a state that allows you to produce the result. They are basically irrational. Does that mean creativity is a failure mode of rationality, or are there techniques a rationalist can use to produce new thoughts?
Why should society want to combat this at all? Or why should each of us want to combat the decline in ourselves (to whatever extent we have such creative ability)?
Eliezer gave one possible answer for the first question above, but you might be right that delegation is better for society as a whole.
As for the second question though, individuals capable of paradigm-busting innovation will probably do almost anything to keep those creative powers, because instances of successful or partially successful application of those powers in the past have probably been high points of their life.
"As for the second question though, individuals capable of paradigm-busting innovation will probably do almost anything to keep those creative powers"
(No kidding.)
Well, if we make immortality a reality, we'll have fewer young people (unless we find ways to exist more cheaply or obtain more resources). So if we want to carry on busting paradigms at a decent lick, we need to solve this problem.
Old scientists are certainly capable of expanding and building on an existing paradigm, but they are very unlikely to revolutionize the whole paradigm.
I wonder how much of this is due to changes in personality and drive as life becomes a safe job, warm house, favorite chair and comfy slippers. With practice and purpose, could you keep an ability to revolutionize if you kept a real passion for it? From Daniel Dennett (parenthetical credited to Rodolfo Llinas):
The juvenile sea squirt wanders through the sea searching for a suitable rock or hunk of coral to cling to and make its home for life. For this task, it has a rudimentary nervous system. When it finds its spot and takes root, it doesn't need its brain anymore, so it eats it! (It's rather like getting tenure.)
Have you seen this paper: Heilman, Nadeau & Beversdorf, "Creative Innovation: Possible Brain Mechanisms", Neurocase (2003)?
There's a real kicker in the abstract:
"The observation that [creative innovation] occurs during levels of low arousal and that many people with depression are creative suggests that alterations of neurotransmitters such as norepinephrine might be important in [creative innovation]. High levels of norepinephrine, produced by high rates of locus coeruleus firing, restrict the breadth of concept representations and increase the signal to noise ratio, but low levels of norepinephrine shift the brain toward intrinsic neuronal activation with an increase in the size of distributed concept representations and co-activation across modular networks."
Speculative, of course. But we like speculative. Suggested exercise: close the curtains, put on some melancholy music, think grim thoughts, then have a go at a hard problem and see if it's any easier.
Edit: a hard problem requiring creativity, that is.
Ampakines are a new class of compounds known to enhance attention span and alertness, and facilitate learning and memory. ...
Unlike earlier stimulants (e.g. caffeine, methylphenidate (Ritalin), and the amphetamines), ampakines do not seem to have unpleasant, long-lasting side effects such as sleeplessness.
Apparently, only the military is interested in their mind-enhancing effects. Any chemists here interested in a start-up? ;-)
I bet neurofeedback could be used to help people distinguish between ideas originating with memory retrieval and ideas that they are constructing.
I also bet that some decline in creativity has to do with decreasing sleep with age. Older people sleep less. More productive people sleep less. More productive people are less creative.
"More productive people are less creative."
Is this a fact?
I have no idea, but I think there could be a lot more to it than that.
I can agree from my own experience, in the sense that the more time I spend working on tasks, the less time I have for "thinking," particularly "out-of-the-box" thinking.
But I don't know if this is true in general, or at least how many people this would apply to.
I also think it depends on how you define "productivity." If you give more productivity weight to creative ideas, even though they may be less tangible and more sparse, then the statement's not necessarily true.
Well, on the subject of neurofeedback, it might have some relevance to bring this study up: A theory of alpha/theta neurofeedback, creative performance enhancement, long distance functional connectivity and psychological integration.
We'd basically train our capacity to move in and out of the "daydreaming", hypnogogic state. What the scientists found seems to show that this form of neurofeedback enhanced creative performance not only in artistic spheres of activity like dance, piano and singing, but also in technical spheres like science (I think that was just something they posited rather than tested, but I wouldn't be surprised). They attributed this partially to increased emotional stability and confidence, and partially to the altered state of consciousness brought about by the modulation. As with all x-feedback training, after a while this phasing in and out would become second nature and completely under our control, meaning that you wouldn't necessarily be sacrificing one modality of thought for another.
Someone said that they usually get their best insights just when they wake up, as the rest of their brain is lagging behind. I think that's pretty analogous to what's happening (see hypnogogia), and neurofeedback would help people get into this state more often, at will. Sweet deal, methinks.
At first I thought 'Oh no, I'm running out of creative brain time.' Then I thought maybe I could develop/find exercises to augment my creativity. Would I need to start using them now, or could I trust my future self to see the need for them?
But as it is I'm perhaps too good at the "random, unfiltered, and bizarre" for my own economic good.
My best insights have often come early in the morning when I've still been shaking the sleep out of my head, in the free-association state that accompanies awakening. The cognitive superstructure that assembles and deploys cached thoughts, I find, takes somewhat longer to wake up than the rest of my brain, making for periods of enhanced spontaneous creativity. In this state I often feel strong impulses to create artistically as well.
If I could find some drug or meditation technique that could reliably induce this state, I believe I would be a happier and more productive man.
Moderate amounts of alcohol can reliably induce such a state in me. I've been writing down alcohol-induced ideas for a while now, and mine are uniformly pretty good when I examine them the next day.
Related to: Spock's Dirty Little Secret, Does Blind Review Slow Down Science?
After finding out that old scientists don't actually resist change, I decided to do a literature search to find out if the related assumption was true. Is it mainly just the young scientists who are productive? (This should be very relevant for rationalists, since we and scientists in general have the same goal - to find the truth.)
The answer was a pretty resounding no. Study after study after study found that the most productive scientists were those in middle age, not youth. Productivity is better predicted by career age than chronological age. One study suggested that middle-aged scientists aren't more productive as such, but have access to better resources, and that the age-productivity connection disappears once supervisory position is controlled for. Another argued that it was the need for social networking that led the middle-aged to be the most productive. So age, by itself, doesn't seem to affect scientific productivity much, right?
Well, there is one exception. Dietrich and Srinivasan found that paradigm-busting discoveries come primarily from relatively young scientists. They looked at different Nobel Prize winners, finding out the age at which each winner had first had the idea that led to the discovery. In total, 60% of the discoveries were made by people aged below 35, and around 30% were made by people aged between 35 and 45. The data is strongest for theoretical physics: 90% of all theoretical contributions occurred before the age of 40, and no theoretician over the age of 50 had ever had an idea that was deemed worthy of the Nobel Prize. Old scientists are certainly capable of expanding and building on an existing paradigm, but they are very unlikely to revolutionize the whole paradigm. Why is this so?
Actually, this wasn't something that Dietrich just happened to randomly stumble on - he was testing a prediction stemming from an earlier hypothesis of his. In "The Cognitive Neuroscience of Creativity", he presents a view of two kinds of systems for creativity: deliberate and spontaneous (actually four: deliberate/cognitive, deliberate/emotional, spontaneous/cognitive and spontaneous/emotional, but the cognitive/emotional distinction doesn't seem relevant for our purposes). Summarizing the differences relevant to the aging/creativity question:
So, it seems like the older we get, the more likely it is that our thinking is dominated by preconceived ideas. This isn't automatically a bad thing, of course - those "preconceived ideas" are the ones we've been building for our whole lives. But it isn't good if that prevents us from coming up with entirely new yet good ideas. The empirical evidence seems to suggest it does.
What can we do to combat this? Various cognition-affecting drugs are one answer that springs to mind, but many of those are largely both illegal and unsafe. Maybe we should spend more time daydreaming as we get older, or explicitly use our cognitive creativity to try to generate ideas that strike us as senseless at first? But there are far more ideas that both seem and are senseless than there are ideas that seem senseless and actually aren't, so the low hit ratio may be pretty exhausting.