Just thought you guys should know about this: some work arguing that humans should not enhance their intelligence with technology, and that superintelligence will probably never evolve.
Actual paper title from scientific journal: Why Aren't We Smarter Already: Evolutionary Trade-Offs and Cognitive Enhancements
Corresponding article headline: Human Brains Unlikely to Evolve Into a 'Supermind' as Price to Pay Would Be Too High
Actual paper title from scientific journal: Influence of Incubation Temperature on Morphology, Locomotor Performance, and Early Growth of Hatchling Wall Lizards (Podarcis muralis)
Projected future article headline: Killer 'Godzilla' Lizard Race Larger than Skyscrapers Unlikely to Arise because Global Warming Heats Eggs. All Forms of Genetic Engineering Therefore Impossible
Walking on land is probably impossible, Pre-Cambrian researchers announced, since even if we did evolve some sort of "legs" our gills would be unable to extract oxygen from the environment.
"Today we're at the beach, and yesterday we climbed the tallest mountain in the world," proclaimed the researchers over the phone during their celebratory Florida vacation. "Sugarloaf Mountain), 95 meters above sea level. No matter which direction by the compass you walk from the summit, it's down!"
The main argument appears to be that, on average, higher intelligence implies a higher rate of mental disorders such as autism and Asperger's syndrome. I don't see how this relates to humans "making themselves smarter": presumably, if we have the technology to improve our brains, we'll also be able to get rid of the nasty side effects introduced by the alien god that's been improving it for us thus far.
There's also the claim that anything which improves one aspect of mental performance is likely to harm another. This isn't massively surprising to me: you'd expect that if upping a single hormone level or whatever simply improved performance overall, then evolution would have 'found' it already. But presumably the same is true of giving performance-enhancing drugs to less intelligent animals - or, for that matter, giving people steroids etc. to increase their physical performance.
But just because drugs to make you run faster might lower your life expectancy, that doesn't mean our current running speed is the best evolution or technology can achieve. The problem is that any complex adaptation, like intelligence, is going to sit at a 'sweet spot', in the sense that a large random change to any single factor will make it less successful. That doesn't mean that evolution, or potentially much more sophisticated technological enhancement, can't improve matters.
Also, the 'something's going to get worse' principle only holds if what we consider bad is the same as what evolution selects against. It could in principle be true that humans would become much more intelligent if they lost something that made them capable of defending themselves, reproducing, making allies or whatever. If our aims are different from what benefits our genes' survival, we may well be able to improve on nature: as we do with artificial sweeteners, sex with condoms and other cunning tricks.
There's also the issue of local maxima: it's possible (probable, even) that there are ways to make humans smarter through evolution, but the intermediate steps perform poorly, so selection never crosses the valley to reach them.
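A minimal sketch of that local-maximum point, on a made-up one-dimensional fitness landscape (the function and numbers are purely illustrative, not from the paper): greedy step-by-step improvement - a crude stand-in for selection acting on small mutations - stalls at the nearby peak even though a taller one exists, because every path to it passes through worse intermediates.

```python
from math import exp

def fitness(x: float) -> float:
    """Made-up fitness landscape: a low peak near x=1, a taller one near x=4."""
    return exp(-(x - 1.0) ** 2) + 2.0 * exp(-(x - 4.0) ** 2)

def hill_climb(x: float, step: float = 0.05, iters: int = 1000) -> float:
    """Accept a small change only if it strictly improves fitness."""
    for _ in range(iters):
        for candidate in (x + step, x - step):
            if fitness(candidate) > fitness(x):
                x = candidate
                break
    return x

x_final = hill_climb(0.0)  # start on the slope of the lower peak
print(f"stuck at x = {x_final:.2f}, fitness = {fitness(x_final):.2f}")
print(f"taller peak at x = 4.00, fitness = {fitness(4.0):.2f}")
```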
Diversity of a population plays a role too. If I'm well below Feynman level (and I am), then there's a possibility that I can slightly improve my cognitive abilities without any negative consequences.
My experience with nootropics (racetams) seems to support this, as far as it is possible for anecdotal evidence.
Of course, that assumes that autism should be considered a mental disorder. Many of those on the autism spectrum don't consider it one, whereas most of those with depression or high levels of anxiety do consider their condition a disorder. It looks a lot to me like status quo bias: the implicit reasoning seems to be that if becoming more intelligent would make our minds qualitatively different, then we shouldn't try to become more intelligent.
As a high-functioning autist, I would love for there to be a higher representation of fellow HFAs in the population. Our learning functions would still be different from the baseline population (as it currently exists), but... I feel the world would be a better place if HFAs made up as much as 10% of the population. Beyond that, I'm uncertain.
It seems to me that some of the "high-functioning" / "low-functioning" autism distinction actually has to do with the comorbidity of various other disorders and disabilities, as well as with the quality of schooling and other care. There seem to be a number of autistic folks whose lives are complicated by PTSD from bad psychiatric care, institutionalization, abusive schooling situations, etc. Presumably, if ASD were more common and better understood, these complications would be less likely.
Then again, defining disorders by self-reporting isn't that much more accurate than going with "any mental condition considered weird by society".
"any mental condition considered weird by the society".
But that is how a lot of mental disorders are defined. See: attempts to medicalise non-heterosexuality.
It's not really interesting - from the summaries, it isn't adding anything new to Algernon's Law except perhaps some more detailed examples.
(Nonetheless, I did request a copy. Might be useful.)
On the evolution of intelligence bit, he's probably right.
On the enhancement of intelligence part, he's not... entirely wrong. Given a fixed energy budget (i.e., the evolved environment), it seems reasonable that you can't improve the brain very much with gross chemical intervention (which is what he's talking about, in context). Of course, the energy budget isn't necessarily fixed, but it's still interesting.
Surely this depends on what you mean by 'improve the brain'. You might be able to make it better at things you consider important, by undermining things that the evolutionary environment deems important.
You could also trade off things that were more important in the ancestral environment than they are now. For example, social status (to which the neurotypical brain devotes much of its resources) is no longer the evolutionary advantage that it used to be.
You two realize you are just reinventing Bostrom's EOCs (evolutionary optimality challenges), right?
People, I wrote a thorough essay all about this! If I left something out, just tell me - you don't need to reinvent the wheel!
(This goes for half the comments on this page.)
Hmmm... the research smells a bit of status quo bias, no? I always think of Richard Feynman when thinking about super-intelligent people who are not socially awkward -- if we could raise average intelligence to his level, wouldn't that improve the world?
Whether there's an evolutionary (i.e., reproductive) advantage is less clear - but humans are not limited by that.
Copy jailbroken: http://www2.warwick.ac.uk/fac/sci/psych/people/academic/thills/thills/hillspublications/hillshertwig2011cdps.pdf
I've read it, and it doesn't cite Bostrom; as one would expect, that means it's largely a retread of Bostrom's paper and not very useful. The main contribution of the paper, for me, is that it includes one or two useful examples I hadn't covered, and it includes some simple math models showing how U-shaped curves can fall out of optimizing for multiple properties.
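As a minimal sketch of that last point (my own toy construction, not the model from the paper): let a single underlying parameter help one objective with diminishing returns while hurting another at an accelerating rate. Net performance then has an interior optimum, and pushing the parameter past it - the naive "enhancement" - makes things worse.

```python
import numpy as np

x = np.linspace(0.0, 5.0, 501)      # hypothetical "dose" of some trait or drug
benefit = 1.0 - np.exp(-x)          # helps one objective, with diminishing returns
cost = 0.1 * x ** 2                 # hurts another objective, at an accelerating rate
performance = benefit - cost        # net effect: an inverted U over x

best = x[np.argmax(performance)]
print(f"interior optimum near x = {best:.2f}")
print(f"performance at optimum: {performance.max():.3f}")
print(f"performance at x = 5.0: {performance[-1]:.3f}")  # over-'enhanced', worse off
```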
EDIT: I emailed the main author a link to Bostrom's paper; he replied:
I think the arguments are actually quite similar, but from a slightly different perspective. We're both arguing that enhancement is possible, but that an understanding of the evolutionary and cognitive constraints is needed. We further add a bit on the kinds of domains where such trade-offs are most likely.
Seems stupid to me. If you look at human height, you see the problems 7-foot-tall humans have. And yet we have the giraffe...