
Selecting Rationalist Groups

35 Eliezer_Yudkowsky 02 April 2009 04:21PM

Previously in series: Purchase Fuzzies and Utilons Separately
Followup to: Conjuring an Evolution To Serve You

GreyThumb.blog offered an interesting comparison between poor animal breeding practices and the fall of Enron, which I previously posted on in some detail.  The essential theme was that individual selection, breeding in each generation from the chickens who laid the most eggs, produced highly competitive chickens: the most dominant chickens, who pecked their way to the top of the pecking order at the expense of other chickens.  The chickens subjected to this individual selection for egg-laying prowess needed their beaks clipped, or housing in individual cages, or they would peck each other to death.

Which is to say: individual selection is selecting on the wrong criterion, because what the farmer actually wants is high egg production from groups of chickens.

While group selection is nearly impossible in ordinary biology, it is easy to impose in the laboratory: breeding the best groups, rather than the best individuals, increased average hen survival from 160 to 348 days, and average egg mass per bird from 5.3 to 13.3 kg.

The analogy being to the way that Enron evaluated its employees every year, fired the bottom 10%, and gave the top individual performers huge raises and bonuses.  Jeff Skilling fancied himself as exploiting the wondrous power of evolution, it seems.

If you look over my accumulated essays, you will observe that the art contained therein is almost entirely individual in nature... for around the same reason that it all focuses on confronting impossibly tricky questions:  That's what I was doing when I thought up all this stuff, and for the most part I worked in solitude.  But this is not inherent in the Art, not reflective of what a true martial art of rationality would be like if many people had contributed to its development along many facets.

Case in point:  At the recent LW / OB meetup, we played Paranoid Debating, a game that tests group rationality.  As is only appropriate, this game was not the invention of any single person, but was collectively thought up in a series of suggestions by Nick Bostrom, Black Belt Bayesian, Tom McCabe, and steven0461.

continue reading »

The Baby-Eating Aliens (1/8)

42 Eliezer_Yudkowsky 30 January 2009 12:07PM

(Part 1 of 8 in "Three Worlds Collide")

This is a story of an impossible outcome, where AI never worked, molecular nanotechnology never worked, biotechnology only sort-of worked; and yet somehow humanity not only survived, but discovered a way to travel Faster-Than-Light:  The past's Future.

Ships travel through the Alderson starlines, wormholes that appear near stars.  The starline network is dense and unpredictable: more than a billion starlines lead away from Sol, but every world explored is so far away as to be outside the range of Earth's telescopes.  Most colony worlds are located only a single jump away from Earth, which remains the center of the human universe.

From the colony system Huygens, the crew of the Giant Science Vessel Impossible Possible World have set out to investigate a starline that flared up with an unprecedented flux of Alderson force before subsiding.  Arriving, the Impossible discovers the sparkling debris of a recent nova - and -

"ALIENS!"

Every head swung toward the Sensory console.  But after that one cryptic outburst, the Lady Sensory didn't even look up from her console: her fingers were frantically twitching commands.

There was a strange moment of silence in the Command Conference while every listener thought the same two thoughts in rapid succession:

Is she nuts?  You can't just say "Aliens!", leave it at that, and expect everyone to believe you.  Extraordinary claims require extraordinary evidence -

And then,

They came to look at the nova too!

continue reading »

Contaminated by Optimism

10 Eliezer_Yudkowsky 06 August 2008 12:26AM

Followup to: Anthropomorphic Optimism, The Hidden Complexity of Wishes

Yesterday, I reprised in further detail The Tragedy of Group Selectionism, in which early biologists believed that predators would voluntarily restrain their breeding to avoid exhausting the prey population; the given excuse was "group selection".  Not only does it turn out to be nearly impossible for group selection to overcome a countervailing individual advantage; but when these nigh-impossible conditions were created in the laboratory - group selection for low-population groups - the actual result was not restraint in breeding, but, of course, cannibalism, especially of immature females.

I've made even sillier mistakes, by the way - though about AI, not evolutionary biology.  And the thing that strikes me, looking over these cases of anthropomorphism, is the extent to which you are screwed as soon as you let anthropomorphism suggest ideas to examine.

In large hypothesis spaces, the vast majority of the cognitive labor goes into noticing the true hypothesis.  By the time you have enough evidence to consider the correct theory as one of just a few plausible alternatives - to represent the correct theory in your mind - you're practically done.  Of this I have spoken several times before.
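As a minimal sketch of the underlying arithmetic (the 2^30 hypothesis-space size below is my own illustrative assumption, not a figure from the post), merely narrowing a large space down to a handful of candidates already consumes nearly all the bits of evidence needed to single out the truth:

```python
import math

# Illustrative numbers only: suppose the hypothesis space contains
# 2**30 mutually exclusive candidate theories.
total_hypotheses = 2 ** 30

# Singling out the one true hypothesis takes log2(2**30) = 30 bits of evidence.
bits_to_locate_truth = math.log2(total_hypotheses)

# Merely narrowing the field to 4 surviving candidates already takes
# log2(2**30 / 4) = 28 of those 30 bits.
bits_to_narrow_to_four = math.log2(total_hypotheses / 4)

print(f"bits to single out the truth:         {bits_to_locate_truth:.0f}")
print(f"bits spent narrowing to 4 candidates: {bits_to_narrow_to_four:.0f}")
print(f"fraction of the work already done:    {bits_to_narrow_to_four / bits_to_locate_truth:.0%}")
```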

And by the same token, my experience suggests that as soon as you let anthropomorphism promote a hypothesis to your attention, so that you start wondering if that particular hypothesis might be true, you've already committed most of the mistake.

continue reading »

Anthropomorphic Optimism

26 Eliezer_Yudkowsky 04 August 2008 08:17PM

Followup to: Humans in Funny Suits, The Tragedy of Group Selectionism

The core fallacy of anthropomorphism is expecting something to be predicted by the black box of your brain, when its causal structure is so different from that of a human brain as to give you no license to expect any such thing.

The Tragedy of Group Selectionism (as previously covered in the evolution sequence) was a rather extreme error by a group of early (pre-1966) biologists, including Wynne-Edwards, Allee, and Brereton among others, who believed that predators would voluntarily restrain their breeding to avoid overpopulating their habitat and exhausting the prey population.

The proffered theory was that if there were multiple, geographically separated groups of e.g. foxes, then the groups of foxes that best restrained their breeding would send out colonists to replace crashed populations.  And so, over time, group selection would promote restrained-breeding genes in foxes.

I'm not going to repeat all the problems that developed with this scenario. Suffice it to say that there was no empirical evidence to start with; that no empirical evidence was ever uncovered; that, in fact, predator populations crash all the time; and that it turned out to be very nearly mathematically impossible for group selection pressure to overcome a countervailing individual selection pressure.

The theory having turned out to be completely incorrect, we may ask if, perhaps, the originators of the theory were doing something wrong.

continue reading »

A Failed Just-So Story

11 Eliezer_Yudkowsky 05 January 2008 06:35AM

Followup to: Rational vs. Scientific Ev-Psych, The Tragedy of Group Selectionism, Evolving to Extinction

Perhaps the real reason that evolutionary "just-so stories" got a bad name is that so many attempted stories are prima facie absurdities to serious students of the field.

As an example, consider a hypothesis I've heard a few times (though I didn't manage to dig up an example).  The one says:  Where does religion come from?  It appears to be a human universal, and to have its own emotion backing it - the emotion of religious faith.  Religion often involves costly sacrifices, even in hunter-gatherer tribes - why does it persist?  What selection pressure could there possibly be for religion?

So, the one concludes, religion must have evolved because it bound tribes closer together, and enabled them to defeat other tribes that didn't have religion.

This, of course, is a group selection argument - an individual sacrifice for a group benefit - and see the referenced posts if you're not familiar with the math, simulations, and observations which show that group selection arguments are extremely difficult to make work.  For example, a 3% individual fitness sacrifice which doubles the fitness of the tribe will fail to rise to universality, even under unrealistically liberal assumptions, if the tribe size is as large as fifty.  Tribes would need to have no more than 5 members if the individual fitness cost were 10%.  You can see at a glance from the sex ratio in human births that, in humans, individual selection pressures overwhelmingly dominate group selection pressures.  This is an example of what I mean by prima facie absurdity.
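For readers who want to poke at the math themselves, here is a rough island-model simulation sketch in Python.  The model form, the migration rate, and every parameter value are my own illustrative assumptions rather than the simulations the post refers to; it is only meant to show qualitatively how a small individual cost can defeat a large group benefit once group size grows.

```python
import random

def simulate(group_size=50, num_groups=200, cost=0.03, group_benefit=2.0,
             initial_freq=0.5, generations=500, migration=0.05):
    """Island-model sketch: altruists pay an individual fitness cost,
    while groups with more altruists found more daughter groups."""
    # Each group is represented by its count of altruists.
    groups = [sum(random.random() < initial_freq for _ in range(group_size))
              for _ in range(num_groups)]
    for _ in range(generations):
        # Group-level fitness rises linearly from 1.0 (no altruists)
        # to group_benefit (all altruists).
        weights = [1.0 + (group_benefit - 1.0) * a / group_size for a in groups]
        global_freq = sum(groups) / (num_groups * group_size)
        new_groups = []
        for _ in range(num_groups):
            # Daughter groups descend from parents chosen in proportion
            # to group fitness (the group-selection pressure).
            parent = random.choices(groups, weights=weights)[0]
            freq = parent / group_size
            # Within the parent group, altruists pay `cost`, shrinking
            # their share of offspring (the individual-selection pressure).
            freq = freq * (1 - cost) / (freq * (1 - cost) + (1 - freq))
            # Migration mixes in the population-wide altruist frequency.
            freq = (1 - migration) * freq + migration * global_freq
            new_groups.append(sum(random.random() < freq for _ in range(group_size)))
        groups = new_groups
    return sum(groups) / (num_groups * group_size)

if __name__ == "__main__":
    print("final altruist frequency:", round(simulate(), 3))
```

With these assumed settings, the within-group pressure against the costly trait tends to outweigh the between-group advantage, so one would expect the altruist frequency to decline rather than rise to universality; shrinking the group size or raising the group benefit shifts the balance.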

continue reading »

The Hidden Complexity of Wishes

60 Eliezer_Yudkowsky 24 November 2007 12:12AM

Followup to: The Tragedy of Group Selectionism, Fake Optimization Criteria, Terminal Values and Instrumental Values, Artificial Addition, Leaky Generalizations

"I wish to live in the locations of my choice, in a physically healthy, uninjured, and apparently normal version of my current body containing my current mental state, a body which will heal from all injuries at a rate three sigmas faster than the average given the medical technology available to me, and which will be protected from any diseases, injuries or illnesses causing disability, pain, or degraded functionality or any sense, organ, or bodily function for more than ten days consecutively or fifteen days in any year..."
            -- The Open-Source Wish Project, Wish For Immortality 1.1

There are three kinds of genies:  Genies to whom you can safely say "I wish for you to do what I should wish for"; genies for which no wish is safe; and genies that aren't very powerful or intelligent.

continue reading »

Conjuring An Evolution To Serve You

40 Eliezer_Yudkowsky 19 November 2007 05:55AM

GreyThumb.blog offers an interesting analogy between research on animal breeding and the fall of Enron.  Before 1995, the way animal breeding worked was that you would take the top individual performers in each generation and breed from them, or their parents.  A cockerel doesn't lay eggs, so you have to observe daughter hens to determine which cockerels to breed.  Sounds logical, right?  If you take the hens who lay the most eggs in each generation, and breed from them, you should get hens who lay more and more eggs.

Behold the awesome power of making evolution work for you!  The power that made butterflies - now constrained to your own purposes!  And it worked, too.  Per-cow milk output in the US doubled between 1905 and 1965, and has doubled again since then.

Yet conjuring Azathoth oft has unintended consequences, as some researchers realized in the 1990s.  In the real world, sometimes you have more than one animal per farm.  You see the problem, right?  If you don't, you should probably think twice before trying to conjure an evolution to serve you - magic is not for the unparanoid.

continue reading »

Fake Optimization Criteria

30 Eliezer_Yudkowsky 10 November 2007 12:10AM

Followup to:  Fake Justification, The Tragedy of Group Selectionism

I've previously dwelt at considerable length upon forms of rationalization whereby our beliefs appear to match the evidence much more strongly than they actually do.  And I'm not overemphasizing the point, either.  If we could beat this fundamental metabias and see what every hypothesis really predicted, we would be able to recover from almost any other error of fact.

The mirror challenge for decision theory is seeing which option a choice criterion really endorses.  If your stated moral principles call for you to provide laptops to everyone, does that really endorse buying a $1 million gem-studded laptop for yourself, or spending the same money on shipping 5000 OLPCs?

We seem to have evolved a knack for arguing that practically any goal implies practically any action.  A phlogiston theorist explaining why magnesium gains weight when burned has nothing on an Inquisitor explaining why God's infinite love for all His children requires burning some of them at the stake.

There's no mystery about this.  Politics was a feature of the ancestral environment.  We are descended from those who argued most persuasively that the good of the tribe meant executing their hated rival Uglak.  (We sure ain't descended from Uglak.) 

continue reading »

The Tragedy of Group Selectionism

36 Eliezer_Yudkowsky 07 November 2007 07:47AM

Before 1966, it was not unusual to see serious biologists advocating evolutionary hypotheses that we would now regard as magical thinking.  These muddled notions played an important historical role in the development of later evolutionary theory, error calling forth correction; like the folly of English kings provoking into existence the Magna Carta and constitutional democracy.

As an example of romance, Vero Wynne-Edwards, Warder Allee, and J. L. Brereton, among others, believed that predators would voluntarily restrain their breeding to avoid overpopulating their habitat and exhausting the prey population.

But evolution does not open the floodgates to arbitrary purposes.  You cannot explain a rattlesnake's rattle by saying that it exists to benefit other animals who would otherwise be bitten.  No outside Evolution Fairy decides when a gene ought to be promoted; the gene's effect must somehow directly cause the gene to be more prevalent in the next generation.  It's clear why our human sense of aesthetics, witnessing a population crash of foxes who've eaten all the rabbits, cries "Something should've been done!"  But how would a gene complex for restraining reproduction—of all things!—cause itself to become more frequent in the next generation?

continue reading »