Less Wrong is a community blog devoted to refining the art of human rationality.

Why Support the Underdog?

35 points | Post author: Yvain | 05 April 2009 12:01AM

One of the strangest human biases is the almost universal tendency to support the underdog.

I say "human" because even though Americans like to identify themselves as particular friends of the underdog, you can find a little of it everywhere. Anyone who's watched anime knows the Japanese have it. Anyone who's read the Bible knows the Israelites had it (no one was rooting for Goliath!). From mythology to literature to politics to sports, it keeps coming up.

I say "universal" because it doesn't just affect silly things like sports teams. Some psychologists did a study where they showed participants two maps of Israel: one showing it as a large country surrounding the small Palestinian enclaves, and the other showing it as a tiny island in the middle of the hostile Arab world. In the "Palestinians as underdogs" condition, 55% said they supported Palestine. In the "Israelis as underdogs" condition, 75% said they supported Israel. Yes, you can change opinion thirty points by altering perceived underdog status. By comparison, my informal experiments trying to teach people relevant facts about the region's history changed opinion approximately zero percent.

(Oh, and the Israelis and Palestinians know this. That's why the propaganda handbooks they give to their respective supporters - of course they give their supporters propaganda handbooks! - specifically suggest the supporters portray their chosen cause as an underdog. It's also why every time BBC or someone shows a clip about the region, they get complaints from people who thought it didn't make their chosen side seem weak enough!)

And there aren't many mitigating factors. Even when the underdog is obviously completely doomed, we still identify with them: witness Leonidas at Thermopylae. Even when the underdog is evil and the powerful faction is good, we can still feel a little sympathy for them; I remember some of my friends and I talking about bin Laden, and admitting that although he was clearly an evil terrorist scumbag, there was still something sort of awesome about a guy who could take on the entire western world from a cave somewhere.

I say "strangest" because I can't make heads or tails of why evolutionary psychology would allow it. Let's say Zug and Urk are battling it out for supremacy of your hunter-gatherer tribe. Urk comes to you and says "Hey, my faction is really weak. We don't have a chance against Zug, who is much stronger than us. I think we will probably be defeated and humiliated, and our property divided up among Zug's supporters."

The purely rational response seems to be "Wow, thanks for warning me, I'll go join Zug's side right now. Riches and high status as part of the winning faction, here I come!"

Now, many of us probably would join Zug's side. But introspection would tell us we were opposing rational calculation on Zug's side to a native, preconscious support for Urk. Why? The native preconscious part of our brain is usually the one that's really good at ending up on top in tribal power struggles. This sort of thing goes against everything it usually stands for.

I can think of a few explanations, none of them satisfying. First, it could be a mechanism to prevent any one person from getting too powerful. Problem is, this sounds kind of like group selection. Maybe the group does best if there's no one dictator, but from an individual point of view, the best thing to do in a group with a powerful dictator is get on that dictator's good side. Any single individual who initiates the strategy of supporting the underdog gets crushed by all the other people who are still on the dictator's team.

Second, it could be a mechanism to go where the rewards are highest. If a hundred people support Zug, and only ten people support Urk, then you have a chance to become one of Urk's top lieutenants, with all the high status and reproductive opportunities that implies if Urk wins. But I don't like this explanation either. When there's a big disparity in faction sizes, you have no chance of winning, and when there's a small disparity in faction sizes, you don't gain much by siding with the smaller faction. And as size differential between groups increases, the smaller faction's chance of success should drop much more quickly than the opportunities for status with the smaller faction should rise.

So I admit it. I'm stumped. What does Less Wrong think?

Comments (86)

Comment author: Nominull 05 April 2009 05:37:29AM 14 points [-]

When you see two enemies fighting, you want them both to use up as many resources as possible. That way, the winner will be easy pickings for you. You accomplish this by supporting whoever is weaker. This is the sort of strategy that pops up in many multiplayer board games.

Comment author: AlanCrowe 05 April 2009 12:46:28PM 16 points [-]

At the Go club, someone asked about using red, green, and blue stones instead of the usual black and white. The chap who is doing a PhD in game theory said: the two weakest players will gang up on the strongest player, "just like any truel."

I was surprised by the way he spoke immediately without being distracted from his own game. Study long enough and hard enough and it becomes automatic: gang up on the stronger.

Now humans have an intuitive grasp of social games, which raises the question: what would that algorithm feel like from the inside? Perhaps it gets expressed as sympathy for the underdog?

It might be possible to test this hypothesis. A truel is a three-player game that turns into a duel after one player has been eliminated. That is why you side with the weaker of your two opponents. The experimental psychologist setting up his experiment can manipulate the framing. If the game-theory idea is correct, sympathy for the underdog should be stronger when the framing primes the idea of a follow-on duel.

For example, if you frame America versus bin Laden as the battle of two totalising ideologies - will the world be dog-eat-dog Capitalist or beard-and-burka Islamic? - that should boost underdog sympathy. If you frame America versus bin Laden as pure tragedy - "Americans just want to stay at home eating, bin Laden really wanted to stay in Mecca praying, how come they ended up fighting?" - that should weaken underdog sympathy.

I'm not sure how to set up such an experiment. Maybe it could be presented as research into writing dialogue for the theatre. The experimental subject is presented with scripts for plays. In one play the quarreling characters see the rivals they are discussing (e.g. Israel v Palestine, etc.) as expansionist; in another play the quarreling characters see the rivals they are discussing as fated to fight and then go home. The experimental subject is probed with various lines of dialogue for the characters, which sympathise with either the underdog or the overdog, and asked to judge which seems natural.

The hypothesis is that it is the characters who anticipate a follow-on duel whose dialogue feels natural when it sympathises with the underdog.

Comment author: andrewc 05 April 2009 11:56:18PM 3 points [-]

Interesting idea: we support the underdog because if push came to shove we'd have a better chance of besting them than the top dog? There's a similar problem I remember from a kids' brainteaser book. Three hunters are fighting a three-way duel, with rifles, to the death. Each has one bullet. The first hunter has a 100% chance of making a killing shot, the second a 50% chance, the third a 10% chance. What is the inferior hunter's best strategy?

Comment author: Larks 16 August 2009 08:07:58PM *  0 points [-]

The normal answer (fire away from either) only works if we assume the other hunters are vindictive, rather than rational. If we assume they behave rationally, then the third hunter should target the best.

Comment author: Broggly 03 November 2010 02:06:45PM 0 points [-]

Sure, if you're acting simultaneously. If you're taking turns and you kill the best, then the mid-strength hunter will immediately fire on you. However, if one of them shoots the other, then you'll have the first shot against the remaining one.

Comment author: Larks 03 November 2010 04:04:32PM 0 points [-]

Yes, you're right. Larks@2009 hadn't studied any maths.
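The sequential version of the puzzle can actually be worked out exactly. A minimal sketch, under assumptions the thread leaves implicit: turns go from weakest shooter to strongest, shooting continues round by round until one hunter is left (i.e. bullets are not limited to one each), the two stronger hunters always target the strongest remaining rival, and the weakest hunter chooses between targeting the best shot or deliberately missing while both rivals are alive:

```python
def duel(p_me, p_them):
    """P(I win) a two-person duel in which I shoot first and turns alternate.
    Solves p = p_me + (1 - p_me) * (1 - p_them) * p."""
    return p_me / (1 - (1 - p_me) * (1 - p_them))

P_A, P_B, P_C = 1.0, 0.5, 0.1  # hit probabilities: best, middle, worst

# If C deliberately misses: B shoots at A (the bigger threat).
#   B hits (prob 0.5): C now duels B, with C shooting first.
#   B misses: A kills B for certain; C duels A, with C shooting first.
p_abstain = P_B * duel(P_C, P_B) + (1 - P_B) * duel(P_C, P_A)

# If C shoots at A instead:
#   C hits (prob 0.1): C duels B, but B gets the first shot.
#   C misses (prob 0.9): play continues exactly as in the abstain branch.
p_shoot_best = P_C * (1 - P_B) * duel(P_C, P_B) + (1 - P_C) * p_abstain

print(f"deliberate miss: {p_abstain:.4f}")    # ~0.141
print(f"shoot the best:  {p_shoot_best:.4f}") # ~0.136
```

Under these assumptions the deliberate miss comes out slightly ahead (about 14.1% vs. 13.6% survival), matching the "normal answer" Larks mentions, though the gap is small because A's perfect accuracy dominates the endgame.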

Comment author: abramdemski 05 April 2009 01:25:09AM *  10 points [-]

The following argument comes from an intro sociology text:

If there are three people competing, all of different strengths, it is worthwhile for the two weakest people to band together to defeat the strongest person. This takes out the largest threat. (Specific game-theoretic assumptions were not stated.)

Doesn't this basically explain the phenomenon? If Zug kills Urk, I might be next! So I should band together with Urk to defeat Zug. Even if Urk doesn't reward me at all for the help, my chances against Urk are better than my chances against Zug. (Under certain assumptions.)

Comment author: PhilGoetz 05 April 2009 02:50:37AM 5 points [-]

Yes, this was my first thought too. Yvain thought of it and said

First, it could be a mechanism to prevent any one person from getting too powerful. Problem is, this sounds kind of like group selection.

It doesn't sound like group selection to me. How does it harm the group for one person to get very powerful? It is individual selection. When one man or small group dominates the tribe completely, and doesn't need your help, you don't get any of the good women.

BTW, EO Wilson has a book out supporting group selection.

"Among the Yanomamo" describes several hunter-gatherer bands. They all (my recollection) had leading men, but in the dysfunctional bands, the leading men were extremely powerful and there was no balance of power. They and their group of about a dozen supporters ruled through fear and exploited the rest of the band shamelessly. Life for those who were not in the key dozen at the top was significantly worse than life for people in the villages that had a different power structure; or at least different personalities in charge who didn't exhibit endless greed.

(A side note about group selection: This social pattern repeated itself in the groups that spawned off the original "infected" dysfunctional group; and all the other bands in the area hated and feared these dysfunctional groups. They were all aware that these particular bands were "sick" and dangerous and that it would be nice to wipe them out. Sounds like prime territory for some group selection.)

Comment author: abramdemski 05 April 2009 09:29:12PM 1 point [-]

I agree, Yvain said it first, and it doesn't sound like group selection.

Concerning your group selection comment, that does sound plausible... but being relatively unfamiliar with tribal behavior, I would want to be sure that greedy genes were not spreading between groups before concluding that group selection could actually occur.

Comment author: taw 05 April 2009 12:14:04AM 9 points [-]

It's totally your second explanation. The stronger faction doesn't need you - the value of you joining them is really tiny. The weaker faction needs you a lot - if your joining significantly alters the balance of power, they will reward you significantly.

Because of these mechanics of power, both coalitions are close to 50:50, and it's almost always in your best interest to join the slightly smaller one. For empirical evidence, look at any modern democracy, with either coalitions of parties (most of continental Europe) or of interest groups (USA). Coalitions tend to make no sense whatsoever - blacks and gays and labour and lawyers vs. born-again Christians and rich people and rural poor and racists? Does it make any sense? Not at all, but the balance is very close to 50:50.

I believe without that much evidence (I've seen some mentioned in context of game theory, so I guess someone has it) that this kind of almost 50:50 coalition making is very common in tribal societies, so it might very well be very common in our ancestral environment. In which case sympathy for the underdog makes sense.

Also notice that this is just one of many forces; it will be decisive only in cases where the coalitions are almost even otherwise, just as predicted. If one coalition is far bigger than the other, or you're more aligned with one than the other, sympathy for the underdog won't be strong enough to overcome those.
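The near-50:50 claim can be illustrated with a toy pivotality model (my own sketch, not from the comment): suppose each of the other N tribe members independently sides with Zug with some probability q, and your support decides the contest only when the others split within one vote of even. That pivot probability is a binomial mass near the middle, and it collapses quickly as the expected split moves away from parity:

```python
from math import comb

def pivot_prob(n, q):
    """P(one extra supporter can change the winner), given n other members
    who each side with Zug independently with probability q. You are
    pivotal when the others split within one vote of even."""
    return sum(comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range(n + 1) if abs(2 * k - n) <= 1)

n = 50  # roughly a hunter-gatherer band
# P(pivotal) falls from about 0.11 at an even split to essentially 0 at 90:10.
for q in (0.5, 0.6, 0.7, 0.9):
    print(f"expected split {q:.0%}: P(pivotal) = {pivot_prob(n, q):.4f}")
```

This is the shape taw's argument needs: the rewards for being courtable only exist near parity, so factions that want recruits are pushed toward near-even splits.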

Comment author: loqi 05 April 2009 12:28:00AM 5 points [-]

The stronger faction doesn't need you - value of you joining them is really tiny. The weaker faction needs you a lot.

But we're talking about a zero-sum situation. The stronger faction needs you not to join the weaker faction exactly as much as the weaker faction needs you to join.

Comment author: Zvi 05 April 2009 12:05:07PM 6 points [-]

You don't always have to join Zug or Urk. Often you can let them fight it out and remain neutral, or choose how much of your resources to commit to the fight. Urk needs everything you have, whereas Zug would be perfectly happy to see you do nothing, and in most conflicts most people stay out of it. Because of this, Zug can't afford to go around rewarding everyone for not joining Urk the same way Urk can reward you for joining him.

Comment author: SoullessAutomaton 05 April 2009 12:15:58AM *  22 points [-]

It occurs to me that there may be a critical difference between voicing sympathy for a weak faction, vs. actually joining it and sharing its misfortunes.

That is to say, a near-optimal strategy in Zug vs. Urk, assuming one is currently unaffiliated and not required to join either side, is to do as much as possible to support Urk without angering Zug and incurring penalties. As a latecomer you'd get little benefit from joining Zug anyway, but in the event of a surprise upset, when Urk comes to power you will be more likely to benefit than uninvolved parties or active Zug supporters.

Comment author: Andy_McKenzie 05 April 2009 03:48:46PM 1 point [-]

If everybody in the tribe has this adaptation, then it will no longer be useful, because everybody will be supporting the underdog. The optimal strategy, then, is not to support the underdog per se but instead to support the cause that fewer people support, factoring in the rough probabilities that Zug and Urk each have of winning. How would this yield a systematic bias toward favoring the underdog? It would only occur if in the modern world we still suspect that the majority will favor the team more likely to win.

Comment author: Dojan 23 October 2013 02:34:22AM 1 point [-]

Well, this depends on what level the average player is playing at; but at every level there is going to be more noise, and thus less evolutionary pressure. My friend told me that his teacher had told his class that, in practice, most people play on the second or third levels. (I have nothing to back that up with, I know nothing about stock trading)

Comment author: InquilineKea 05 April 2009 01:32:41AM 5 points [-]

Might I add Dunbar's number to this? Large powerful groups have a tendency to split (especially hunter-gatherer ones). And once they split, they often become each other's enemies. Oftentimes, it's better for the individual to be with the underdog when the underdog is a group that is less likely to split.

Alternatively, let's ponder this situation: you're part of a group, a single one of many possible groups. Your group has an interest in supporting the weaker groups if it wishes to survive. (Of course, you may be okay with having your group absorbed into another group - but remember, in hunter-gatherer days it was often difficult to be absorbed into another group with an entirely different culture from yours.)

This, incidentally, reminds me a lot of the scene in Romance of Three Kingdoms where the minor warlord Zhang Xiu was wondering whether to join Cao Cao (the underdog) or Yuan Shao. His brilliant adviser told him to join Cao Cao, who ultimately toppled Yuan Shao.

Comment author: thomblake 07 April 2009 02:15:21PM 1 point [-]

This, incidentally, reminds me a lot of the scene in Romance of Three Kingdoms where the minor warlord Zhang Xiu was wondering whether to join Cao Cao (the underdog) or Yuan Shao. His brilliant adviser told him to join Cao Cao, who ultimately toppled Yuan Shao.

Yes, I think this is exactly the sort of truel situation that is talked about elsewhere.

Comment author: MBlume 05 April 2009 08:24:10AM 19 points [-]

My friend Cheryl suggests a non ev-psych response. Each of us is, in many senses, an underdog. We are out of the ancestral environment, and are part of societies that are too darn large. We feel like underdogs, and so when we see another, we perceive a similarity of circumstance which enhances our feelings of sympathy.

Comment author: gworley 05 April 2009 03:18:41PM 3 points [-]

I like this idea. When we have to stretch too far to look for an explanation of a trait based only on that trait's effect on differential reproduction, it may be because there is no such explanation. Plenty of traits are the result of side effects that did not affect reproduction, and others may be cultural.

This idea has just what we need: it fits the experience, doesn't seem to affect reproduction, and is a side effect of sexually selected traits. When you add in a cultural component that may amplify or suppress this feeling of sympathy, you have what looks like a good explanation with no "just so"s necessary.

Comment author: orthonormal 05 April 2009 04:48:46PM 6 points [-]

Children's social worlds aren't as large as adults', so one prediction this model makes is that children raised in small social worlds (homeschooling or other small communities) should have much less of an underdog bias than adults or children who interact with many strangers.

Intuitively, I'd say that's probably not the case; but it bears testing.

Comment author: gworley 06 April 2009 11:59:07AM 3 points [-]

Maybe, but what about when those children discover that they are outside the norm? I'd imagine they might even be more likely to favor underdogs once they realize that they share the commonality of standing against the norm in some fashion.

Comment author: Andy_McKenzie 05 April 2009 03:40:16PM 1 point [-]

I like this idea too. One prediction from it seems to be that those who feel less like underdogs (such as a Saudi prince) will support underdogs less. One might find those who feel less like underdogs via general socioeconomic status too, but since we have a fairly egalitarian society, high-income people might actually be more likely to have considered themselves an underdog during their formative years.

Comment author: Eliezer_Yudkowsky 05 April 2009 12:32:20AM 17 points [-]

1) If Zug wins, they'll be a stronger threat to you than Urk. Hunter-gatherer tribes have a carefully maintained balance of power - chieftains are mostly an invention of agriculture.

2) "When I face an issue of great import that cleaves both constituents and colleagues, I always take the same approach. I engage in deep deliberation and quiet contemplation. I wait to the last available minute and then I always vote with the losers. Because, my friend, the winners never remember and the losers never forget." -- Sen. Everett Dirksen

Comment author: SoullessAutomaton 05 April 2009 01:24:25AM 2 points [-]

Good explanations, but a couple quibbles:

1) This explanation seems to presume that the disutility of "Zug wins" is of larger magnitude than the disutility of "Allied with the losing side" proportional to the likelihood of Zug winning. This is not necessarily implausible, but is it likely to have been sufficiently common to exert selection pressure?

2) This explanation presumes that Urk retains sufficient influence after a failed bid for power that the disutility of "Urk hates your stinking guts" is larger than the disutility of "Allied with the losing side". Clearly the case in the Senate, but elsewhere?

Comment author: bogdanb 05 April 2009 11:03:04AM *  5 points [-]

The central part of Eliezer's comment, in my reading, is that for the vast majority of the time humans evolved, they were in a hunter-gatherer tribe format where the group size was low (other research discussed here indicates an upper bound of around 50).

In such groups it seems plausible that status “victories” are not absolute, and the power difference between the larger and smaller side is rarely huge. Also, the links between members of two factions are very tight - they all know each other, they're closely related biologically, and they depend on each other tightly for survival.

Some examples: It's unlikely that in a 30/20, or even 40/10 split, the losing side is massacred: it's still a large fraction of the group, and losing it completely would reduce the group's survivability. Also, its members are probably children or siblings of members of the winning side, so even if Grog supports Zug because he seems like a better hunter, Grog'll be upset if Zug kills his son Trok, who sided with Urk because he's younger.

The balance of power can slide easily, for instance if Zug gets older, or if he's injured in a hunt. (Actually, it seems common enough in all status-organized “societies”, including wolves and lions, that the leader is often challenged by “underdogs”, one of which will eventually become leader. Which is why challenges are rarely lethal.)

Our intuition (for judging the sides and such) is shaped in large part by current society sizes (e.g., “my vote doesn't matter”), because it's a neural process; but our instincts are probably still predominantly shaped around few-dozen-person group sizes, since they're genetics-based.

EDIT: Another point: underdogs in the ancestral environment would tend to be the younger side. Which means a child or a niece or something like that. Which means that the incentive to help them is a bit stronger than just group selection.

Comment author: Grognor 11 February 2012 05:17:37PM 0 points [-]

even if Grog supports Zug because he seems like a better hunter, Grog'll be upset if Zug kills his son Trok, who sided with Urk because he's younger.

Neither Grog nor Grognor would allow his own son to die to such an undignified neophyte as Zug. Then again, who does Trok think he is, going against his father like that?

Comment author: Andy_McKenzie 05 April 2009 03:32:28PM 4 points [-]

How about Terror Management Theory? By supporting a cause that is probably going to win anyway, we gain little. But by supporting an unlikely cause such as Leonidas at Thermopylae, there is an increased possibility that if we succeed our accomplishments will live on past us, because it is so incredible. In this way, we would become immortal. One prediction from this explanation is that the greater the disparity between the underdog and the overdog the larger the preference towards the underdog will be, which seems to be backed up empirically (see the increased preference for Slovenia vs. Sweden in the referenced study).

Comment author: Jack 06 April 2009 03:50:41AM *  1 point [-]

"By supporting a cause that is probably going to win anyway, we gain little. But by supporting an unlikely cause such as Leonidas at Thermopylae, there is an increased possibility that if we succeed our accomplishments will live on past us, because it is so incredible."

There are a couple problems with this. First, we might join Leonidas on those grounds but why would we root for him on those grounds? We're not going to be remembered that way. Second, if one wants to be remembered one is probably best off just being on the side of the winners. Winners write history. Finally, this could explain the motivation of the underdog but I don't think it explains the way we seem to be wired to root for the underdog (either biologically or culturally).

Comment author: swestrup 05 April 2009 03:52:56AM 4 points [-]

My first thought was to assume it was part of the whole alpha-male dominance thing. Any male that wants to achieve the status of alpha-male starts out in a position of being an underdog and facing an entrenched opposition with all of the advantages of resources.

But, of course, alpha-males outperform when it comes to breeding success and so most genes are descended from males that have confronted this situation, strove against "impossible" odds, and ultimately won.

Of course, if this is the explanation, then one would expect there to be a strong difference in how males and females react to the appearance of an underdog.

Comment author: gwern 05 April 2009 01:00:13AM *  4 points [-]

The proffered explanations seem plausible. What about with ideas though? I think it's social signaling: 'Look how clever and independent and different I am, that I can adopt this minority viewpoint and justify it.'

(Kind of like Zahavi's handicap principle.)

EDIT: It appears I largely stole this variant on signaling strategy from http://www.overcomingbias.com/2008/12/showoff-bias.html . Oh well.

Comment author: Yvain 05 April 2009 01:20:46AM 10 points [-]

Your mention of signaling gives me an idea.

What if the mechanism isn't designed to actually support the underdog, but to signal a tendency to support the underdog?

In a world where everyone supports the likely winner, Zug doesn't need to promise anyone anything to keep them on his side. But if one person suddenly develops a tendency to support the underdog, then Zug has to keep him loyal by promising him extra rewards.

The best possible case is one where you end up on Zug's side, but only after vacillating for so long that Zug is terrified you're going to side with Urk and promises everything in his power to win you over. And the only way to terrify Zug that way is to actually side with Urk sometimes.

Comment author: RobinHanson 05 April 2009 03:30:13AM 14 points [-]

It seems that supporting an underdog is a more impressive act - it suggests more confidence in your own abilities, and your ability to withstand retribution from the overdog. I'm not sure we do actually support the underdog more when a costly act is required, but we probably try to pretend to support the underdog when doing so is cheap, so we can look more impressive.

Comment author: SoullessAutomaton 05 April 2009 01:35:35AM 4 points [-]

In other words, if Zug believes you to be the kind of agent who will make the naively rational decision to side with him, he will not reward you. You then side with Zug, because it makes more sense.

However, if Zug believes you to be the kind of agent who will irrationally oppose him unless bribed, he will reward you. You then side with Zug, because it makes more sense.

This seems to be another problem of precommitment.

Comment author: Eliezer_Yudkowsky 05 April 2009 05:54:54AM 4 points [-]

While my own decision theory has no need of precommitment, it's interesting to consider that genes have no trouble with precommitments; they just make us want to do it that way. The urge to revenge, for example, can be considered as the genes making a sort of believable and true precommitment; you don't reconsider afterward, once you get the benefits, because - thanks to the genes - it's what you want. The genes don't have quite the same calculus as an inconsistent classical decision theorist who knows beforehand that they want to precommit early but will want to reconsider later.

Comment author: loqi 05 April 2009 01:45:04AM 1 point [-]

But Zug probably doesn't care about just one person. Doesn't the underdog bias still require a way to "get off the ground" in this scenario? Siding with Urk initially flies in the face of individual selection.

Comment author: Eliezer_Yudkowsky 05 April 2009 05:51:56AM 4 points [-]

Zug can be only slightly more powerful than Urk to start with, and then as more individuals have the adaptation, the power difference it's willing to confront will scale. I.e. this sounds like it could evolve incrementally.

Comment author: loqi 05 April 2009 06:02:16AM 1 point [-]

Ah, makes sense. The modern bias seems specifically connected to major differences, but that doesn't exclude milder origins.

Comment author: SoullessAutomaton 05 April 2009 01:08:31AM 4 points [-]

Social signalling explains almost everything and predicts little. By law of parsimony, supporting underdog ideas seems much likelier to me as a special case of the general tendency Yvain is considering.

Comment author: AnnaSalamon 05 April 2009 04:37:09AM *  8 points [-]

Social signalling explains almost everything and predicts little.

In this case, the social signaling interpretation predicts a discrepancy between peoples' expressed preferences in distant situations, and peoples' felt responses in situations where they can act.

We can acquire evidence for or against the social signaling interpretation by e.g. taking an "underdog" scene, where a popular kid fights with a lone unpopular kid, and having two randomized groups of kids (both strangers to the fighters): (a) actually see the fight, as if by accident, nearby where they can in principle intercede; or (b) watch video footage of the fight, as a distant event that happened long ago and that they are being asked to comment on. Watch the Ekman expressions of the kids in each group, and see if the tendency to empathize with the underdog is stronger when signaling is the only issue (for group (b)) than when action is also a possibility (for group (a)). A single experiment of this sort wouldn't be decisive, but with enough variations it might be.

Comment author: cousin_it 05 April 2009 12:18:32PM *  1 point [-]

Your experiment wouldn't convince me at all because the video vs reality distinction could confound it any number of ways. That said, I upvoted you because no one else here has even proposed a test.

Comment author: gjm 05 April 2009 12:40:25AM 8 points [-]

Here's another explanation (a bit like taw's). I don't find it terribly convincing either, but I don't see an outright refutation.

Suppose you have kin, or others whose welfare (in the relevant senses) is correlated with yours. Obviously you'll tend to help them. How much, and how urgently? Much more when they're in worse trouble. (As taw says, when they're in a strong position they don't need your help, so most likely your own more direct interests matter more to you.) So there's value in having a mechanism that makes you care more about people you'd have cared about anyway when they're underdogs.

Well, evolution tends to produce hacks layered on hacks, so maybe the mechanism we actually got was one for making you care about everyone more when they're underdogs. When they're random strangers, the effect isn't strong enough to make you do much more than think "oh, I hope they survive"; if they're actually enemies, it isn't strong enough to make you switch sides (Yvain and his friends didn't actually start sending money to al Qaeda just because there's something a bit awesome about taking on the whole of Western civilization from a cave in Afghanistan). But when it's someone whose welfare you really care about, it can make the difference between acting and not acting.

Note that it's beneficial (evolutionarily, I mean) to have such a reaction not only for close kin but whenever the underdog is closer to you than the oppressors. For instance, some random person is being attacked by wolves: your genes benefit (in competition with the wolves') if you help them survive.

Comment author: jimmy 05 April 2009 02:05:54AM 3 points [-]

Depending on the group size, the underdog might not be the underdog anymore with your support.

If it's a small group thing (or you have significant power) it is likely that you can determine which side wins.

The underdogs may have more at stake than the winners, and would be willing to give more in return for help. If Bob steals half of Fred's bananas every day, Bob gets to be a little better fed, and Fred dies.

If you help Fred out, he owes you his life, but Bob doesn't care nearly as much if it just means he has to go back to eating only his own bananas (that or you kill him).

If you choose to help Bob, your help isn't worth anything since he had it under control anyway.

Comment author: orthonormal 05 April 2009 03:47:56AM *  2 points [-]

I think this instinct may in fact be evolutionarily optimized for conflicts between individuals; in most group conflicts in the ancestral environment, you probably already belong to one of the sides.

But yes, it does seem to generalize too readily to conflicts where you personally wouldn't sway the balance.

EDIT: How could we test any of the above theories? My theory seems to predict that describing the conflict as "one single entity versus another" (and triggering modes of thought optimized for third parties to single combat) will give a stronger underdog bias than describing a collection of entities on each side (with one collection much larger than the other).

Comment author: AlexU 05 April 2009 12:54:45AM *  3 points [-]

In a confrontation between two parties, it's more likely that the stronger one will pose the greater threat to you. By supporting the underdog and hoping for a fluke victory, you're increasing your own survival odds. It seems we're probably evolved to seek parity -- where we then have the best chance of dominating -- instead of seeking dominant leaders and siding with them, which is a far more complex and less certain process.

Am I missing something? Also, it would be interesting to see whether females and males have the same reactions toward the overdog.

Comment author: steven0461 05 April 2009 12:58:28AM 2 points [-]

The problem with things like "seeking parity" is that your actions play only a small part in determining the outcome of the conflict, whereas your actions play a much larger part in determining consequences to your post-conflict status.

Comment author: Eliezer_Yudkowsky 05 April 2009 05:57:44AM 6 points [-]

Not if others also side with the underdog, and punish those who side with the overdog - perhaps by viewing them as "craven" or "toadying" and treating them accordingly. People seem to have an odd respect for supervillains, but do we respect the henchmen?

Comment author: nescius 05 April 2009 10:09:56PM 0 points [-]

I also wonder about possible sex differences. Some information is available:

The Appeal Of The Underdog:

There was no significant effect, t(69) = 1.30, p = .19, though caution is warranted because of imbalanced samples. In fact, across all four studies reported in this article, there were no sex differences on the main dependent variables (all ps > .19).

Comment author: timtyler 05 April 2009 09:55:00AM 6 points [-]

Partly our empathy circuits. Humans like to help - and like to be seen to be helping. The underdog is the party that most obviously needs assistance.

Comment author: Psychohistorian 05 April 2009 06:48:29PM 7 points [-]

I'm not sure the evidence of the proposed bias supports the type of ev-psych responses being offered.

The only cases I'm aware of where underdog bias actually matters are of the Israel-Palestine type, not the Zug-Urk type. I-P poses no significant costs or benefits to the individual; Z-U poses tremendous ones. I don't imagine I-P type support decisions meaningfully affect reproductive success. Unless there's evidence that people still side with the underdog when it really costs them something, these ev-psych explanations seem to be explaining something that doesn't happen.

I would posit that it's cultural, and it's fictional availability bias. In all of our stories, the underdog is invariably the good guy. It seems very difficult to tell a story about a good giant multinational corporation beating an evil little old lady. The reverse has been quite successful. Consequently, we tend to side with the underdog because we generalize from a great deal of fictional evidence that "proves" that the underdog is the good guy. This also explains why we stick with an underdog even when he ceases to be an underdog, as this is a typical pivot point in a story.

This raises the question of why this kind of story is so successful, which I admit I don't have a great answer to.

Comment author: Jack 06 April 2009 03:42:07AM 4 points [-]

It doesn't just raise the question, it begs the question.

Comment author: Psychohistorian 06 April 2009 08:59:27PM 1 point [-]

Not really. 'Why does "underdog beats overdog" make a more interesting story than "overdog beats underdog"?' is a very different question from 'Why do we tend to side with the underdog when no costs are imposed on us?'

Providing an alternative mechanism but not being able to fully explain its causes is hardly begging the question.

Comment author: Jack 07 April 2009 08:23:42PM 3 points [-]

Yes, the following is true: 'Why does "underdog beats overdog" make a more interesting story than "overdog beats underdog"?' is a very different question from 'Why do we tend to side with the underdog when no costs are imposed on us?'

But you're distinguishing two questions that weren't distinguished in either your comment or the post. The post asks why we tend to support the underdog. In the initial post the "supporting" consists of verbally saying "I support x" and then, later, identifying with the underdog in a story (i.e. Leonidas and bin Laden). You come back and say well look, maybe our selection of fiction leads us to think the underdog must be the good guy. But as I understood the initial question, part of what we were seeking to learn was why we identify with the underdogs in stories. I take identifying with a fictional character to be equivalent to "siding with them without an imposed cost".

So as I took the initial question, your explanation for some of the underdog phenomenon merely attributes the cause to other parts of the phenomenon and fails to get at the root cause. Indeed, nearly every significant pattern in human behavior will have been documented in fiction, so one could claim fictional availability bias about lots of things (mating rituals, morality, language use, etc.), but it's all chicken and egg until you explain HOW THE FICTION GOT THAT WAY.

Comment author: thomblake 07 April 2009 08:35:34PM 2 points [-]

That's not begging the question. I don't see an argument being made with the conclusion as a premise. Perhaps you could be more explicit and concise?

That "underdog beats overdog" makes an interesting story does not require that we side with the underdog. Just like "dog bites man" is less interesting than "man bites dog", regardless of who you side with.

Comment author: Jack 08 April 2009 04:07:59AM 0 points [-]
1. We side with the underdog.
1A. Polling on Israel-Palestine shows a shift in support given to the side that appears to be the underdog.
1B. Despite being evil, we sort of think bin Laden is cool for taking on the US by himself.
1C. When we tell stories we tend to identify with and root for the underdog, i.e. Leonidas.

When we want to know why (1.) I take it that any explanation that includes any of the sub-premises is question begging.

Psychohistorian's response was that (1) is caused by the fact that in our stories the underdog is always the side we identify with and root for, and this leads us to assume that the underdog is the "good side" and therefore side with the underdog. But as I took question (1), part of what needed explaining was underdog identification in stories.

This mess about what makes an "interesting story" was added after the initial comment and it confuses things. As I took the initial comment, the only evidence being presented was the vast collection of pro-underdog stories and the dearth of pro-overdog stories, and this was taken to be sufficient to lead us to side with the underdog. I don't think this response is especially helpful because part of our reason for even thinking that there is an underdog bias is the fiction. Throwing in "interesting" adds another step to the argument, and this version might not be begging the question anymore (though I'm not convinced of that either).

Comment author: thomblake 07 April 2009 02:05:02PM 1 point [-]

He might have meant 'begs the question' in the colloquial sense, which people really should stop doing.

Comment author: Jack 07 April 2009 08:24:09PM 2 points [-]

If I had meant this the comment would have made no sense.

Comment author: rwallace 05 April 2009 01:28:33PM 5 points [-]

It's worth bearing in mind how people actually behave: if Zug is so powerful and vengeful that opposing him would be flat-out suicide, people don't. They may quietly nurse grudges and wait for misfortune to befall him, but they don't take overt action. Siding with Urk is a lot more understandable once we note that people only actually do it when it is reasonably safe to do so.

Comment author: Mario 05 April 2009 09:49:30AM 4 points [-]

I don't think it is necessarily true that merely by joining the faction most likely to win you will share in the spoils of victory. Leaders distribute rewards based on seniority more than support. In a close contest, you would likely be courted heavily by both sides, providing a temporary boost in status, but that would disappear once the conflict is over. You will have not earned the trust of the winner since your allegiance was in doubt. I don't think there is much to gain by joining the larger side late; you'll be on the bottom of society once the dust settles, trusted by neither the winners nor the losers.

In cases like this, I think the operative value evolution would select for is not political success but sexual success. Being one of many followers does nothing to advertise ourselves as desirable mates. On the other hand, bravely fighting a losing battle (as long as you don't die in the process) signals both physical prowess (which you may not get in a lopsided victory) and other desirable traits, like courage. When the battle is over, one can assume that more money and women would be distributed to the new elite, but their children will be yours.

Comment author: AspiringKnitter 25 January 2012 03:06:16AM 1 point [-]

That should predict this bias to be stronger in men. After all, more partners, past a certain point, isn't really helpful to women's reproductive success, plus I'd be surprised if men sought courageous mates (if they go and get themselves killed before your baby is born...). So, is this bias stronger in men?

Comment author: simplicio 12 July 2013 09:20:13PM *  2 points [-]

Theory: supporting the underdog is a relatively costless longshot bet. Prediction: it will primarily occur in situations when opposing the overdog (verbally) can be done with impunity or secretly.

Overdog wins: no real consequences.

Underdog wins: "I supported you from the beginning! Can I be your trusted lieutenant?"

Comment author: Kenny 12 April 2009 06:13:28PM 2 points [-]

No one supports the underdog if they're a member, or a fan, of the overdog – only the unaffiliated are free to root for the underdog.

Comment author: RobinHanson 05 April 2009 01:13:19PM 3 points [-]

There are a lot of good thoughts in these comments, but they are scattered. I can see value in someone collecting them into an organized summary of the plausible arguments on this topic.

Comment author: teageegeepea 05 April 2009 11:05:53PM *  2 points [-]

This problem seems even to afflict Mencius Moldbug. His ideology of formalism seems to be based on ensuring absolute unquestionable authority in order to avoid any violence (whether used to overthrow an authority or cement the hold of an existing one). At the same time, he tries to build the appeal of his reactionary narrative by highlighting how reactionaries are "those who lost" (in the terms of William Appleman Williams, whom Mencius would rather not mention), while the strong horse is universalism/antinomianism.

Comment author: PhilGoetz 07 April 2009 04:33:35AM 1 point [-]

the strong horse is universalism/antinomianism.

What does that mean? The whole clause. And I don't understand why you equate universalism with antinomianism.

Comment author: Joe 14 July 2009 12:55:24AM 2 points [-]

Perhaps you figured this out since April, but the quoted clause makes sense in the context of Mencius' particular use of the term "universalism" (roughly: what everyone in polite society believes these days in the West), which he categorizes as "antinomian" (roughly: opposed to natural law).

Comment deleted 05 April 2009 05:37:29AM [-]
Comment deleted 05 April 2009 07:31:19AM [-]
Comment author: Eliezer_Yudkowsky 05 April 2009 09:08:45AM 2 points [-]

You don't understand evolutionary psychology. You also don't know how to support an argument. "Just don't like"? That is precisely that which is to be explained. Nor are they randomly dangerous.

Look, I'm sorry, but you're on the wrong blog here. Read if you like, of course, but I don't think you're ready to be commenting. This is why you are often voted down. Sorry.

Comment author: Hans 05 April 2009 11:50:30AM 4 points [-]

I read your comment and I immediately wanted to vote up Marshall's original comment. After all, he's the underdog being criticized and chased away by the founder and administrator of this blog.

In the end, I didn't, probably for equally irrational reasons.

Comment author: Eliezer_Yudkowsky 05 April 2009 01:11:24PM 3 points [-]


I have to say, that frame on the whole problem had never occurred to me. No wonder online communities have such a hard time developing membranes.

Comment author: ciphergoth 05 April 2009 07:12:30PM 4 points [-]

It's worse here, because for some reason when people like Marshall claim that "rationalist" means "treats any old crap like it was a worthy contribution", people here are sufficiently wary of confirmation bias to take it more seriously than it deserves.

Comment author: Eliezer_Yudkowsky 05 April 2009 07:27:27PM 1 point [-]

Yeah, I've noticed. If I were to make a list of the top 3 rationalist errors, they'd be overconfidence, overcomplication, and underconfidence.

Either that or there's some kind of ancient echo of protecting the underdog in effort to keep the tribal power balance.

Comment deleted 05 April 2009 12:36:13PM [-]
Comment author: JulianMorrison 05 April 2009 12:43:28PM 0 points [-]

OK, what's YOUR position, and how much do you know? Then Yvain can dump historical facts on you, and we'll see how far you shift and in what direction.

Comment deleted 05 April 2009 02:04:21PM *  [-]
Comment author: loqi 05 April 2009 07:26:31PM 2 points [-]

This is indeed a pretty utilitarian position. I think the objection you're likely to run into is that by evaluating the situation purely in terms of the present, it sweeps historic precedents under the rug.

Put another way, the "this conflict represents a risk, let's just cool it" argument can just as easily be made by any aggressor directly after initiating the conflict.

Comment author: Eliezer_Yudkowsky 05 April 2009 07:29:14PM 2 points [-]

Yup. If you don't punish aggressors and just demand "peace at any price" once the war starts, that peace sure won't last long.

Comment deleted 05 April 2009 08:36:17PM *  [-]
Comment author: JulianMorrison 05 April 2009 09:07:08PM *  0 points [-]

(I yesterday heard someone who ought to know say AI at human level, and not provably friendly, in 16 years. Yes my jaw hit the floor too.)

I hadn't thought of the "park it, we have bigger problems" or "park it, Omega will fix it" approach, but it might make sense. That raises the question, and I hope it's not treading too far into off-LW-topic territory: to what extent ought a reasoning person act as if they expected gradual and incremental change in the status quo, and to what extent ought their planning be dominated by expectation of large disruptions in the near future?

Comment deleted 05 April 2009 10:16:29PM [-]
Comment author: JulianMorrison 06 April 2009 12:03:07AM *  1 point [-]

The question I was struggling to articulate was more like: should I give credence to my own beliefs? How much? And how to deal with instinct that doesn't want to put AI and postmen in the same category of "real"?

Comment author: Eliezer_Yudkowsky 05 April 2009 09:15:57PM 0 points [-]

Who on Earth do you think ought to know that?

Comment author: JulianMorrison 05 April 2009 09:17:34PM 1 point [-]

Shane Legg, who was at London LW meetup.

Comment deleted 05 April 2009 10:13:19PM [-]
Comment author: JulianMorrison 05 April 2009 07:03:09PM 1 point [-]

Are you sure you're not playing "a deeply wise person doesn't pick sides, but scolds both for fighting"?