One of the strangest human biases is the almost universal tendency to support the underdog.

I say "human" because even though Americans like to identify themselves as particular friends of the underdog, you can find a little of it everywhere. Anyone who's watched anime knows the Japanese have it. Anyone who's read the Bible knows the Israelites had it (no one was rooting for Goliath!) From mythology to literature to politics to sports, it keeps coming up.

I say "universal" because it doesn't just affect silly things like sports teams. Some psychologists did a study where they showed participants two maps of Israel: one showing it as a large country surrounding the small Palestinian enclaves, and the other showing it as a tiny island in the middle of the hostile Arab world. In the "Palestinians as underdogs" condition, 55% said they supported Palestine. In the "Israelis as underdogs" condition, 75% said they supported Israel. Yes, you can change opinion thirty points by altering perceived underdog status. By comparison, my informal experiments trying to teach people relevant facts about the region's history changed opinion approximately zero percent.

(Oh, and the Israelis and Palestinians know this. That's why the propaganda handbooks they give to their respective supporters - of course they give their supporters propaganda handbooks! - specifically suggest the supporters portray their chosen cause as an underdog. It's also why every time the BBC or someone shows a clip about the region, they get complaints from people who thought it didn't make their chosen side seem weak enough!)

And there aren't many mitigating factors. Even when the underdog is obviously completely doomed, we still identify with them: witness Leonidas at Thermopylae. Even when the underdog is evil and the powerful faction is good, we can still feel a little sympathy for them; I remember some of my friends and me talking about bin Laden, and admitting that although he was clearly an evil terrorist scumbag, there was still something sort of awesome about a guy who could take on the entire western world from a cave somewhere.

I say "strangest" because I can't make heads or tails of why evolutionary psychology would allow it. Let's say Zug and Urk are battling it out for supremacy of your hunter-gatherer tribe. Urk comes to you and says "Hey, my faction is really weak. We don't have a chance against Zug, who is much stronger than us. I think we will probably be defeated and humiliated, and our property divided up among Zug's supporters."

The purely rational response seems to be "Wow, thanks for warning me, I'll go join Zug's side right now. Riches and high status as part of the winning faction, here I come!"

Now, many of us probably would join Zug's side. But introspection would tell us that in doing so we were pitting a rational calculation in Zug's favor against a native, preconscious sympathy for Urk. Why? The native preconscious part of our brain is usually the one that's really good at ending up on top in tribal power struggles. This sort of thing goes against everything it usually stands for.

I can think of a few explanations, none of them satisfying. First, it could be a mechanism to prevent any one person from getting too powerful. Problem is, this sounds kind of like group selection. Maybe the group does best if there's no one dictator, but from an individual point of view, the best thing to do in a group with a powerful dictator is get on that dictator's good side. Any single individual who initiates the strategy of supporting the underdog gets crushed by all the other people who are still on the dictator's team.

Second, it could be a mechanism to go where the rewards are highest. If a hundred people support Zug, and only ten people support Urk, then you have a chance to become one of Urk's top lieutenants, with all the high status and reproductive opportunities that implies if Urk wins. But I don't like this explanation either. When there's a big disparity in faction sizes, you have no chance of winning, and when there's a small disparity in faction sizes, you don't gain much by siding with the smaller faction. And as size differential between groups increases, the smaller faction's chance of success should drop much more quickly than the opportunities for status with the smaller faction should rise.
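To make that trade-off concrete, here's a toy back-of-the-envelope model. All the numbers and functional forms are my own assumptions (a simple contest function for the chance of winning, and an even split of the spoils among the winning faction), so treat it as an illustration of the argument rather than evidence for it:

```python
# Toy model (my assumptions, not Yvain's): P(your side wins) follows a contest
# function size^k / (size_a^k + size_b^k), and the winner's spoils are split
# evenly, so your status payoff is 1/(your faction's size). k > 1 captures the
# idea that numbers compound.
def expected_status(join_small, small, big, k=2.0):
    """Expected share of the spoils from joining one side of the power struggle."""
    mine, theirs = (small + 1, big) if join_small else (big + 1, small)
    p_win = mine**k / (mine**k + theirs**k)
    return p_win / mine

for small, big in [(10, 100), (45, 55)]:
    print(f"{small} vs {big}: "
          f"join small = {expected_status(True, small, big):.4f}, "
          f"join big = {expected_status(False, small, big):.4f}")
# 10 vs 100: join small ~ 0.0011, join big ~ 0.0098
# 45 vs 55:  join small ~ 0.0089, join big ~ 0.0109
```

With k > 1 the larger faction wins the trade-off at both a huge and a small disparity; the smaller side breaks even only at k = 1, and comes out ahead only if numbers matter less than proportionally. That is exactly the hinge the argument above turns on.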

So I admit it. I'm stumped. What does Less Wrong think?

102 comments

1) If Zug wins, they'll be a stronger threat to you than Urk. Hunter-gatherer tribes have a carefully maintained balance of power - chieftains are mostly an invention of agriculture.

2) "When I face an issue of great import that cleaves both constituents and colleagues, I always take the same approach. I engage in deep deliberation and quiet contemplation. I wait to the last available minute and then I always vote with the losers. Because, my friend, the winners never remember and the losers never forget." -- Sen. Everett Dirksen

3SoullessAutomaton
Good explanations, but a couple quibbles:
1) This explanation seems to presume that the disutility of "Zug wins" is of larger magnitude than the disutility of "Allied with the losing side" proportional to the likelihood of Zug winning. This is not necessarily implausible, but is it likely to have been sufficiently common to exert selection pressure?
2) This explanation presumes that Urk retains sufficient influence after a failed bid for power that the disutility of "Urk hates your stinking guts" is larger than the disutility of "Allied with the losing side". Clearly the case in the Senate, but elsewhere?
9bogdanb
The central part of Eliezer's comment, in my reading, is that for the vast majority of the time humans were evolving, they lived in hunter-gatherer tribes where the group size was low (other research discussed here indicates an upper bound of around 50). In such groups it seems plausible that status “victories” are not absolute, and the power difference between the larger and smaller side is rarely huge. Also, the links between members of two factions are very tight: they all know each other, they're closely related biologically, and they depend on each other tightly for survival.

Some examples: It's unlikely that in a 30/20, or even 40/10, split the losing side is massacred; it's still a large fraction of the group, and losing it completely would reduce the group's survivability. Also, its members are probably children or siblings of members of the winning side, so even if Grog supports Zug because he seems like a better hunter, Grog'll be upset if Zug kills his son Trok, who sided with Urk because he's younger. The balance of power can slide easily, for instance if Zug gets older, or if he's injured in a hunt. (Actually, it seems common enough in all status-organized “societies”, including wolves and lions, that the leader is often challenged by “underdogs”, one of which will eventually become leader. Which is why challenges are rarely lethal.)

Our intuition (for judging the sides and such) is shaped in large part by current society sizes (e.g., “my vote doesn't matter”), because it's a neural process, but instincts are probably still predominantly shaped around few-dozen-person group sizes, since they're genetics-based.

EDIT: Another point: underdogs in the ancestral environment would tend to be the younger side. Which means a child or a niece or something like that. Which means that the incentive to help them is a bit stronger than just group selection.
0Grognor
Neither Grog nor Grognor would allow his own son to die to such an undignified neophyte as Zug. Then again, who does Trok think he is, going against his father like that?

It occurs to me that there may be a critical difference between voicing sympathy for a weak faction, vs. actually joining it and sharing its misfortunes.

That is to say, a near-optimal strategy in Zug vs. Urk, assuming one is currently unaffiliated and not required to join either side, is to do as much as possible to support Urk without angering Zug and incurring penalties. As a latecomer you'd get little benefit from joining Zug anyways, but in the chance of a surprise upset, when Urk comes to power you will be more likely to benefit than uninvolved parties or active Zug supporters.

2Andy_McKenzie
If everybody in the tribe has this adaptation, then it will no longer be useful, because everybody will be supporting the underdog. The optimal strategy, then, is not to support the underdog per se but instead to support the cause that fewer people support, factoring in the rough probabilities that Zug and Urk each have of winning. How would this yield a systematic bias toward favoring the underdog? It would only occur if in the modern world we still suspect that the majority will favor the team more likely to win.
2Dojan
Well, this depends on what level the average player is playing at; but at every level there is going to be more noise, and thus less evolutionary pressure. My friend told me that his teacher had told his class that, in practice, most people play on the second or third levels. (I have nothing to back that up with, I know nothing about stock trading)
MBlume210

My friend Cheryl suggests a non-ev-psych response. Each of us is, in many senses, an underdog. We are out of the ancestral environment, and are part of societies that are too darn large. We feel like underdogs, and so when we see another, we perceive a similarity of circumstance which enhances our feelings of sympathy.

7orthonormal
Children's social worlds aren't as large as adults', so one prediction this model makes is that children raised in small social worlds (homeschooling or other small communities) should have much less of an underdog bias than adults or children who interact with many strangers. Intuitively, I'd say that's probably not the case; but it bears testing.
3Gordon Seidoh Worley
Maybe, but what about when those children discover that they are outside the norm? I'd imagine they might even be more likely to favor underdogs once they realize that they share the commonality of standing against the norm in some fashion.
4Gordon Seidoh Worley
I like this idea. When we have to stretch too far to look for an explanation of a trait based only on that trait's effect on differential reproduction, it may be because there is no such explanation. Plenty of traits are the result of side effects that did not affect reproduction, and others may be cultural. This idea has just what we need: it fits the experience, doesn't seem to affect reproduction, and is a side effect of sexually selected traits. When you add in a cultural component that may amplify or suppress this feeling of sympathy, you have what looks like a good explanation with no "just so"s necessary.
1Andy_McKenzie
I like this idea too. One prediction from it seems to be that those who feel less like underdogs (such as a Saudi Prince) will support underdogs less. One might find those who feel less like underdogs via general socioeconomic status too, but since we have a fairly egalitarian society, high-income people might actually be more likely to have considered themselves an underdog during their formative years.

When you see two enemies fighting, you want them both to use up as many resources as possible. That way, the winner will be easy pickings for you. You accomplish this by supporting whoever is weaker. This is the sort of strategy that pops up in many multiplayer board games.

At the Go club, someone asked about using red, green, and blue stones instead of using black and white. The chap who is doing a PhD in game theory said: "the two weakest players will gang up on the strongest player, just like any truel."

I was surprised by the way he spoke immediately without being distracted from his own game. Study long enough and hard enough and it becomes automatic: gang up on the stronger.

Now humans have an intuitive grasp of social games, which raises the question: what would that algorithm feel like from the inside? Perhaps it gets expressed as sympathy for the underdog?

It might be possible to test this hypothesis. A truel is a three-player game that turns into a duel after one player has been eliminated. That is why you side with the weaker of your two opponents. The experimental psychologist setting up his experiment can manipulate the framing. If the game theory idea is correct, sympathy for the underdog should be stronger when the framing primes the idea of a follow-on duel.

For example if you frame America versus bin Laden as the battle of two totalising ideologies, will the world be dog-eat-dog Capitalist or beard-and-burka Islamic, that should boo... (read more)

3andrewc
Interesting idea: we support the underdog because if push came to shove we'd have a better chance of besting them than the top dog? There's a similar problem I remember from a kids' brainteaser book. Three hunters are fighting a duel, with rifles, to the death. Each has one bullet. The first hunter has a 100% chance of making a killing shot, the second a 50% chance, the third a 10% chance. What is the inferior hunter's best strategy?
0Larks
The normal answer (fire away from either) only works if we assume the other hunters are vindictive, rather than rational. If we assume they behave rationally, then the third hunter should target the best.
0Broggly
Sure, if you're acting simultaneously. If you're taking turns and you kill the best, then the mid-strength hunter will immediately fire on you. However, if one of them shoots the other, then you'll have the first shot against the remaining one.
0Larks
Yes, you're right. Larks@2009 hadn't studied any maths.
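For what it's worth, here's a quick Monte Carlo sketch of the turn-based version Broggly describes. The rules are my assumptions (weakest hunter fires first, unlimited bullets, and the two stronger hunters always aim at the most accurate living opponent), so treat it as an illustration rather than the puzzle's official answer:

```python
import random

# Hit probabilities for the three hunters in the brainteaser.
ACC = {"A": 1.0, "B": 0.5, "C": 0.1}
ORDER = ["C", "B", "A"]  # assumed firing order: weakest first

def play(c_opening):
    """One simulated truel. c_opening is C's first move: 'miss' or 'target_best'."""
    alive = {"A", "B", "C"}
    c_has_fired = False
    while len(alive) > 1:
        for shooter in ORDER:
            if shooter not in alive or len(alive) == 1:
                continue
            if shooter == "C" and not c_has_fired:
                c_has_fired = True
                if c_opening == "miss":
                    continue  # deliberately fire into the air
            # Assumed targeting rule: aim at the most accurate surviving opponent.
            target = max((h for h in alive if h != shooter), key=ACC.get)
            if random.random() < ACC[shooter]:
                alive.discard(target)
    return alive.pop()

def c_survival_rate(c_opening, trials=200_000):
    return sum(play(c_opening) == "C" for _ in range(trials)) / trials

print("C shoots at the best hunter first:", c_survival_rate("target_best"))  # ~0.136
print("C deliberately misses first:      ", c_survival_rate("miss"))         # ~0.141
```

Under these assumptions the deliberate miss edges out shooting at the best hunter (roughly 14.1% vs 13.6% survival for the weakest hunter), but the gap is small and depends heavily on the firing rules and on how rational the other two are assumed to be, which is exactly the point of the exchange above.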
-10[anonymous]

The following argument comes from an intro sociology text:

If there are three people competing, all of different strengths, it is worthwhile for the two weakest people to band together to defeat the strongest person. This takes out the largest threat. (Specific game-theoretic assumptions were not stated.)

Doesn't this basically explain the phenomenon? If Zug kills Urk, I might be next! So I should band together with Urk to defeat Zug. Even if Urk doesn't reward me at all for the help, my chances against Urk are better than my chances against Zug. (Under certain assumptions.)

5PhilGoetz
Yes, this was my first thought too. Yvain thought of it, and said it sounds kind of like group selection. It doesn't sound like group selection to me. How does it harm the group for one person to get very powerful? It is individual selection. When one man or small group dominates the tribe completely, and doesn't need your help, you don't get any of the good women.

BTW, EO Wilson has a book out supporting group selection.

"Among the Yanomamo" describes several hunter-gatherer bands. They all (my recollection) had leading men, but in the dysfunctional bands, the leading men were extremely powerful and there was no balance of power. They and their group of about a dozen supporters ruled through fear and exploited the rest of the band shamelessly. Life for those who were not in the key dozen at the top was significantly worse than life for people in the villages that had a different power structure; or at least different personalities in charge who didn't exhibit endless greed.

(A side note about group selection: This social pattern repeated itself in the groups that spawned off the original "infected" dysfunctional group; and all the other bands in the area hated and feared these dysfunctional groups. They were all aware that these particular bands were "sick" and dangerous and that it would be nice to wipe them out. Sounds like prime territory for some group selection.)
1abramdemski
I agree, Yvain said it first, and it doesn't sound like group selection. Concerning your group selection comment, that does sound plausible... but being relatively unfamiliar with tribal behavior, I would want to be sure that greedy genes were not spreading between groups before concluding that group selection could actually occur.
taw100

It's totally your second explanation. The stronger faction doesn't need you - the value of your joining them is really tiny. The weaker faction needs you a lot - if your joining significantly alters the balance of power, they will reward you significantly.

Because of these mechanics of power, both coalitions end up close to 50:50, and it's almost always in your best interest to join the slightly smaller one. For empirical evidence, look at any modern democracy, with either coalitions of parties (most of continental Europe) or of interest groups (USA). The coalitions tend to make no sense whatsoever - blacks and gays and labour and lawyers vs born-again Christians and rich people and rural poor and racists? Does it make any sense? Not at all, but the 50:50 balance is very close.

I believe, without that much evidence (I've seen some mentioned in the context of game theory, so I guess someone has it), that this kind of almost-50:50 coalition-making is very common in tribal societies, so it might very well have been very common in our ancestral environment. In which case sympathy for the underdog makes sense.

Also notice that this is just one of many forces; it will be decisive only in cases where the coalitions are almost even otherwise, just as predicted. If one coalition is far bigger than the other, or you're more aligned with one than the other, sympathy for the underdog won't be strong enough to overcome those.

6loqi
But we're talking about a zero-sum situation. The stronger faction needs you not to join the weaker faction exactly as much as the weaker faction needs you to join.
8Zvi
You don't always have to join Zug or Urk. Often you can let them fight it out and remain neutral, or choose how much of your resources to commit to the fight. Urk needs everything you have, whereas Zug would be perfectly happy to see you do nothing and in most conflicts most people stay out of it. Because of this Zug can't afford to go around rewarding everyone for not joining Urk the same way Urk can reward you for joining him.
1[anonymous]
I like the thought that we wish for balance and vote accordingly.

I'm not sure the evidence of the proposed bias supports the type of ev-psych responses being offered.

The only cases I'm aware of where underdog bias actually matters are of the Israel-Palestine type, not the Zug-Urk type. I-P poses no significant costs or benefits to the individual. Z-U poses tremendous costs or benefits to the individual. I don't imagine I-P-type support decisions meaningfully affect reproductive success. Unless there's evidence that people still side with the underdog when it really costs them something, these ev-psych explanations seem to be... (read more)

5Jack
It doesn't just raise the question, it begs the question.
0Psychohistorian
Not really. 'Why does "underdog beats overdog" make a more interesting story than "overdog beats underdog"?' is a very different question from 'Why do we tend to side with the underdog when no costs are imposed on us?' Providing an alternative mechanism but not being able to fully explain its causes is hardly begging the question.
3Jack
Yes, the following is true: 'Why does "underdog beats overdog" make a more interesting story than "overdog beats underdog"?' is a very different question from 'Why do we tend to side with the underdog when no costs are imposed on us?' But you're distinguishing two questions that weren't distinguished in either your comment or the post. The post asks why we tend to support the underdog. In the initial post the "supporting" consists of verbally saying "I support x" and then, later, identifying with the underdog in a story (i.e. Leonidas and bin Laden). You come back and say, well look, maybe our selection of fiction leads us to think the underdog must be the good guy. But as I understood the initial question, part of what we were seeking to learn was why we identify with the underdogs in stories. I take identifying with a fictional character to be equivalent to "siding with them without an imposed cost". So as I took the initial question, your explanation for some of the underdog phenomenon merely attributes the cause to other parts of the phenomenon and fails to get at the root cause.

Indeed, nearly every significant pattern in human behavior will have been documented in fiction, so one could claim fictional availability bias about lots of things (mating rituals, morality, language use, etc.), but it's all chicken and egg until you explain HOW THE FICTION GOT THAT WAY.
2thomblake
That's not begging the question. I don't see an argument being made with the conclusion as a premise. Perhaps you could be more explicit and concise? That "underdog beats overdog" makes an interesting story does not require that we side with the underdog. Just like "dog bites man" is less interesting than "man bites dog", regardless of who you side with.
0Jack
1. We side with the underdog.
1A. Polling on Israel-Palestine shows a shift in support given to the side that appears to be the underdog.
1B. Despite his being evil, we sort of think bin Laden is cool for taking on the US by himself.
1C. When we tell stories we tend to identify with and root for the underdog, e.g. Leonidas.

When we want to know why (1), I take it that any explanation that includes any of the sub-premises is question-begging. Psychohistorian's response was that (1) is caused by the fact that in our stories the underdog is always the side we identify with and root for, and this leads us to assume that the underdog is the "good side" and therefore side with the underdog. But as I took question (1), part of what needed explaining was underdog identification in stories. This mess about what makes an "interesting story" was added after the initial comment, and it confuses things. As I took the initial comment, the only evidence being presented was the vast collection of pro-underdog stories and the dearth of pro-overdog stories, and this was taken to be sufficient to show that we side with the underdog. I don't think this response is especially helpful, because part of our reason for even thinking that there is an underdog bias is the fiction. Throwing in "interesting" adds another step to the argument, and this version might not be begging the question anymore (though I'm not convinced of that either).
1thomblake
He might have meant 'begs the question' in the colloquial sense, which people really should stop doing.
3Jack
If I had meant this the comment would have made no sense.
gjm80

Here's another explanation (a bit like taw's). I don't find it terribly convincing either, but I don't see an outright refutation.

Suppose you have kin, or others whose welfare (in the relevant senses) is correlated with yours. Obviously you'll tend to help them. How much, and how urgently? Much more when they're in worse trouble. (As taw says, when they're in a strong position they don't need your help, so most likely your own more direct interests matter more to you.) So there's value in having a mechanism that makes you care more about people you'd have ... (read more)

It's worth bearing in mind how people actually behave: if Zug is so powerful and vengeful that opposing him would be flat-out suicide, people don't. They may quietly nurse grudges and wait for misfortune to befall him, but they don't take overt action. Siding with Urk is a lot more understandable once we note that people only actually do it when it is reasonably safe to do so.

Partly our empathy circuits. Humans like to help - and like to be seen to be helping. The underdog is the party that most obviously needs assistance.

Might I add Dunbar's number to this? Large powerful groups have a tendency to split (especially hunter-gatherer ones). And once they split, they often become each other's enemies. Oftentimes, it's better for the individual to be the underdog when the underdog is a group that is less likely to split.

Alternatively, let's ponder this situation: you're part of a group, a single one of many possible groups. Your group has interests in supporting the weaker groups if your group wishes to survive (of course you may be okay with having your group absorbed into ano... (read more)

1thomblake
Yes, I think this is exactly the sort of truel situation that is talked about elsewhere.

How about Terror Management Theory? By supporting a cause that is probably going to win anyway, we gain little. But by supporting an unlikely cause such as Leonidas at Thermopylae, there is an increased possibility that if we succeed our accomplishments will live on past us, because it is so incredible. In this way, we would become immortal. One prediction from this explanation is that the greater the disparity between the underdog and the overdog the larger the preference towards the underdog will be, which seems to be backed up empirically (see the increased preference for Slovenia vs. Sweden in the referenced study).

1Jack
"By supporting a cause that is probably going to win anyway, we gain little. But by supporting an unlikely cause such as Leonidas at Thermopylae, there is an increased possibility that if we succeed our accomplishments will live on past us, because it is so incredible." There are a couple problems with this. First, we might join Leonidas on those grounds but why would we root for him on those grounds? We're not going to be remembered that way. Second, if one wants to be remembered one is probably best off just being on the side of the winners. Winners write history. Finally, this could explain the motivation of the underdog but I don't think it explains the way we seem to be wired to root for the underdog (either biologically or culturally).

My first thought was to assume it was part of the whole alpha-male dominance thing. Any male that wants to achieve the status of alpha-male starts out in a position of being an underdog and facing an entrenched opposition with all of the advantages of resources.

But, of course, alpha-males outperform when it comes to breeding success, and so most genes are descended from males that confronted this situation, strove against "impossible" odds, and ultimately won.

Of course, if this is the explanation, then one would expect there to be a strong difference in how males and females react to the appearance of an underdog.

The proffered explanations seem plausible. What about with ideas though? I think it's social signaling: 'Look how clever and independent and different I am, that I can adopt this minority viewpoint and justify it.'

(Kind of like Zahavi's handicap principle.)

EDIT: It appears I largely stole this variant on signaling strategy from http://www.overcomingbias.com/2008/12/showoff-bias.html . Oh well.

Your mention of signaling gives me an idea.

What if the mechanism isn't designed to actually support the underdog, but to signal a tendency to support the underdog?

In a world where everyone supports the likely winner, Zug doesn't need to promise anyone anything to keep them on his side. But if one person suddenly develops a tendency to support the underdog, then Zug has to keep him loyal by promising him extra rewards.

The best possible case is one where you end up on Zug's side, but only after vacillating for so long that Zug is terrified you're going to side with Urk and promises everything in his power to win you over. And the only way to terrify Zug that way is to actually side with Urk sometimes.

It seems that supporting an underdog is a more impressive act - it suggests more confidence in your own abilities, and your ability to withstand retribution from the overdog. I'm not sure we do actually support the underdog more when a costly act is required, but we probably try to pretend to support the underdog when doing so is cheap, so we can look more impressive.

4SoullessAutomaton
In other words, if Zug believes you to be the kind of agent who will make the naively rational decision to side with him, he will not reward you. You then side with Zug, because it makes more sense. However, if Zug believes you to be the kind of agent who will irrationally oppose him unless bribed, he will reward you. You then side with Zug, because it makes more sense. This seems to be another problem of precommitment.
5Eliezer Yudkowsky
While my own decision theory has no need of precommitment, it's interesting to consider that genes have no trouble with precommitments; they just make us want to do it that way. The urge to revenge, for example, can be considered as the genes making a sort of believable and true precommitment; you don't reconsider afterward, once you get the benefits, because - thanks to the genes - it's what you want. The genes don't have quite the same calculus as an inconsistent classical decision theorist who knows beforehand that they want to precommit early but will want to reconsider later.
1loqi
But Zug probably doesn't care about just one person. Doesn't the underdog bias still require a way to "get off the ground" in this scenario? Siding with Urk initially flies in the face of individual selection.
5Eliezer Yudkowsky
Zug can be only slightly more powerful than Urk to start with, and then as more individuals have the adaptation, the power difference it's willing to confront will scale. I.e. this sounds like it could evolve incrementally.
1loqi
Ah, makes sense. The modern bias seems specifically connected to major differences, but that doesn't exclude milder origins.
5SoullessAutomaton
Social signalling explains almost everything and predicts little. By law of parsimony, supporting underdog ideas seems much likelier to me as a special case of the general tendency Yvain is considering.
9AnnaSalamon
In this case, the social signaling interpretation predicts a discrepancy between people's expressed preferences in distant situations and people's felt responses in situations where they can act. We can acquire evidence for or against the social signaling interpretation by e.g. taking an "underdog" scene, where a popular kid fights with a lone unpopular kid, and having two randomized groups of kids (both strangers to the fighters): (a) actually see the fight, as if by accident, nearby where they can in principle intercede; or (b) watch video footage of the fight, as a distant event that happened long ago and that they are being asked to comment on. Watch the Ekman expressions of the kids in each group, and see if the tendency to empathize with the underdog is stronger when signaling is the only issue (for group (b)) than when action is also a possibility (for group (a)). A single experiment of this sort wouldn't be decisive, but with enough variations it might.
1cousin_it
Your experiment wouldn't convince me at all because the video vs reality distinction could confound it any number of ways. That said, I upvoted you because no one else here has even proposed a test.

There are a lot of good thoughts in these comments, but they are scattered. I can see value in someone collecting them into an organized summary of the plausible arguments on this topic.

I don't think it is necessarily true that merely by joining the faction most likely to win you will share in the spoils of victory. Leaders distribute rewards based on seniority more than support. In a close contest, you would likely be courted heavily by both sides, providing a temporary boost in status, but that would disappear once the conflict is over. You will not have earned the trust of the winner, since your allegiance was in doubt. I don't think there is much to gain by joining the larger side late; you'll be on the bottom of society once the d... (read more)

1AspiringKnitter
That predicts this bias should be stronger in men. After all, more partners, past a certain point, isn't really helpful to women's reproductive success, plus I'd be surprised if men sought courageous mates (if they go and get themselves killed before your baby is born...). So, is this bias stronger in men?

In a confrontation between two parties, it's more likely that the stronger one will pose the greater threat to you. By supporting the underdog and hoping for a fluke victory, you're increasing your own survival odds. It seems we're probably evolved to seek parity -- where we then have the best chance of dominating -- instead of seeking dominant leaders and siding with them, which is a far more complex and less certain process.

Am I missing something? Also, it would be interesting to see whether females and males have the same reactions toward the overdog.

3steven0461
The problem with things like "seeking parity" is that your actions play only a small part in determining the outcome of the conflict, whereas your actions play a much larger part in determining consequences to your post-conflict status.
7Eliezer Yudkowsky
Not if others also side with the underdog, and punish those who side with the overdog - perhaps by viewing them as "craven" or "toadying" and treating them accordingly. People seem to have an odd respect for supervillains, but do we respect the henchmen?
0nescius
I also wonder about possible sex differences. Some information is available: The Appeal Of The Underdog:

This problem seems even to afflict Mencius Moldbug. His ideology of formalism seems to be based on ensuring absolute unquestionable authority in order to avoid any violence (whether used to overthrow an authority or to cement the hold of an existing one). At the same time he tries to base the appeal of his reactionary narrative on highlighting how reactionaries are "those who lost" (in the terms of William Appleman Williams, whom Mencius would rather not mention) and how the strong horse is universalism/antinomianism.

1PhilGoetz
What does that mean? The whole clause. And I don't understand why you equate universalism with antinomianism.
4Joe
Perhaps you figured this out since April, but the quoted clause makes sense in the context of Mencius' particular use of the terms "universalism" (roughly: what everyone in polite society believes these days in the West) which he categorizes as "antinomian", roughly: opposed to natural law.

Depending on the group size, the underdog might not be the underdog anymore with your support.

If it's a small group thing (or you have significant power) it is likely that you can determine which side wins.

The underdogs may have more at stake than the winners, and would be willing to give more in return for help. If Bob steals half of Fred's bananas every day, Bob gets to be a little better fed, and Fred dies.

If you help Fred out, he owes you his life, but Bob doesn't care nearly as much if it just means he has to go back to eating only his own bananas (that or you kill him).

If you choose to help Bob, your help isn't worth anything since he had it under control anyway.

2orthonormal
I think this instinct may in fact be evolutionarily optimized for conflicts between individuals; in most group conflicts in the ancestral environment, you probably already belong to one of the sides. But yes, it does seem to generalize too readily to conflicts where you personally wouldn't sway the balance. EDIT: How could we test any of the above theories? My theory seems to predict that describing the conflict as "one single entity versus another" (and triggering modes of thought optimized for third parties to single combat) will give a stronger underdog bias than describing a collection of entities on each side (with one collection much larger than the other).

Theory: supporting the underdog is a relatively costless longshot bet. Prediction: it will primarily occur in situations when opposing the overdog (verbally) can be done with impunity or secretly.

Overdog wins: no real consequences.

Underdog wins: "I supported you from the beginning! Can I be your trusted lieutenant?"

No one supports the underdog if they're a member, or a fan, of the overdog – only the unaffiliated are free to root for the underdog.

Roko00

"By comparison, my informal experiments trying to teach people relevant facts about the region's history changed opinion approximately zero percent."

ROFL... Maybe you're trying with people who are either too emotionally involved or not clever enough?

1JulianMorrison
OK, what's YOUR position, and how much do you know? Then Yvain can dump historical facts on you, and we'll see how far you shift and in what direction.
4Roko
So, my position is:
* Israel/Palestine is a significant global risk. Their squabbling and fundamentalism could easily escalate to kill us all.
* Therefore, I am for peace in the Middle East irrespective of which faction gains most through that peace.
This is quite a utilitarian position. But that isn't much of a problem for me, as my emotional involvement is pretty low. I can afford to be cool and calculating about this one. What do I know? Mostly facts gained through casual Wikipedia'ing.
* Israel is more competent than the Arabs; again and again they have proved to be the side with the most intelligence and military effectiveness. E.g. Yom Kippur, Osiraq, etc.
* That does not mean that Israel are all nice guys.
* Nor does it mean that the Arab nations are nice guys.
* For me, living in an Arab country would be hell. They disvalue freedom, equality, rational secular enlightenment values, knowledge - basically everything I stand for. I am therefore weakly incentivized to make sure that the Arabic/Islamic culture complex doesn't get too powerful.
* Israeli secret services etc. are creepy. They kidnap people. Not cool. But overall this seems to be balanced by the fact that Israel contains a lot of people I would probably like - people who share my values.
2loqi
This is indeed a pretty utilitarian position. I think the objection you're likely to run into is that by evaluating the situation purely in terms of the present, it sweeps historic precedents under the rug. Put another way, the "this conflict represents a risk, let's just cool it" argument can just as easily be made by any aggressor directly after initiating the conflict.
3Eliezer Yudkowsky
Yup. If you don't punish aggressors and just demand "peace at any price" once the war starts, that peace sure won't last long.
2Roko
If I expected the current geopolitical situation to continue for a long time, I would agree. But neither of us do; we both place a high probability on either FAI or uFAI within 100 years; the top priority is to just survive that long. Also, even if you assign some probability to no singularity any time soon, the expected rewards for a situation where there is a singularity soon are higher, as you get to live for a lot longer, so you should care more about that possibility.
1JulianMorrison
(I yesterday heard someone who ought to know say AI at human level, and not provably friendly, in 16 years. Yes, my jaw hit the floor too.) I hadn't thought of the "park it, we have bigger problems", or "park it, Omega will fix it" approach, but it might make sense. That raises the question, and I hope it's not treading too far into off-LW-topic territory: to what extent ought a reasoning person act as if they expected a gradual and incremental change in the status quo, and to what extent ought their planning to be dominated by expectation of large disruptions in the near future?
1Roko
Well, if you actually believe the kinds of predictions that say the singularity is coming within your lifetime, you should expect the status quo to change. If you don't, then I'd be interested to hear your argument as to why not.
2JulianMorrison
The question I was struggling to articulate was more like: should I give credence to my own beliefs? How much? And how to deal with instinct that doesn't want to put AI and postmen in the same category of "real"?
3Roko
If you don't give credence to them ... then they're not your beliefs! If you go to Transhumanist events, profess to believe that a singularity is likely in 20 years, but then when someone extracts concrete actions you should take in your own life that would be advantageous if and only if the singularity hypothesis is true, and you feel hesitant, then you don't really believe it.
1Eliezer Yudkowsky
Who on Earth do you think ought to know that?
2JulianMorrison
Shane Legg, who was at London LW meetup.
0Roko
Shane expressed this opinion to me too. I think that he needs to be more probabilistic with his predictions, i.e. give a probability distribution. He didn't adequately answer all of my objections about why neuro-inspired AI will arrive so soon.
3JulianMorrison
From what he explained, the job of reverse engineering a biological mind is looking much easier than expected - there's no need to grovel around at the level of single neurons, since the functional units are bunches of neurons, and they implement algorithms that are recognizable from conventional AI.
1Eliezer Yudkowsky
This sounds like a statement made by some hopeful neuromodeler looking for funding rather than a known truth of science.
3JulianMorrison
You want the details? Ask the pirate, not the parrot. Rawwrk. Pieces of eight.
0Roko
Yes, but when we got into detail about how this might work and what the difficulties might be, I had some significant objections that weren't answered.
1whpearson
I think it would make an interesting group effort to try to estimate the speed of neuro research, to get some idea of how fast we can expect neuro-inspired AI. I'm going to try to figure out the number of researchers working on figuring out the algorithms for long-term changes to neural organisation (LTP, neuroplasticity and neurogenesis). I get the feeling it is a lot less than the number working on figuring out short-term functionality, but I'm not an expert and not submerged in the field.
1Nick_Tarleton
Please do; this sounds extremely valuable.
0Roko
I would do this with Shane, but I think it might be off-topic at the moment.
1Eliezer Yudkowsky
Ja, going off-topic.
2JulianMorrison
Are you sure you're not playing "a deeply wise person doesn't pick sides, but scolds both for fighting"?
1Roko
Maybe. Though, I am not consciously doing this. See my above response to EY.
[anonymous]-30

Maybe we just don't like overdogs, bullies in the schoolyard. They are randomly dangerous.

-1[anonymous]
Yvain suggests a bias towards underdogs; I am suggesting a bias away from overdogs. Why am I being voted down?
3Eliezer Yudkowsky
You don't understand evolutionary psychology. You also don't know how to support an argument. "Just don't like"? That is precisely that which is to be explained. Nor are they randomly dangerous. Look, I'm sorry, but you're on the wrong blog here. Read if you like, of course, but I don't think you're ready to be commenting. This is why you are often voted down. Sorry.
4Hans
I read your comment and I immediately wanted to vote up Marshall's original comment. After all, he's the underdog being criticized and chased away by the founder and administrator of this blog. In the end, I didn't, probably for equally irrational reasons.
4Eliezer Yudkowsky
(Blinks.) I have to say, that frame on the whole problem had never occurred to me. No wonder online communities have such a hard time developing membranes.
5Paul Crowley
It's worse here, because for some reason when people like Marshall claim that "rationalist" means "treats any old crap like it was a worthy contribution", people here are sufficiently wary of confirmation bias to take it more seriously than it deserves.
2Eliezer Yudkowsky
Yeah, I've noticed. If I were to make a list of the top 3 rationalist errors, they'd be overconfidence, overcomplication, and underconfidence. Either that or there's some kind of ancient echo of protecting the underdog in an effort to keep the tribal power balance.
-4[anonymous]
You are actually being a little bit of a bully yourself, Eliezer. I would have thought that dialogues and conversations were an important part of being a rationalist. And disagreement. I would not have thought that underexplained decrees and conformity played so high a role. But I have your word for it that I am wrong. So be it. From now on I will always smile when I hear the word rationalist.
8Psychohistorian
He didn't give you his word that you are wrong. He stated that your claim was not rigorous and that crucial parts of it required supporting evidence that you failed to provide. He also claimed that you do not understand evolutionary psychology. (Edit) You have provided no evidence to dispute this claim. Of course you are probably not making an ev-psych argument, so this comment is probably not necessary, but if it's wrong and you are, you might consider rebutting it. Rational responses to this would include providing evidence in support of your claim or explaining how this predisposition might form. "Maybe we just don't like overdogs" explains exactly nothing, except that we don't like overdogs, which has already been stated. Conversations are important, but making statements with no evidentiary claim and, well, no claim that adds meaning, are not.
3nescius
One could interpret the phrase to suggest that focus in this forum may be misleadingly directed towards the idea of support of underdogs rather than opposition to overdogs (Vandello's "top dogs"), to which underdog support may be secondary. The phenomena are not inversions of each other. At least, I haven't taken dislike of overdogs as being granted by the assertions of a tendency toward support of underdogs. Perspective changes are often useful. This alternate notion may lead somewhere, while conflict resulting from an ungenerous (if accurate) understanding may not always be as fruitful as this particular incident appears to (heartwarmingly) be. The linked paper says: [edit: Note further discussion of "schadenfreude" on page 1614.] My opinion of overdog spite, without having conducted or surveyed studies: I think it exists and has a not insubstantial effect on underdog support, but my guess is that the primary factor or factors in underdog support are not dependent on it. Thanks anyway, Marshall, for the idea, whether you intended it. I'll keep it nearby as I consider underdog support.
1[anonymous]
Yes - that was one of my points.
-1[anonymous]
Thanks for trying to explain the rules of the game to me. I have not at any point equated rationality with the scientific model. Scientific psychology is trivial (and 20% wrong) and inapplicable to living. Try to stumble on happiness after reading Stumbling on Happiness if you don't believe me. I do not think the long list of just-so stories from the comments, with various tailored scripts, is evidence of anything other than following the bandwagon.

My story is taken from the schoolyard. My evidence is present to everyone who has been to school and seen bullies at work. The evidence of your own eyes and your own experience. But this is not your language-game. Fair enough. And stupid of me to try to extend the rules. Incommensurability is the name of that game. As I said in a comment under "Truels", you have the option of metacommenting and being shot, or you can run away.

I regret that no-one criticises Eliezer's high-handedness - that does not speak well of your community. And it puts to shame all thoughts of FRIENDLINESS under his tutelage.