the source of differential reproduction.
OK, so let me state my argument maximally clearly. People who can convince others that they make good, trustworthy allies will find it easier to form alliances. This is beyond reasonable doubt - it is why we are so concerned with attributing motives to people, with analyzing people's characters, and with sorting people into ethical categories (she is a liar, he is brave, etc.).
If there is a cheap but not completely free way to signal that you have the disposition of a moral, trustworthy person, for example by rooting for the underdog in faraway conflicts, then we should expect people to have the trait of displaying that signal.
All that remains is to conclude that rooting for the underdog rather than the overdog really does signal to others that you are a good person, one who is more likely than average to side with whoever has the moral high ground rather than whoever has the most power. If the human brain could lie perfectly, rooting for the underdog in a faraway conflict would carry no information about what you will do in a near conflict. But the human brain somehow didn't manage to be a maximally efficient lying Machiavelli, so someone who displays moral opinions about faraway conflicts presumably behaves at least a little more ethically in near conflicts.
The mechanism here is that there is a weak connection between what we say about Israel/Palestine [the faraway conflict] and how we behave in our personal lives [the nearby conflict]. My experiences with people who are, e.g., pro-Palestine bear this out - they tend to be that Guardian-reading, almost-hippie type, who might be a little fuzzy-headed but are probably more likely to help a stranger. This weak connection means that you can infer someone's behavior in a near situation from what they say about a far situation. The survival advantage of claiming to support the underdog follows.
Another possible mechanism is that by supporting the underdog, you put yourself slightly at risk [the overdog won't like you] so this is a costly signal of strength. This I find a little less convincing, but still worth considering.
"If there is a cheap but not completely free way to signal that you have the disposition of a moral, trustworthy person, for example by rooting for the underdog in faraway conflicts, then we should expect people to have the trait of displaying that signal."
During the time span when the underdog tendency was presumably evolving, I doubt that there was any awareness of far-away conflicts that didn't touch the observer. Awareness of geographically distant conflicts is a relatively modern phenomenon.
Here is an alternative explanation. The inclinatio...
Yvain can’t make heads or tails of the apparently near-universal human tendency to root for the underdog. [Read Yvain’s post before going any further.]
He uses the following plausible-sounding story from a small hunter-gatherer tribe in our Era of Evolutionary Adaptedness to illustrate why support for the underdog seems to be an antiprediction of the standard theory of human evolutionary psychology:
Yvain cites an experiment where people supported either Israel or Palestine depending on who they saw as the underdog. This seems to contradict the claim that the human mind is well adapted to its EEA.
A lot of people tried to use the “truel” situation as an explanation: in a game of three players, it is rational for the weaker two to team up against the stronger one. But the choice of which faction to join is not a truel between three approximately equal players: as an individual you will have almost no impact on which faction wins, and if you join the winning side you won’t necessarily be next on the menu: you will have about as much chance as anyone else in Zug’s faction of doing well if there is another mini-war. People who proffered this explanation are guilty of not being more surprised by fiction than by reality. To start with, if this theory were correct, we would expect to see soldiers defecting away from the winning side in the closing stages of a war... which, to my knowledge, is the opposite of what happens.
SoulessAutomaton comes closest to the truth when he makes the following statement:
Yes! Draw Distinctions!
I thought about what the answer to Yvain’s puzzle was before reading the comments – and decided that Robin’s Near/Far distinction is the answer.
When you put people in a social-science experiment room and tell them, in the abstract, about the Israel/Palestine conflict, they are in “far” mode. This situation is totally unlike having to choose which side to join in an actual fight – where your brain goes into “near” mode and you quickly (I predict) join the likely victors. This explains the apparent contradiction between the Israel experiment and the situation in a real fight between Zug’s faction and Urk’s faction.
In a situation where there is an extremely unbalanced conflict that you are “distant” from, there are various reasons I can think of for supporting the underdog: but the common theme is that when the mind is in “far” mode, its primary purpose is to signal how nice it is, rather than to actually acquire resources. Why do we want to signal to others that we are nice people? We do this because they are more likely to cooperate with us and trust us! If evolution built a cave-man who went around telling other cave-men what a selfish bastard he was... well, that cave-man wouldn't last long.
When people support, for example, Palestine, they don't say "I support Palestine because it is the underdog"; they say "I support Palestine because they are the party with the ethical high ground; they are in the right, and Israel is in the wrong". In doing so, they have signalled that they support people for ethical reasons rather than self-interested reasons. Someone who is guided by ethical principles rather than self-interest makes a better ally. Conversely, someone who supports the stronger side signals that they are more self-interested and less concerned with ethical considerations. Admittedly, this is a signal that you can fake to some extent: there is probably a tradeoff between the probability that the winning side will punish you and the value that supporting someone for ethical reasons carries. When the conflict is very close, the probability of your becoming involved makes the signal too expensive. When the conflict is far, the signal is almost (but not quite) free.
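The tradeoff just described can be made concrete with a toy model (my own construction, with entirely hypothetical numbers, not anything from the original post): let the signal's value be a fixed reputation benefit, and let its cost be the probability of actually becoming involved in the conflict multiplied by the penalty the winning side would impose on you. As a conflict moves "far", the involvement probability drops toward zero and the signal becomes nearly free.

```python
# Toy model of the underdog-signalling tradeoff. All numbers are
# hypothetical and chosen only to illustrate the near/far asymmetry.

def signal_payoff(reputation_benefit, involvement_prob, penalty):
    """Expected value of publicly backing the underdog.

    reputation_benefit: value of appearing ethically motivated to potential allies.
    involvement_prob:   chance the conflict actually reaches you ("near" vs "far").
    penalty:            cost the winning side imposes on you if it does.
    """
    return reputation_benefit - involvement_prob * penalty

# A "far" conflict: almost no chance of involvement, so the signal is nearly free.
far = signal_payoff(reputation_benefit=1.0, involvement_prob=0.01, penalty=10.0)

# A "near" conflict: the same signal, but now involvement is likely.
near = signal_payoff(reputation_benefit=1.0, involvement_prob=0.8, penalty=10.0)

print(far)   # 0.9  - worth sending
print(near)  # -7.0 - too expensive
```

The point of the sketch is only that the same fixed benefit flips from profitable to ruinous as the involvement probability rises, which is the claimed reason the signal is displayed in far mode but not in near mode.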
You also put yourself in a better bargaining position for when you meet the victorious side: you can complain that they don't really deserve all their conquest-acquired wealth because they stole it anyway. In a world where people genuinely think that they are nicer than they really are (which is, by the way, the world of humans), being able to frame someone as being the "bad guy" puts you in a position of strength when negotiating. They might make concessions to preserve their self-image. In a world where you can't lie perfectly, preserving your self-image as a nice person or a nice tribe is worth making some concessions for.
All that remains to explain is what situation in our evolutionary past corresponds to hearing about a faraway conflict (like Israel/Palestine for Westerners who don’t live there or have any real stake in it). This I am not sure about: perhaps it would be like hearing of a distant battle between two tribes? Or a conflict between two factions of your tribe that occurs in such a way that you cannot take sides?
My explanation predicts that if you performed a social-science experiment in which people felt close enough to the conflict to be personally involved, they would support the likely winner. Such an experiment might require making people very frightened, though, and thus might not pass ethics-committee approval.
The only direct experience I have of “near” tribal conflicts is from my time at school: whenever some poor underdog was being bullied, I felt compelled to join in with the bullying, in exactly the same “automatic” way that I feel compelled to support the underdog in Far situations. I just couldn’t help myself.
Hat-tip to Yvain for admitting he couldn’t explain this. The path to knowledge is paved with grudging admissions of your ignorance.