A "truel" is something like a duel, but among three gunmen. Martin Gardner popularized a puzzle based on this scenario, and there are many variants of the puzzle which mathematicians and game theorists have analyzed.
The optimal strategy varies with the details of the scenario, of course. One take-away from the analyses is that it is often disadvantageous to be very skillful: a very skillful gunman is a high-priority target for the other two.
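To make that concrete, here is a minimal Monte Carlo sketch of one common variant. The hit probabilities (1/3, 2/3, 1.0), the firing order, and the naive "shoot the most accurate living opponent" strategy are assumptions chosen for illustration, not the only setup that has been analyzed.

```python
import random

# Hit probabilities for the three gunmen. These particular numbers, the
# firing order, and the targeting rule below are illustrative assumptions.
ACCURACIES = {"A": 1 / 3, "B": 2 / 3, "C": 1.0}


def run_truel(rng: random.Random) -> str:
    """Simulate one truel; return the name of the sole survivor."""
    alive = ["A", "B", "C"]  # fire in turn, weakest shooter first
    while len(alive) > 1:
        for shooter in list(alive):
            if shooter not in alive or len(alive) == 1:
                continue  # shooter already killed this round, or truel over
            # Naive strategy: aim at the most accurate opponent still standing.
            target = max((p for p in alive if p != shooter),
                         key=lambda p: ACCURACIES[p])
            if rng.random() < ACCURACIES[shooter]:
                alive.remove(target)
    return alive[0]


def survival_rates(trials: int = 100_000, seed: int = 0) -> dict:
    """Estimate each gunman's probability of being the last one standing."""
    rng = random.Random(seed)
    wins = {name: 0 for name in ACCURACIES}
    for _ in range(trials):
        wins[run_truel(rng)] += 1
    return {name: count / trials for name, count in wins.items()}


if __name__ == "__main__":
    # Typical result: A ~0.31, B ~0.54, C ~0.15 -- the best shot fares worst.
    print(survival_rates())
```

Under these assumptions the perfect marksman survives only about 15% of the time, because he is everyone's first target; and in variants where the weakest shooter is allowed to fire into the air, the weakest comes out ahead of both rivals.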
The environment of evolutionary adaptedness undoubtedly contained multiplayer social games. If some of these games had a truel-like structure, they may have rewarded mediocrity. This might be an explanation of psychological phenomena like "fear of success" and "choking under pressure".
Robin Hanson has mentioned that there are costs to "truth-seeking". One example cost: a truth-seeker may be unable to convincingly declare "I believe in God" and so gain acceptance into a religious community. I think truels are a game-theoretic structure suggesting that there are analogous costs to (short-sighted) "winning", just as there are costs to "truth-seeking".
How can you identify truel-like situations? What should you (a rationalist) do if you might be in a truel-like situation?
I found the post interesting... except for this penultimate paragraph; I don't think there's a good analogy here. An evolutionary motive for "choking" or signaling choking is an interesting enough observation on its own.
LW is about refining the art of thinking; there's no need to strain for segues.
(To be specific about where the analogy is strained: one question is whether common human goals are likely to conflict with epistemic rationality; the other is about signaling short-term failure as a road to long-term success under clearly defined criteria. The latter is standard instrumental-versus-terminal-goals territory, which is not nearly as thorny as alleged instrumental epistemic stupidity.)
Thanks. I'll try to avoid strained segues in the future.