If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.
I have a half-baked idea that trying to be rational all by oneself is a slightly pathological condition. Humans are naturally social, and it would make sense to distribute cognition over several processors, so to speak. This would explain the tendency toward polarized behavior that I notice in relationships: if my partner adopts the position that we should go on vacation as much as possible, I almost automatically assume the role of worrying about money, for example, and we then work out a balanced solution together. If each of us decided on our own, our opinions would be much less polarized.
I could see how it might benefit a group overall for some of its members to adopt low-probability beliefs.
Is there any merit to this idea? Considering the well-known failures of group rationality, I wonder if it is something that has long since been disproved.
Mercier & Sperber made a similar argument, noting that things which look like biases in an individual (such as confirmation bias) can actually be beneficial for the decision-making of a group. An excerpt:
...