It seems clear there is some degree of truth to this. It may help to introduce agree/disagree voting not just for comments but also for posts. That way people could express their (dis)agreement without also signalling whether they want the post to appear on the frontpage. A rough sketch of what that might look like follows below.
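As a rough illustration (hypothetical names and structure, not LessWrong's actual schema), each post vote could carry two independent axes, with only the karma axis feeding into frontpage ranking:

```typescript
// Hypothetical two-axis vote record for posts; names are illustrative,
// not LessWrong's actual data model.
type Axis = -1 | 0 | 1;

interface PostVote {
  postId: string;
  voterId: string;
  karma: Axis;      // "this should (not) be on the frontpage"
  agreement: Axis;  // "I (dis)agree with the post's claims"
}

// Only the karma axis affects visibility; the agreement tally is shown
// alongside the post but never feeds into ranking.
function frontpageScore(votes: PostVote[]): number {
  return votes.reduce((sum, v) => sum + v.karma, 0);
}

function agreementScore(votes: PostVote[]): number {
  return votes.reduce((sum, v) => sum + v.agreement, 0);
}
```

Keeping the agreement tally purely informational would let readers register disagreement without suppressing a post's visibility.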
I pretty much agree. I just wrote a comment about why I believe LLMs are the future of rational discourse. (Publishing this comment as comfort-zone expansion around getting downvoted, so feel free to downvote.)
To be more specific about why I agree: I think the quality/agreement axes on voting help a little, but for any acrimonious topic where many users have a dog in the fight, they're not enough.

In the justice system, a judge is expected to recuse themselves when they have a conflict of interest. A judge, someone who has trained much of their career for neutral, rules-based judgement, still isn't trusted to be neutral in certain cases. Now consider LessWrong: untrained users make decisions in just a few minutes (as opposed to a trial lasting hours or days), without any rules to go by, often in their off-hours when they're cognitively fatigued and less capable of System 2 overrides, and they're susceptible to Asch-conformity effects from early exposure to other users' judgements.

There's a lot of content here about how to overcome your biases, which is great, but there simply isn't enough anti-bias firepower in the LW archives to consistently conquer the granddaddy of biases: myside bias. We aren't trained to consistently apply specific anti-myside-bias techniques backed by RCTs and certified by Cochrane, not even close, and it's dangerous to overestimate our bias-fighting abilities.
LessWrong.com has long been held up as a bastion of rational thought and high-minded discourse. The community prides itself on fostering rigorous intellectual discussion grounded in the principles of rationality and Bayesian reasoning. On closer examination, however, the platform's structure and dynamics often undermine these aspirations, reducing what should be a forum for genuine inquiry to an elaborate popularity contest.
The Illusion of Rational Discourse
At its core, LessWrong is designed to reward content that garners community approval. The upvote system, intended to highlight quality contributions, inadvertently prioritizes posts that resonate with the majority. This mechanism is not unique to LessWrong; it is a staple of social media platforms across the internet. However, in a space dedicated to rationality, the consequences are particularly troubling.
The Tyranny of the Upvote
Upvotes, while seemingly innocuous, exert a profound influence on the nature of discourse. Posts that align with prevailing community sentiments are more likely to be upvoted, while those that challenge the status quo or present unpopular viewpoints often languish in obscurity. This dynamic encourages conformity and discourages the very skepticism and critical thinking that are hallmarks of true rational inquiry.
Moreover, the pursuit of upvotes can lead to intellectual echo chambers, where certain ideas and perspectives are continually reinforced, while dissenting voices are marginalized. In such an environment, the collective wisdom of the crowd can easily devolve into collective bias, stifling the diversity of thought that is essential for robust rational discourse.
The Pitfalls of Social Clout
LessWrong's emphasis on social clout further exacerbates the issue. Users with higher karma scores, accrued through upvotes, are often accorded greater credibility and influence within the community. While this might seem like a reasonable way to recognize valuable contributions, it also introduces a hierarchy that can skew discussions.
High-karma users may be more likely to have their posts and comments upvoted simply due to their established reputation, rather than the intrinsic merit of their ideas. Conversely, new or lower-karma users, regardless of the quality of their contributions, may struggle to gain visibility. This dynamic reinforces existing power structures and can discourage fresh perspectives from emerging.
The Quest for True Rationality
For LessWrong to truly fulfill its mission as a community dedicated to rationality, it must critically examine the ways in which its own systems and structures shape discourse. The platform should strive to create an environment where ideas are evaluated on their merit rather than their popularity. This could involve rethinking the upvote system, promoting diverse viewpoints, and actively encouraging critical scrutiny of widely held beliefs.
In conclusion, while LessWrong aspires to be a haven for rational thought, it must confront the reality that its current model can inadvertently foster a popularity contest, rather than genuine intellectual engagement. By addressing these structural issues, the community can move closer to realizing its vision of a space where reason and evidence truly reign supreme.