I am searching for an old LessWrong post about something like "community takeover": the idea that a successful community (rationalist/AI risk) could attract people from outside who would attempt to co-opt or hijack its agenda to focus on something else.

This is inspired by the suggestion by Carla Zoe Cremer (FHI) and Luke Kemp (CSER) that Existential Risk Studies should look into Wealth Inequality.

Thanks in advance

Søren


3 Answers

matto

150

This probably isn't the post you're looking for, but it describes a similar process: https://meaningness.com/geeks-mops-sociopaths

My guess is that this is actually what the OP is looking for.

1 Søren Elverlin
Yes, it was in fact this post. Thank you!

Viliam

70

Political topics attract participants inclined to use the norms of mainstream political debate, risking a tipping point to lower-quality discussion

Political topics elicit lower-quality participation, holding the set of participants fixed. This is the thesis of "politics is the mind-killer".

Here's a separate effect: Political topics attract mind-killed participants. This can happen even when the initial participants are not mind-killed by the topic.

Of course, at the same time, if we call ourselves rationalists, we had better be able to think rationally about topics which are typically mind-killers. That's not to say the conversation has to happen here on LW, but if politics intersects with a high-impact area, it's probably worth talking about within the community.

5 Viliam
We need some rules on how to proceed when some participants believe that the others are being mind-killed, and the accused object: "no, I am just rationally telling it like it is, litany of Tarski, etc." Obviously, both groups will insist that they are the rational ones.

* Accusing others will likely escalate to: "you only call me mind-killed because you are mind-killed and therefore unable to rationally discuss my arguments."
* On the other hand, people who notice they are getting mind-killed and voluntarily abstain from the debate may be leaving the floor to those who are worse at noticing this.

If you notice that other participants in the debate are mind-killed, what can you do to avoid either of these two scenarios?
5 Yitz
Simply have a hard rule not to engage with ad hominems (within the confines of those sorts of discussions; if you want to insult people on your own time, go ahead, I guess). It's hard for me to imagine a situation where telling the other person "you are being mind-killed" moves the discussion forward when you could instead debate the object-level issue. If the other person refuses to change their mind based on new data, then you can ask them to explain why they didn't update their position (and they may have a totally reasonable response!). If you can uncover the cause of your disagreement, then problem solved. If not, you can update on the other person being a worse conversationalist than expected, and avoid future conversations (with the individual, if not with the position they tried to represent), if you so wish.
5 Viliam
Although in theory every argument could get a counter-argument (or a request for clarification, or whatever), in practice people seem to have some "time budget" they want to spend on a specific discussion. They leave when they run out of time, and maybe get impatient when they see the time running out and the topic not yet resolved.

My impression is that people who seem to be mind-killed often write a lot. For example, there could be a discussion where some twenty people write one or two comments each, and then two or three people write a dozen comments each, defending a minority position. Of course, being in a minority position does not necessarily mean being wrong. It's just that when I see someone writing 10 comments, replying to 10 people who disagree with him, I feel like: what's the point of joining this? If 10 people didn't convince him, what is the probability that I will? And I feel like the person has decided to spend much more time on the topic than I would like to. And a meta comment like "dude, you already got 10 explanations why your position is wrong, and yet you keep posting the same thing, maybe reflect on that first?" is... well, almost an ad hominem.

(In terms of user interface, perhaps there should be some way to say "my objection to your position is already expressed in a comment written by a third party, so instead of writing my own comment which would essentially state the same thing, I would like to see you address that comment." Not just upvoting the comment, because that just means you liked it, but something like "your response to that comment is my condition for participating in this debate, because I would pretty much write the same thing.")
2 Dagon
obXKCD: https://xkcd.com/386/. To the original point, sometimes it DOES matter, if it's a wrongness that's repeated enough to change the tone of communities you otherwise enjoy. You still need the skill of noticing that you're at an impasse and not actually collaboratively seeking truth. But then you have to find some middle ground between stepping away and continuing to have good-faith (on your part; it takes charity to attribute it to your opponent) object-level discussions. I'd generally advise writing up a paragraph on your disagreement and stating that you're done with the topic for now. Post it once per repetitious thread where this is happening.
2 Dagon
The key is to recognize that it's not your job to fix the mind-killed state, just to notice and avoid it. Drop the soldier mindset - it does you no good to claim that others are more mind-killed than you are. Instead, notice "this is a difficult topic to discuss in this forum, and I'm unable to update on any models presented this way." Notice that you're not getting value, and evaluate whether ANYONE is improving their models or increasing the correctness of their beliefs. If not, it's probably best to keep it off that forum.

You SHOULD then strive to fix it in yourself, and perhaps discuss or research it among people you trust to do so in ways compatible with your rationality. Which probably isn't a fully-public forum.