If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.
It seems to me that Viliam's complaint is not that there would be more to talk about, but that more talk would be politicized.
I don't know for sure whether it was (I don't think I ever paid that much attention to the politics threads), but here's one way it could have been: suppose LW has a few very vocal neoreactionaries[1] and that most of the non-neoreactionaries are not very interested in talking about neoreaction[2]. If those few neoreactionaries ensure that every political discussion is packed with NRx stuff, then those political discussions will be annoying to everyone else, because in order to read the bits they're interested in they have to wade through lots of NRx comments (and perhaps, though here they may have only themselves to blame, lots of anti-NRx responses).
[1] I think there is some evidence that this is actually so.
[2] This seems likely to be true, but I have no evidence. (I don't mean that most non-NRx people never want to talk about NRx; only that for most of them the optimal amount of NRx discussion is rather small.)
What about when you see a thread that you would want to read, but in which a few people obsessed with things you find uninteresting have posted hundreds of comments you don't want to read?
Of course it doesn't need to be neoreactionaries doing this. It could be social-justice types seizing every possible opportunity to point out heteronormative kyriarchal phallogocentric subtexts. It could be people terrified about AI risk turning every discussion of computers doing interesting things into debates about whether We Are All Doomed -- or people skeptical about AI risk complaining incessantly about how LW promotes paranoia about AI risk. It could be Christians proposing Jesus as the answer to every question, or atheists leaping on every case of suffering or successful scientific explanation to remind us that it's evidence against God. Etc., etc., etc.
It might be. Or it might be so only in the sense that for an alcoholic, having a glass of whisky is a significant opportunity to practice the discipline of self-control. (That is: in principle it might be but in practice the outcome might be almost certain to be bad.)
What do you mean by that? Do you mean that they're not interested in becoming less wrong about the issue, or that they only want to become less wrong to the extent that it doesn't involve being similar to those weird NRxs?