It may be true that well-kept gardens die from pacifism, but it's also the case that moderation can kill communities.

"Any community that really needs to question its moderators, that really seriously has abusive moderators, is probably not worth saving. But this is more accused than realized, so far as I can see."

There speaks the voice of limited experience. Or perhaps LiveJournal, Reddit, Google+ and Facebook really are not worth saving?

I've seen enough discussion forums killed by abusive moderators that I look carefully before signing up for anything these days. When I write a lengthy response like this one, I post it on my own site rather than risk having it silently deleted for disagreeing with a moderator.

However, I've also been a moderator, and I've seen situations where moderation was desperately needed. In my experience on both sides of the issue, there are some basic criteria for moderation that need to be met to avoid abuse:

  • Moderation needs to be visible. Comments that are removed should be replaced with a placeholder saying so, and not simply deleted. Otherwise there will be accusations of repeated unfair deletion, and any act of moderation will quickly snowball into an argument about how much censorship is occurring, and then an argument about whether that argument is being censored, and so on until everyone leaves the site.
  • Moderation needs to be accountable. Moderators must have individual accounts, and every moderation action needs to be attributable to one of them. Without this, it's pretty much impossible to identify an abusive moderator. (A rough sketch of what such an audit trail might look like follows this list.) I recently got banned from a subreddit for asking which rule I had broken with a previous posting, and there was no way to find out who had banned me.
  • Moderation needs to be consistent. There needs to be a description of what the criteria for moderation actually are. It doesn't need to be legalistic or all-encompassing, and it may change over time, but it needs to exist. Some people feel that writing down the criteria just encourages people to argue about them. The alternative, though, is that person A gets banned or censored for doing something that person B does all the time, which breeds far more ill-will and is ultimately worse for the community.
  • Moderation rules need to apply to the moderators. A special case of the above, but it deserves highlighting. Few things are more infuriating than being banned for doing something that the moderator who banned you does all the time. Once this kind of moderation starts happening (e.g. Gizmodo), the atmosphere becomes extremely toxic.
  • Moderation needs an appeals process. There are abusive power-tripping assholes out there, and they love to find their way onto forums and become moderators. You need a mechanism for identifying any who end up on your moderation team, and an appeals process is that mechanism. Ideally, appeals should be resolved by someone who isn't part of the moderation team; failing that, by someone other than the person being complained about, obviously.
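
To make the first two criteria concrete, here is a minimal sketch of what visible, accountable moderation might look like in code. Everything in it is hypothetical illustration, not any real forum's API (the names ModerationAction, remove_comment, and so on are mine): a removal replaces the comment body with a placeholder naming the acting moderator and the rule cited, and every action lands in an audit log.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModerationAction:
    """One moderation action, recorded for the audit log."""
    comment_id: str
    moderator_id: str   # accountability: no anonymous moderation
    rule_cited: str     # consistency: point at a specific written rule
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Appeals can be reviewed against this log by someone other than
# the moderator who acted.
audit_log: list[ModerationAction] = []

def remove_comment(comment: dict, action: ModerationAction) -> None:
    """Visibly remove a comment: replace its body with a placeholder
    rather than deleting it outright."""
    comment["body"] = (
        f"[Removed by moderator {action.moderator_id}; "
        f"rule cited: {action.rule_cited}]"
    )
    audit_log.append(action)

# Example usage (hypothetical data):
comment = {"id": "c42", "body": "some offending text"}
remove_comment(
    comment,
    ModerationAction("c42", "mod_alice", "rule 3: no personal attacks"),
)
```

The point is the shape, not the code: removals stay visible, every action is attributable to an individual account, and the log gives an appeals reviewer something other than the banning moderator's word to go on.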

It also helps if moderation activity can be openly discussed in a partitioned area of the site. There will be a desire to discuss moderation policy, so plan ahead and provide a space where people can do so without derailing other threads. That way, you can redirect meta-discussion into the moderation area without making the problem worse.

(Also posted at my web site)