Good_Burning_Plastic comments on Turning the Technical Crank - Less Wrong Discussion
Negativity in the discussion was mentioned. Not sure how important this is compared with other reasons.
Also, some people post both LW-type content and non-LW-type content. The latter does not belong on LW, so they create a separate blog. When the blog attracts its own community of readers, they may prefer to also post the LW-type content there, especially when the boundaries are not clear. (Some of them do repost the LW-type content here afterwards.)
In my opinion, the essence of the problem is that people instinctively play status games all the time, even when they say that they would prefer to do something else instead. It is hard to abandon the game, when even "saying that you would prefer to stop playing the game" can be used as a successful move within the game. Actually, denying that you are playing the game is almost a requirement in most situations; and accusing other people of playing the game is an attack move within the game. The game goes on automatically; whatever you do, you gain or lose a few points, and other people see it. You may say "I am not playing the game", but other people see you winning points, and they also want a few points for themselves.
And then, we have the instinct that status is connected with various things, especially with the ability to hurt other people and to defend yourself successfully from being hurt. Oh, we are civilized people, so in most situations we avoid the worst forms of violence, but in every situation there is a permissible range: maybe only verbal attacks, maybe only passive-aggressive behavior, but some of us are very good at using what we can. Seeing that someone has gained too many points, without the ability to defend themselves and attack their enemies, provokes an attack. Not necessarily from someone who wants to replace the target, but simply from someone who feels that the difference in points between them and the target has become disproportionately large compared with their own estimate of how it should be.
How it looks from outside (among civilized people who wouldn't admit playing the game) is illustrated here. Essentially, whenever you do something that is "too good" (something that brings you much more points than you "should have" according to your perceived ability to attack and defend yourself), many people will feel the urge to criticize you and your work, to alleviate the difference. From inside, I guess they will either convince themselves that the work is actually not good, or imagine some dangerous things you are totally going to do with your newly gained points (and see themselves as heroes who prevented this danger), or simply deny that they are attacking you.
This can be very exhausting to a person who wants to focus on creating good content, but doesn't want to spend their time defending themselves from attacks. The usual reaction is that the person stops producing the good content, and the status balance is maintained. Which is quite bad for us, who want to consume the good content.
Another option is to retreat to a fortress, where the defense is much easier. Such as Facebook, where you can block the attackers in a few seconds, and they usually won't create another account only to bother you (and even if they do, you can still set your messages visible to only your friends). If you are willing to solve the related technical problems, you can use your own blog.
So, the question is: can we do anything to prevent good authors from having to retreat to their own fortresses (or from no longer writing / publishing) after they gain "too many" points for doing what we want them to do? What kind of platform would achieve that?
There is a standard solution, and most people call it "censorship". You create a place where the authors can publish, and where all attacks are removed. Preferably by a third-party moderator, so the authors don't even see them, and don't have to waste their own time deleting them.
I can imagine how most people would react to this proposal. No, we can't remove all negative feedback; we need a way to tell genuinely bad authors that their work honestly sucks! Otherwise the stupidity will prevail! Sure... but the whole problem is that we are running on corrupted hardware, so when the situation comes and our status-regulation emotion kicks in, we will start believing that the author is genuinely bad, the work genuinely sucks, and there is a very real and very urgent danger of genuinely horrible things happening unless the author is provided negative feedback as strongly as possible. :(
("Oh no, Eliezer has an opinion on quantum physics that only a few experts agree with, but other experts disagree! And he believes that Bayes' Theorem is super important, and the Bayes' Theorem really is important, but isn't as important as he believes! And he once deleted Roko's Basilisk and provided a totally unsatisfying PR explanation! And he asks people to send him money! And he has multiple girlfriends! This is totally a cult, worse than Scientology! They are going to spread wrong interpretations of quantum physics and then they will commit mass suicide! Someone think of the children! Don't read the Sequences! Don't read HPMoR! Tell everyone, and warn them about the danger! Write an article on RationalWiki, and Wikipedia, and your local news, and contact all skeptical organizations you know, and post on Facebook and Reddit! Someone stop this dangerous guy from having too much status!")
The proposal of "censorship" is value-neutral. There are authors who should be attacked; there are authors who shouldn't be; the proposed mechanism protects both equally. Making a mechanism that protects that and only that which should be protected is an FAI-complete problem. At some moment a human judgement has to be applied. At that moment, you should expect the known psychological forces to manifest.
Another option is to remove debates completely; then you avoid the accusations of censorship, but you also lose the potentially good comments. Sure, the people will comment on a different website, but that's okay -- such comments aren't linked to the criticized article as strongly as the comments directly below the article would be. (And you cannot prevent comments on a third-party website anyway.) Publishing a book is one way to do this; no one can write their comment into all copies of your book.
Yet another option is to make attacking costly: for example, you would be allowed to publish a critique of an article, but the critique itself would have to be a well-written article (preferably explaining and supporting the critic's own position, not merely saying "X is wrong", so that the critic is now equally exposed to attack) and would have to be accepted by editors. Of course the editors are going to be accused of partiality; that's inevitable. (Replace the editors with a popular vote, and then we need someone to decide who is an eligible voter, and we still have the status-regulation emotion urging people to upvote a critique that doesn't fulfill the criteria but feels well-deserved anyway.)
"Only a few" experts are as committed to it as Eliezer is, but many more consider it at least somewhat plausible.
I think the word you're looking for is "moderation".
It's one of those flexible words: I keep the discussion polite; you moderate; he censors.
They are usually called "irregular verbs" :-)