Multiheaded comments on New censorship: against hypothetical violence against identifiable people - Less Wrong Discussion

Post author: Eliezer_Yudkowsky · 23 December 2012 09:00PM · 22 points

Comments (457)

You are viewing a single comment's thread.

Comment author: [deleted] 24 December 2012 12:27:38AM · 40 points

I'm starting to feel strongly uncomfortable about this, but I'm unsure whether that's reasonable. Here are some arguments in this thread that concern me:

Does advocating gun control, or increased taxes, count? They would count as violence if private actors did them, and talking about them makes them more likely (by states).

Violence is a very slippery concept. Perhaps it is not the best one to base mod rules on. (more at end)

We're losing Graham cred by being unwilling to discuss things that make us look bad.

This one is really disturbing to me. I don't like all the self-conscious talk about how we are perceived outside. I want to be able to discuss what's true and good without worrying about getting moderated, even if we need to fork LW to accomplish that. My post-rationality opinions have already diverged so far from the mainstream that I feel I can't talk about my interests in polite society. I don't want that here too.

If I see any mod action that could be destroyed by the truth, I will have to conclude that LW management is borked and needs to be forked. Until then I will put my trust in the authorities here.

Would my pro-piracy arguments be covered by this? What about my pro-coup d'etat ones?

Would it censor a discussion of, say, compelling an AI researcher by all means necessary to withhold their research from, say, the military?

The whole purpose of discussing such plans is to reduce uncertainty over their utility; you haven't proven that the utility gain of a plan turning out to be good must be less than the cost of discussing it in public.

Yeah, seriously. What if violence is the right thing to do? (EDIT: Derp. Don't discuss it in public, except for stuff like Konkvistador's piracy and reaction advocacy, which are supposed to be public.)

My post was indeed inappropriate. I have used the "Delete" function on it.

This is important. If the poster in question agrees when it is pointed out that their post is stupid, go ahead and delete it. But if they disagree in some way that isn't simple defiance, please take a long look at why.

In general, two conclusions:

I support censorship, but only if it is based on the unaccountable personal opinion of a human. Anything else is too prone to lost purposes. If a serious rationalist (e.g. EY) seriously thinks about it and decides that some post has negative utility, I support its deletion. If some unintelligent rule like "no hypothetical violence" decides that a post is no good, why should I agree? Simple rules do not capture all the subtlety of our values; they cannot be treated as Friendly.

And, as usual, that which can be destroyed by the truth should be. If moderator actions start serving some force other than truth and good, LW, or at least the subset dedicated to truth and rationality, should be forked.

Comment author: Multiheaded 24 December 2012 07:15:17AM · 5 points

I support censorship, but only if it is based on the unaccountable personal opinion of a human.

I think the usual paradox of benevolent dictatorship applies here: you can only trust humans who clearly don't seek the position for selfish ends, and who aren't likely to be presenting a rational, benevolent front just so you'll give them political power.

In a liberal/democratic political atmosphere, self-proclaimed benevolent dictators are a rare and prized resource; you can pressure one to run a website, an organization, etc., to the best of their ability. But if dictatorship were seen as the norm, and you couldn't easily fall back on democracy, rule by committee, anarchy, etc., and had to choose between a few dictators, then the standards of dictatorial conduct would surely plummet, and it would be psychologically much more difficult to change the form of organization. So, IMO, isolated experiments with dictatorship are fine; an overall preference for it is terribly dangerous.

(All of the above goes only for humans, of course; I have no qualms about FAI rule.)

P.S.: I googled for "benevolent dictator" + "paradox" and found an argument similar to mine.

Being governed by people instead of a system isn’t just dangerous, it suffers from a limited attention span, too. The Chinese oligarchy is, indeed, very effective. Beijing was cleaner for the Olympics and those pesky plastic bags are gone, but there is only so much bandwidth for the authorities to enforce regulation and address new concerns. Pollution is a serious problem in China that no one denies, but little is done so far. The people and the government are both troubled, but frankly, they have bigger fish to stir fry. Three hundred million people may be living middle class western lives, but that leaves another billion in a falling apart shack.

The Chinese have every reason to be proud of their beautiful country and amazing progress. There is much to enjoy and appreciate and, even if it pained me to admit it, their system works far better than I would like to give it credit. My worry for them is if it’s sustainable. Can those billion people rely on replacing great technocrats with new ones who also make the right decisions? Is it even possible for a system which depends on the vagaries of people to even effectively address all the concerns and needs of the people they govern and the society they guide?

Comment author: [deleted] 24 December 2012 07:22:35AM · 2 points

But if dictatorship were to be seen as the norm, and you couldn't easily fall back on democracy, rule by committee, anarchy, etc, and had to choose between a few dictators, then the standards of dictatorial control would surely plummet and it would be psychologically much more difficult to change the form of organization.

Interesting. Do you think there are dictator-selection procedures that don't have either set of failure modes (selecting for looks/promises to loot the commons/lack of leadership, selecting for power-hungry tyrants)?

Comment author: Multiheaded 24 December 2012 07:33:14AM · 2 points

Do you think there are dictator-selection procedures that don't have either set of failure modes (selecting for looks/promises to loot the commons/lack of leadership, selecting for power-hungry tyrants)?

Only a single one: a genuinely benevolent dictator, with good insight into people and lots of rationality, personally selects his successor from among several candidates, after lengthy consideration and hidden testing. But, of course, remove any one of those qualifiers and it can blow up regardless of the first dictator's best intentions. See e.g. Marcus Aurelius and Commodus. So, on a meta level, no, there's likely no system that would work for humans.

(I think that "real" democracy is also too dangerous - see the 19th and early 20th century - so either some form of sophisticated rule by committee or a state of anarchy could be the safest option for baseline humanity.)

Comment author: [deleted] 24 December 2012 07:41:13AM · 1 point

What about technocracy à la China?

And FAI, obviously.

so either some form of sophisticated rule by committee or a state of anarchy could be the safest option for baseline humanity.

Really? Safe in the sense of "too incompetent to execute a mass murder"? Also, anarchy is a military vacuum.