I think you entirely missed the point.
As best I understood it, the point was that one's belief in one's own goodness is a source of drive - and if that goodness is false, the drive is misaimed, and the greater drive makes for greater ill consequences.
I think we agree that belief in one's own goodness has the capability to go quite wrong - in such cases as the quote describes, more wrong than an all-other-things-being-equal belief in one's own evil. Where we seem to disagree is on the inevitability of this failure mode - I acknowledge that the failure mode exists and that we should be cautious about it (although that may not have come across), whereas you seem to be implying that the failure mode is so prevalent that it would be better not to try to be a good overlord at all.
Am I understanding your position correctly?
The application to Coherent Extrapolated Volition is left as an exercise.
An important part of the quote, it seems, is "may be" the most oppressive. Only if the goodness of these "omnipotent moral busybodies" is actually so different from our own that we suffer under it is there an issue; a goodness well-executed would perhaps never even be called a tyranny at all.
I'm a recent import to the Cambridge area - what should I know/do/ask before coming to one of these? Should I just show up at the doorstep? Or are there protocols, permissions, and snacks to consider?
'What is an RG machine? Seems a bit abruptly introduced.' (I'll Google it, of course.)
A Rube Goldberg machine is a device which accomplishes a (relatively) simple task in a ludicrously complicated and suboptimal way, such as a device which squeezes one's toothpaste through an intricate system of pulleys, weights, and trained hamsters. While it gets the job done (eventually), it's incredibly inefficient, and most of the effects it produces are intermediate products.
Suppose now that there were two such magic [invisibility] rings [of Gyges], and the just put on one of them and the unjust the other; no man can be imagined to be of such an iron nature that he would stand fast in justice. No man would keep his hands off what was not his own when he could safely take what he liked out of the market, or go into houses and lie with any one at his pleasure, or kill or release from prison whom he would, and in all respects be like a god among men.
Then the actions of the just would be as the actions of the unjust; they would both come at last to the same point. And this we may truly affirm to be a great proof that a man is just, not willingly or because he thinks that justice is any good to him individually, but of necessity, for wherever any one thinks that he can safely be unjust, there he is unjust.
For all men believe in their hearts that injustice is far more profitable to the individual than justice, and he who argues as I have been supposing, will say that they are right. If you could imagine any one obtaining this power of becoming invisible, and never doing any wrong or touching what was another's, he would be thought by the lookers-on to be a most wretched idiot, although they would praise him to one another's faces, and keep up appearances with one another from a fear that they too might suffer injustice.
Glaucon, in Plato's Republic
Why is this a rationality quote?
We so often confuse “what can be translated into print well” with “what is important and interesting.”
We also confuse "what is important" with "what is interesting" fairly often.
I disagree with the object level of this quote. Censorship can achieve multiple goals, and a lack of censorship does not necessarily imply "a regime of robust discussion."
Examples of the first would be using the censorship itself as the action (e.g. a despot censoring religious minorities doesn't just limit discussion, it's an active method of subjugation), or protecting people from messages with annoying content or form (e.g. regulations on advertising).
The second is nearly a human universal, but is especially clear in propaganda situations - if we're at war, and someone is spreading slanderous enemy propaganda, and I destroy their materials and arrest them, this is censorship. But it also increases the robustness of discussion, because they were trying to inject falsehoods into the discussion. Or for another example - sometimes you have to ban trolls from your message board.
I also dislike the implications of this quote for any discussion where it shows up. Sometimes ad hominem arguments are right, but they're almost never productive, especially when cast in such general terms.
I wouldn't say that it's an ad hominem quote. I disagree with the premise - that censorship is a "default position regarding so many things" within progressivism - but I think that the link between censorship as a default position and a fear of the survivability-under-discussion of one's own ideas is a rationally visible one. Unlike a typical example of an ad hominem attack, the undesirable behavior (fiat elimination of competing ideas as a default response) is related to the conclusion (that the individual is afraid of the effects of competing ideas). It's oversimplified, but one can say only so much in a short quip.
Please let me know if what I just wrote makes sense to you. If it does, perhaps this comment might be good as a start for making a second attempt at communication - I think I articulated what I was trying to say better here than before.
It does - thank you for clarifying your point.
Having processed this a little more, I want to address a couple of your implicit questions:
Q: Would you prefer to have faith in a guru and a community of likeminded people, or is it better to have faith in leprechauns?
A: I would prefer neither. My belief is that it is optimal to have faith in what you can determine to be true at the most fundamental level you are capable of, and to remain open to updating your opinion as you search for truth at more and more fundamental levels.
Q: If you must choose between leprechauns and gurus/communities, isn't it much more sane to choose gurus/communities?
A: This question is a red herring, because it's not the real choice anyone reading this would be making.
You have chosen an example of faith that is obviously absurd and blind to attribute to me, so that you could make an argument to defeat me and win all in the same comment.
Actually, the point of my response was to illustrate that to say "all of these things are faith" is an incorrectly simplifying assumption. I did deliberately choose an absurd example of faith, not to attribute it to you, but to show the difference between one thing which you did explicitly claim is faith - trust in people - and another thing which would have to be an example of blind faith - belief in leprechauns. If you acknowledge that there is a real difference between the two, it would seem that I have misinterpreted your thesis.
But, from the inside, how do you tell the difference between doing actual good for others and being an omnipotent moral busybody?
Willingness to be critiqued? Self-examination and scrupulous quantities of doubt? This seems kind of like the wrong question, actually. "Actual good" is a fuzzy concept, if it even exists at all; a benevolent tyrant cares whether or not they are fulfilling their values (which, presumably, includes "provide others with things I think are good"). The question I would ask is how you tell the difference between actually achieving the manifestation of your values and only making a big show of it; presumably it's the latter that causes the problem (or at least the problem that you care about).
Then again, this comes from a moral non-realist who doesn't see a contradiction in having a moral clause saying it's good to enforce your morality on others to some extent, so your framework's results may vary.