I don't have a good handle on what you're saying here -- is it the old epistemic vs. instrumental rationality thing?
I've been accused of using "Dark Arts" when I'm not necessarily trying to deceive my readers. I believe I was making the argument that "You will have better luck getting your aims achieved if you talk about them with great confidence, as if you know you're right." I think this is absolutely true. On the other hand, yeah, I'm endorsing lying. (In this case, a subtle lie, a lie about philosophy, not a lie like "Officer, there's nothing in the trunk of my car.")
I've been grappling with this matter for years. Some points:
I think that whether talking about one's beliefs with great confidence, as if one knows one is right, is conducive to achieving one's aims depends on the situation. Sometimes presenting a one-sided or overstated view can damage one's credibility and make others less receptive to what one has to say.
I think that presenting information asymmetrically can promote epistemic rationality. It sometimes happens that people have previously been exposed to information asymmetrically, so that presenting information asymmetrically in the opposite direction is a faster way of conveying an accurate picture to those people than presenting a balanced view would be.
I think that people who have high credibility/influence/authority should hesitate to state things they are uncertain about with great confidence, as there's a danger of such people being taken more seriously than others who have better information than they do.
I would add to multifoliaterose's points that lying for the greater good works best when you are very confident that you won't be found out. It sounds like someone noticed your exaggeration of confidence and called you on it, and that undermined what you were trying to achieve. This is usually the risk of lying.
On a side note, I wonder about the situation where one is so confident in one's goal as to be willing to bend the truth to accomplish it, but not so confident that one can convince anyone else to help without bending the truth.
I wonder about this too.
A possible source of examples of net harm done by the practice of distorting the truth ostensibly for the greater good is provided by the fact that many charities distort the truth to fundraise. See for example the GiveWell blog postings:
• When is a charity's logo a donor illusion?
• Robin Hood, Smile Train and the “0% overhead” donor illusion
Presumably the people responsible for these illusions tell themselves that using them is justified for the greater good. I suspect that the use of these illusions does more harm than good (according to some sort of utilitarian metric) on account of resulting in misdirected funds and damaging the philanthropic sector's credibility. As Elie Hassenfeld says in "Why are we always criticizing charities?":
The problem is: because the nonprofit sector is saturated with unsubstantiated claims of impact and cost-effectiveness, it’s easy to ignore me when I tell you (for example), “Give $1,000 to the Stop Tuberculosis Partnership, and you’ll likely save someone’s life (perhaps 2 or 3 lives).” It’s easy to respond, “You’re just a cheerleader” or “Why give there when Charity X makes an [illusory] promise of even better impact?”
On the other hand, maybe my belief here is influenced by generalizing from one example and selection effects which have given me a misleading impression of what most donors are like - not sure.
I also object to the idea that any time you approach a question like "will arguing X right now advance my goals?" by rationally evaluating all terms in the expected utility equation, you're being evil, a la "dark arts".
I admit I wasn't very clear -- let me clarify: I see people making decisions to act based solely on their degree of belief in a particular statement (see the four examples in the original post). To figure out whether a particular action is in your interests, it's never sufficient just to evaluate probabilities: you can see in the expected utility equation that you simply can't get away from evaluating your utility function.
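To spell out what I mean by "the expected utility equation" (this is just the standard discrete textbook setup, with generic symbols for actions, outcomes, probabilities, and utilities, not anything specific to the original post):

$$\mathrm{EU}(a) \;=\; \sum_{o} P(o \mid a)\,U(o), \qquad a^{*} \;=\; \arg\max_{a} \mathrm{EU}(a)$$

Holding every $P(o \mid a)$ fixed still leaves $a^{*}$ undetermined until you write down $U$; that is the sense in which degrees of belief alone can never settle what to do.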
I think the evangelism ones are good examples, because they're easy to flip. "I don't believe there's a god -> I should tell everyone this whether they want to hear it or not" is the same logic as "I believe there's a god -> I should tell everyone this whether they want to hear it or not." (Note that this is not the same thing as saying that the logic behind the belief is the same, just that the jump from the belief to the action is.) I don't know anyone who isn't annoyed by at least one of those actions.
Thanks for making this post. I agree with your remarks, especially
I've noticed various friends and some people on this site making just this mistake. It's as if their love for truth and rational enquiry, which is a great thing in its own right, spills over into a conviction to act in a particular way, which itself is of questionable optimality.
In recent months there have been several posts on LessWrong about the "dark arts", which have mostly concerned using asymmetric knowledge to manipulate people. I like these posts, and I respect the moral stance implied by their name, but I fear that "dark arts" is becoming applicable to the much broader case of not acting according to the simple rule that decisions are always good when they sound like true beliefs.
I've made related posts:
Existential Risk and Public Relations
Reason is not the only means of overcoming bias
Reflections on a Personal Public Relations Failure: A Lesson In Communication.
Yes, I think that's true, but my point was even more specific: there is a very particular class of decisions that sound a lot like beliefs, but they should be evaluated as decisions, not beliefs.
I find it fairly simple. I would rather know the truth than live in delusion, personally. I don't see where the implication is to destroy someone else's beliefs, only to modify your own accordingly in light of the truth.
It's hubris to believe in a single ascertained truth (yours) and thus invalidate and dictate another's truth.
If you are an atheist and your friend is not, the truth is that you disagree. Their truth is that a God exists. This should not destroy your friendship, but if it is so fragile that it does, it's better that the irreconcilable relationship ends.
Another category: there are people who hate your guts. No matter who you are, you're the wrong color, the wrong weight, the wrong gender, the wrong nationality, and that's not even getting to anyone who hates you personally rather than as part of a category.
What's on their minds is part of the truth, but knowing the details isn't going to be good for most people's equanimity.
I've been throwing some ideas around in my head, and I want to throw some of them half-formed into the open for discussion here.
I want to draw attention to a particular class of decisions that sound much like beliefs.
Belief: There is no personal god that answers prayers.
Decision: I should badger my friend about atheism.

Belief: Cryonics is a rational course of action.
Decision: To convince others about cryonics, I should start by explaining that if we exist in the future at all, then we can expect it to be nicer than the present on account of benevolent super-intelligences.

Belief: There is an objective reality.
Decision: Postmodernists should be ridiculed and ignored.

Belief: 1+1=2
Decision: If I encounter a person about to jump unless he is told "1+1=3", I should not acquiesce.
I've thrown ideas from a few different bags into the table, and I've perhaps chosen unnecessarily inflammatory examples. There are many arguments to be had about these examples, but the point I want to make is the way in which questions about the best course of action can sound very much like questions about truth. Now this is dangerous, because the way in which we choose amongst decisions is radically different from the way in which we choose amongst beliefs. For a start, evaluating decisions always involves evaluating a utility function, whereas evaluating beliefs never does (unless the utility function is explicitly part of the question). By appropriate changes to one's utility function, the optimal decision in any given situation can be modified arbitrarily whilst simultaneously leaving all probability assignments to all statements fixed. This should make you immediately suspicious if you ever make a decision without consulting your utility function. There is no simple mapping from beliefs to decisions.
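As a minimal sketch of that last point (the action names, outcomes, and numbers below are all invented purely for illustration): with the probability assignments held completely fixed, swapping one utility function for another is enough to flip which action maximizes expected utility.

```python
# Hypothetical illustration: identical probabilities, two different utility
# functions, opposite optimal decisions.

# P(outcome | action) -- shared by both agents below
P = {
    "argue_bluntly": {"persuaded": 0.2, "alienated": 0.7, "status_quo": 0.1},
    "stay_quiet":    {"persuaded": 0.0, "alienated": 0.0, "status_quo": 1.0},
}

# Two utility functions over the same outcomes
U_truth_lover = {"persuaded": 10, "alienated": -2,  "status_quo": 0}
U_peacekeeper = {"persuaded": 3,  "alienated": -20, "status_quo": 0}

def expected_utility(action, U):
    """Sum over outcomes of P(outcome | action) * U(outcome)."""
    return sum(p * U[o] for o, p in P[action].items())

def best_action(U):
    """The action with the highest expected utility under U."""
    return max(P, key=lambda a: expected_utility(a, U))

print(best_action(U_truth_lover))  # argue_bluntly (EU 0.6 vs 0.0)
print(best_action(U_peacekeeper))  # stay_quiet    (EU 0.0 vs -13.4)
```

Same probability assignments in both cases; only the utility function changed, and the "right" decision reversed.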
I've noticed various friends and some people on this site making just this mistake. It's as if their love for truth and rational enquiry, which is a great thing in its own right, spills over into a conviction to act in a particular way, which itself is of questionable optimality.
In recent months there have been several posts on LessWrong about the "dark arts", which have mostly concerned using asymmetric knowledge to manipulate people. I like these posts, and I respect the moral stance implied by their name, but I fear that "dark arts" is becoming applicable to the much broader case of not acting according to the simple rule that decisions are always good when they sound like true beliefs. I shouldn't need to argue explicitly that there are cases when lying or manipulating constitute good decisions; that would privilege a very particular hypothesis (namely, that decisions are always good when they sound like true beliefs).
This brings me all the way back to the much-loved quotation, "that which can be destroyed by the truth should be". Now there are several ways to interpret the quote, but at least one interpretation implies the existence of a simple isomorphism from true beliefs to good decisions. Personally, I can think of lots of things that could be destroyed by the truth but should not be.