The point has already been made that if you wish to truly be honest, it is not enough to speak the truth.
I generally don't tell people I'm an atheist (I describe my beliefs without using any common labels). Why? Because I know that if I say the words "I am an atheist," they will hear the following concepts:
- I positively believe there is no God
- I cannot be persuaded by evidence any more than most believers can be, i.e., I have a kind of faith in my atheism
- I wish to distance myself from members of religious tribes
As I said, the point has already been made: if I know that they will hear those false ideas when I say a certain phrase, how can I call myself honest in speaking it, knowing that I will cause them to hold false beliefs? Hence the saying: if you wish to protect yourself, speak the truth; if you wish to be honest, speak so that truth will be heard.
Many a politician convincingly lies with truths by saying things that they know will be interpreted in a certain positive (and false) way, but which they can always defend as having been intended to convey some other meaning.
---
The New
There is a counterpart to this insight, which has come to me as I've begun to pay more attention to the flow of implicit social communication. If speaking the truth in a way you know will deceive is a lie, then perhaps telling a lie in a way you know will communicate a true concept is not a lie.
I've relaxed my standards of truth-telling as I've come to understand this. "You're the best" and "You can do this" statements have been opened to me, no qualifiers needed. If I know that everyone in a group has to say "I have XYZ qualification," but I also know that no one actually believes anybody when they say it, I can comfortably recite those words, knowing that I'm not actually leading anybody to believe false things, and thus, am not being dishonest.
Politicians use this method, too, and I think I'm more or less okay with it. You see, we have a certain problem that arises from intellectual inequality. There are certain truths which literally cannot be spoken to some people. If someone has an IQ of 85, you literally cannot tell them the truth about a great number of things (or they cannot receive it). And there are a great many more people who have the raw potential to understand certain difficult truths, but whom you cannot reasonably tell these truths (they'd have to want to learn, put in effort, receive extensive teaching, etc).
What if some of these truths are pertinent to policy? What do you do, say a bunch of phrases that are "true" in the sense you intend them, but which will only be heard as...
As what? What do people hear when you explain concepts they cannot understand? If I had to guess, very often they interpret this as an attack on their social standing, as an attempt by the speaker to establish themselves as a figure of superior ability, to whom they should defer. You sound uppity, cold, out-of-touch, maybe nerdy or socially inept.
So, then...if you're socially capable, you don't say those things. You give up. You can't speak the truth, you literally cannot make a great many people hear the real reasons why policy Z is a good idea; they have limited the vocabulary of the dialogue by their ability and willingness to engage.
Your remaining moves are to limit yourself to their vocabulary, or to say something outside of it, all the nuance of which will evaporate en route to their ears and be heard as a monochromatic "I think I'm better than you."
The details of this dynamic at play go on and on, but for now, I'll just say that this is the kind of thing Scott Adams is referring to when he says that what Trump has said is "emotionally true" even if it "doesn't pass the fact checks" (see dialogue with Sam Harris).
In a world of inequality, you pick your poison. Communicate what truths can be received by your audience, or...be a nerd, and stay out of elections.
[Quick comprehension check]: I think that you are saying that it is important to acknowledge when our notions of truths and lies break down, because saying a thing that is apparently "true" can have connotations we didn't intend, thus making it "false". And you're flipping it around to say that the opposite is also valid: that you can say a thing which is apparently "false", yet the way it's interpreted could make it more "true".
I think you are saying that there are other factors when communicating, which is the context you convey with your words, i.e. meaning that is imparted which is distinct from the actual referents of the words in your utterances. And that this meaning is also important to keep in mind because we can't "just" communicate with only the words themselves, apart from connotation / context. It's just part of the package.
I think that you then took this to show that there are often times where knowledge about true things isn't easily transmissible due to a lack of prerequisite knowledge. And that this has problems when that information might be important.
[Actual response, if the above was accurate]: I think the part of this essay about how trying to get points across is often difficult is important. There are certain tradeoffs to be wary of, like when someone asks you a question and you give an abridged / simplified answer, optimizing for communication rather than accuracy. (EX: Giving someone a stripped-down description of your medical condition when they ask you why you're taking a pill.)
Thus, one of the questions we might want to take out of this is "How can we convey information many inferential steps away from the other party, especially when it's beneficial to them?" which seems like it could be resolved several ways:
1) Take the time to build up their prerequisites.
2) Convince them you're competent / trustworthy such that they can defer to your judgment.
3) Tell them false things such that they do the thing the information would have convinced them to do.
(I don't really like these options. Feel free to take this as an open invitation to spend 3 minutes thinking of other things.)
Anyway, it's less clear to me that you can tell people false stuff to make them believe true stuff. It feels more like you can tell people false stuff to accomplish either 2 or 3, but not 1.
Suppose that a vast group of statements that sound (they really, REALLY sound) like propositions about economic cause and effect are ALL interpreted by a great many people always and only as either "Yay blues" or "Boo blues."
In that case, your ability to tell the truth is limited by their way of filtering your statements, and your ability to tell a lie is equally hampered. All you can do is decide whether to say Yay or Boo, or to say nothing at all (which will also often be interpreted one way or the other if you're involved in politics).