When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:
SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES
Honesty toward others, it seems to me, obviously bears some relation to rationality. In practice, the people I know who seem to make unusual efforts at rationality are unusually honest - or, failing that, at least have unusually bad social skills.
And yet it must be admitted and fully acknowledged that such morals are encoded nowhere in probability theory. There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates. I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:
SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES
I do think I've conducted my life in such fashion that I can wear the original button without shame. But I do not always say aloud all my thoughts. And in fact there are times when my tongue emits a lie. What I write is true to the best of my knowledge, because I can look it over and check before publishing. What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion. Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...
From the inside, it feels a lot like the experience of unconsciously-chosen, perceptual-speed, internal rationalization. I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.
There was a time - if I recall correctly - when I didn't notice these little twists. And in fact it still feels embarrassing to confess them, because I worry that people will think: "Oh, no! Eliezer lies without even thinking! He's a pathological liar!" For they have not yet noticed the phenomenon, and actually believe their own little improvements on reality - their own brains being twisted around the same way, remembering reality the way it should be (for the sake of the conversational convenience at hand). I am pretty damned sure that I lie no more pathologically than average; my pathology - my departure from evolutionarily adapted brain functioning - is that I've noticed the lies.
The fact that I'm going ahead and telling you about this mortifying realization - that despite my own values, I literally cannot make my tongue speak only truth - is one reason why I am not embarrassed to wear yon button. I do think I meet the spirit well enough.
It's the same "liar circuitry" that you're fighting, or indulging, in the internal or external case - that would be my second guess for why rational people tend to be honest people. (My first guess would be the obvious: respect for the truth.) Sometimes the Eli who speaks aloud in real-time conversation strikes me as almost a different person than the Eliezer Yudkowsky who types and edits. The latter, I think, is the better rationalist, just as he is more honest. (And if you asked me out loud, my tongue would say the same thing. I'm not that internally divided. I think.)
But this notion - that external lies and internal lies are correlated by their underlying brainware - is not the only view that could be put forth, of the interaction between rationality and honesty.
An alternative view - which I do not myself endorse, but which has been put forth forcefully to me - is that the nerd way is not the true way; and that a born nerd, who seeks to become even more rational, should allow themselves to lie, and give themselves safe occasions to practice lying, so that they are not tempted to twist around the truth internally - the theory being that if you give yourself permission to lie outright, you will no longer feel the need to distort internal belief. In this view the choice is between lying consciously and lying unconsciously, and a rationalist should choose the former.
I wondered at this suggestion, and then I suddenly had a strange idea. And I asked the one, "Have you been hurt in the past by telling the truth?" "Yes", he said, or "Of course", or something like that -
(- and my brain just flashed up a small sign noting how convenient it would be if he'd said "Of course" - how much more smoothly that sentence would flow - but in fact I don't remember exactly what he said; and if I'd been speaking out loud, I might have just said, "'Of course', he said" which flows well. This is the sort of thing I'm talking about, and if you don't think it's dangerous, you don't understand at all how hard it is to find truth on real problems, where a single tiny shading can derail a human train of thought entirely -)
- and at this I suddenly realized that what worked for me might not work for everyone. I haven't suffered all that much from my project of speaking truth - though of course I don't know exactly how my life would have been otherwise, except that it would be utterly different. But I'm good with words. I'm a frickin' writer. If I need to soften a blow, I can do with careful phrasing what would otherwise take a lie. Not everyone scores an 800 on their verbal SAT, and I can see how that would make it a lot harder to speak truth. So when it comes to white lies, in particular, I claim no right to judge - and also it is not my primary goal to make the people around me happier.
Another counterargument that I can see to the path I've chosen - let me quote Roger Zelazny:
"If you had a choice between the ability to detect falsehood and the ability to discover truth, which one would you take? There was a time when I thought they were different ways of saying the same thing, but I no longer believe that. Most of my relatives, for example, are almost as good at seeing through subterfuge as they are at perpetrating it. I'm not at all sure, though, that they care much about truth. On the other hand, I'd always felt there was something noble, special, and honorable about seeking truth... Had this made me a sucker for truth's opposite?"
If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood. If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.
What, in your view, and in your experience, is the nature of the interaction between honesty and rationality? Between external truthtelling and internal truthseeking?
I'm starting to think that this is exactly correct.
As we all know, natural language sentences (encoded as pressure waves in the air, or light emitted from a monitor) aren't imbued with an inherent essence of trueness or falseness. Rather, we say a sentence is true when reporting it to a credulous human listener would improve the accuracy of that human's model of reality. For many sentences, this is pretty straightforward ("The sky is blue" is true if and only if the sky is blue, &c.), but in other cases it's more ambiguous, not because the sentence has an inherently fuzzy truth value, but because upon interpreting the sentence, the correspondence between the human's beliefs and reality could improve in some aspects but not others; e.g., we don't want to say "The Earth is a sphere" is false, even though it's really more like an oblate spheroid and has mountains and valleys. This insight is embedded in the name of the site itself: "Less Wrong," suggesting that wrongness is a quantitative rather than binary property.
But if sentences don't have little XML tags attached to them, then why bother drawing a bright-line boundary around "lying", making a deontological distinction where lying is prohibited but it's okay to achieve similar effects on the world without technically uttering a sentence that a human observer would dub "false"? It seems like a form of running away from the actual decision problem of figuring out what to say. When I'm with close friends from my native subculture, I can say what I'm actually thinking using the words that come naturally to me, but when I'm interacting with arbitrary people in society, that doesn't work as a matter of cause and effect, because I'm often relying on a lot of concepts and vocabulary that my interlocutor hasn't learned (with high probability). If I actually want to communicate, I'm going to need a better decision criterion than my brain's horrifyingly naive conception of honesty, and that's going to take consequentialist thinking (guessing what words will produce what effect in the listener's mind) rather than moralistic thinking (Honesty is Good, but Lying is Bad, so I'm not Allowed to say anything that could be construed as a Lie, because then I would be a Bad Person). The problem of "what speech acts I should perform in this situation" and the problem of having beliefs that correspond to reality are separate problems with different success criteria; it really shouldn't be surprising that one can do better on both of them by optimizing them separately.
Looking back on my life, moralistic reasoning - thinking in terms of what I or others "should" do, without having a consequentialist reduction of "should" - has caused me a lot of unnecessary suffering, and it didn't even help anyone. I'm proud that I had an internalized morality and that I cared about doing the Right Thing, but my conception of what the Right Thing was, was really really stupid and crazy, and people tried to explain to me what I was doing wrong, and I still didn't get it. I'm not going to make that (particular) mistake again (in that particular form).