I find that there is often a conflict between a motivation to speak only the truth and a motivation to successfully communicate as close approximations to the most relevant truths as constraints of time, intelligence and cultural conversational conventions allow.
a born nerd, who seeks to become even more rational, should allow themselves to lie, and give themselves safe occasions to practice lying, so that they are not tempted to twist around the truth internally
I'm starting to think that this is exactly correct.
As we all know, natural language sentences (encoded as pressure waves in the air, or light emitted from a monitor) aren't imbued with an inherent essence of trueness or falseness. Rather, we say a sentence is true when reporting it to a credulous human listener would improve the accuracy of that human's model of reality. For many sentences, this is pretty straightforward ("The sky is blue" is true if and only if the sky is blue, &c.), but in other cases it's more ambiguous, not because the sentence has an inherently fuzzy truth value, but because upon interpreting the sentence, the correspondence between the human's beliefs and reality could improve in some aspects but not others; e.g., we don't want to say "The Earth is a sphere" is false, even though it's really more like an oblate spheroid and has mountains and valleys. This insight is embedded in the name of the site itself: "Less Wrong," su...
I have a small theory: "enhancing reality" is a normal part of human social interaction. We are designed to lie for exactly this reason, and not doing it properly hurts your signaling skills and lowers your ability to achieve your social goals (including getting mates and so on).
So, based on the premise that rationality is about success, rationalists should have no qualms about lying when the situation is right. I'm quite good at lying, and I'm also pretty sure I'm extremely honest with myself.
It is a good question, but I fear I'm lacking data to answer. It is much harder to see my own self-deceptions than my lies, making it hard to see the sign of the correlation between them.
I have noticed that, in addition to being honest, rationalists (or those striving toward rationality) also seem to speak very precisely (although not necessarily accurately). This is a trait they seem to share with philosophers, lawyers, programmers, etc.
I suspect this is because these people recognize the confusion caused by the vagueness of language.
Agreed about the non-universality and the impact of personal history. I got an 800 SAT verbal and have been reasonably popular, hanging with a smart and very accepting crowd from college on. I have also never had to rely financially on any entity more conservative than Google. And I live a more unusual and transparent life than most people, being open about things that most people are private about.
It is tempting to think that my open approach is the best way for everyone, but the fact is that I have not suffered for my openness at all and lots of other pe...
When I noticed this sort of thing in myself (I don't remember exactly when, but probably in my teens), I started intentionally pausing and rehearsing what I was about to say before saying it, in most situations. This might or might not have made me less of a liar, but it sure made me say less, because after vetting what I'm about to say, it's often the case that I don't feel the need to actually say it. Most things I would say in person don't seem worth saying once I've reviewed them for a bit.
In general I'm honest with people I think are at least moderately sane and high in agreeableness and openness, in a cooperative context. Otherwise I tend to think of myself as an actor in an improvised play that's "based on a true story." I don't seem to have too much difficulty keeping track of this, but I admit the possibility I'm self-deceived.
There are competitions that honest people regularly do poorly at, because liars have an advantage. Mating and attracting venture capital are examples. I knew a kid who claimed to have had sex with 100 women by the time he was 20. "How?" I asked. "I told them all I loved them," he said.
(Though, in my experience, telling women you love them before you've had sex with them is more likely to get you labeled 'creepy' than get you laid. Maybe it's different for teenagers.)
Have you considered the possibility that this dishonest person was not honest about how many women he'd had sex with?
If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood. If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.
I've heard from many sources that con men are actually among the easiest to deceive. The rationale seems to be along the lines that con men have utter contempt for their marks, and therefore...
I never before thought of poor social skills as simply a disregard for the standard signals. I have a problem with that interpretation though because I often find that groups with poor social skills have their own sets of in-group signals and that they jockey for position just as much as the rest of us.
We're all operating using human brains. It simply isn't safe to assume we don't lie or deceive ourselves constantly. The failure to notice self-deceptions is, I expect, one of the most pervasive forms of akrasia, though obviously not one of the most self-evident ones.
I doubt focused honesty or even radical honesty will have a major effect on the frequency of self-deception. It seems too much like sympathetic magic. So I expect it to work like any other placebo ritual.
I expect combating self-deception will require working against the akrasia directly. But most anti-akrasia approaches assume at least a small window of consciousness of the akrasia, which in this case is often not possible.
I don't think Zelazny's statement makes out that "detecting falsehood and discovering truth are not the same skill in practice". He just seems to be saying that you can have good 'detecting falsehood' skills without caring much about the truth ("I’m not at all sure, though, that they care much about truth").
If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.
I think that's equating 'detecting falsehood...
I know I'm a really crap rationalist. Lots of the ways I fail at reason are in the category of "lying to myself", "making things easier by adopting a role", "maintaining a bubble of cosy normality". When I'm not lying to myself overtly, manipulating others can be a means to untruth-maintenance.
Honestly, even if lying were useful, I don't think I can trust myself with it.
Maybe a graduate Beisutsukai can lie, but a student ought to be scrupulous with truth.
I suggest that competitive bear races, metaphorically speaking, are so rare that even looking for them is positively harmful. In nearly all tests of reason, from frying with hot fat to investing your money to buying cryonics, the only people in the race are you and the bear, and you really do have to be faster.
Aside: I don't think it's really verbal ability, e.g. as measured by the SAT, which truthtelling requires. What is more important is having a good theory of mind and knowing what deserves emphasis. An aspect of emotional intelligence, no?
What, in your view, and in your experience, is the nature of the interaction between honesty and rationality? Between external truthtelling and internal truthseeking?
Could it be as simple as forming a habit?
Being honest as much as possible 'by default' turns into a habit that can be very helpful when it comes to being rational. I have the feeling that external truthtelling helps form that habit, which leads to fewer internal lies. But I can't prove it (maybe it's just me).
The chain goes something like: habit of external truth -> helps internal trut...
After I first read this article about a year ago, I set out to be more honest in all my conversations. (At this point in time it has become a part of my persona and I no longer do it consciously.) There are a few things I've noticed since I made the switch:
It is easier for me to think clearly during social events. I suspect this is because I no longer have to generate lies and keep track of all of them.
I have become more outgoing, although undoubtedly more socially awkward. Occasionally, a person will be shocked at how carelessly I reveal something con
I think there is a link between external truth-telling and external truth-seeking.
Say I make a decision for the reason that it is my best guess at the right thing to do in a given situation, based on the facts I have. Suppose further it is not viewed as the right thing to do for social or political reasons among my peers. (This has happened to me now and then.)
I'd like to be able to defend my decision against vaguely stated, uninformed, logically inconsistent or otherwise irrational objections, by reference to measurable and demonstrable facts. I can o...
A 'Machiavellian rationalist' would speak truths only when it is in his/her own best interests, and lie when that is more useful.
However, I think Eliezer wants to be a 'friendly rationalist'[1]. Then, speaking the truth becomes the optimal thing in many more cases. Truth-telling could also make it harder to succeed in fields like trade, politics, and war, where bluffing, misrepresentation, etc. are important. And what about 'Daddy, do you like this drawing I spent three hours making for you?'
And sometimes lying seems simply the best choice - no matter what Kant thinks. The...
From what I've read and been able to understand (a little) after spending three months reading this blog, rationality is (just to restate what Eliezer has already said many times) "something that helps us get more of what we want". Please correct me if I have got that dead wrong, or else I might be "de-rationalised" in a rational way. I'm a pathological liar; I have little to hide (nobody who knows me visits this blog, I think, and if somebody does and happens to read this, then he wouldn't mind it, I'm sure).
"........- the theory ...
If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood.
It seems to me that the hard part of both is being "more confused by fiction than reality", and that the only relevant additional concern when detecting falsehood is to have a reasonable prior on deliberate deception.
The problem, as I see it, is that it's not possible to lie to people and simultaneously act in their interests.
Lying to someone is an inherently hostile act, and indicates that (at least in regard to the matter at hand) you're enemies. There may be very special cases in which it's the act of an ally or friend (in the same way that semi-hostile microorganisms may be beneficial to our immune functioning) but they must be quite rare.
Even if some of the consequences of the lie are 'good', you're reinforcing the other person's tendencies towards irrationality by getting them to believe untruths and profiting by it.
Truth-telling is necessary but not sufficient for honesty. Something more is required: an admission of epistemic weakness. You needn't always make the admission openly to your audience (social conventions apply), but the possibility that you might be wrong should not leave your thoughts. A genuinely honest person should not only listen to objections to his or her favorite assumptions and theories, but should actively seek to discover such objections.
What's more, people tend to forget that their long-held assumptions are assumptions and treat them as facts. Forgotten assumptions are a major impediment to rationality--hence the importance of overcoming bias (the action, not the blog) to a rationalist.
Now that you mention it, I have kind of noticed this. Social skills appear to consist of a certain amount of glossing over details so that they don't derail the more important thought processes. You don't mention aloud exactly everything you are thinking, because if you do you end up labeled "dork".
Writing/typing is a little different because you know in advance that whatever comes out is not going to be perceived immediately by anyone, so it goes through fewer filters at first. Afterwards you have the chance to edit it to be less embarrassing...
For a long, long time I've tried not to say anything false. Usually this takes the form of caveats or 'weasel words'. This can make me seem less confident than I actually am, so I guess they're misleading in their own way.
Not sure how this influences self-deception. It probably makes it easier for me to misremember my confidence.
In practice, the people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.
It's the same "liar circuitry" that you're fighting, or indulging, in the internal or external case - that would be my second guess for why rational people tend to be honest people.
I have another alternate hypothesis: most normal people are such poor rationalists that it simply isn't worth the effort to develop proper "lying skills" (if that's an acceptable term - yes, I kn...
My view of the relationship between honesty and rationality is similar to taw's 'enhancing reality' theory in his comment on this page, but I would think of it as an 'augmented reality' theory: I don't think we are designed to lie; rather, the truth is normally very complex, and we are designed to simplify.
I think there are two basic factors that limit honesty: language, and the way the human mind works.
LANGUAGE
Even a person who is trying to tell the truth and be honest is always going to have language problems. It's important to remember that new words ar...
E.-
I think you are refining yourself for very good reasons. It could happen that you help 'raise' a seed AI someday. Like the quote on your pin, there is 'be the person you are seeking'. Perhaps your personal honesty practices are very important for your AI work. If you ever find yourself working with an evolving machine consciousness, it might be very good that you've taken time to search yourself.
I like your comment about feeling your nervous system change while running into that inconvenience-of-reality-what-to-say-or-do-myaaa!... thing. I think it's...
There is no theorem which proves a rationalist must be honest
I'd like to see the non-existence proof of that claim. Absent that, it should really start "I am not aware of the existence of a theorem which..."
How do we know that there is no such theorem?
When I expect to meet new people who have no idea who I am, I often wear a button on my shirt that says:
SPEAK THE TRUTH,
EVEN IF YOUR VOICE TREMBLES
Honesty toward others, it seems to me, obviously bears some relation to rationality. In practice, the people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.
And yet it must be admitted and fully acknowledged, that such morals are encoded nowhere in probability theory. There is no theorem which proves a rationalist must be honest - must speak aloud their probability estimates. I have said little of honesty myself, these past two years; the art which I've presented has been more along the lines of:
SPEAK THE TRUTH INTERNALLY,
EVEN IF YOUR BRAIN TREMBLES
I do think I've conducted my life in such fashion, that I can wear the original button without shame. But I do not always say aloud all my thoughts. And in fact there are times when my tongue emits a lie. What I write is true to the best of my knowledge, because I can look it over and check before publishing. What I say aloud sometimes comes out false because my tongue moves faster than my deliberative intelligence can look it over and spot the distortion. Oh, we're not talking about grotesque major falsehoods - but the first words off my tongue sometimes shade reality, twist events just a little toward the way they should have happened...
From the inside, it feels a lot like the experience of un-consciously-chosen, perceptual-speed, internal rationalization. I would even say that so far as I can tell, it's the same brain hardware running in both cases - that it's just a circuit for lying in general, both for lying to others and lying to ourselves, activated whenever reality begins to feel inconvenient.
There was a time - if I recall correctly - when I didn't notice these little twists. And in fact it still feels embarrassing to confess them, because I worry that people will think: "Oh, no! Eliezer lies without even thinking! He's a pathological liar!" For they have not yet noticed the phenomenon, and actually believe their own little improvements on reality - their own brain being twisted around the same way, remembering reality the way it should be (for the sake of the conversational convenience at hand). I am pretty damned sure that I lie no more pathologically than average; my pathology - my departure from evolutionarily adapted brain functioning - is that I've noticed the lies.
The fact that I'm going ahead and telling you about this mortifying realization - that despite my own values, I literally cannot make my tongue speak only truth - is one reason why I am not embarrassed to wear yon button. I do think I meet the spirit well enough.
It's the same "liar circuitry" that you're fighting, or indulging, in the internal or external case - that would be my second guess for why rational people tend to be honest people. (My first guess would be the obvious: respect for the truth.) Sometimes the Eli who speaks aloud in real-time conversation, strikes me as almost a different person than the Eliezer Yudkowsky who types and edits. The latter, I think, is the better rationalist, just as he is more honest. (And if you asked me out loud, my tongue would say the same thing. I'm not that internally divided. I think.)
But this notion - that external lies and internal lies are correlated by their underlying brainware - is not the only view that could be put forth, of the interaction between rationality and honesty.
An alternative view - which I do not myself endorse, but which has been put forth forcefully to me - is that the nerd way is not the true way; and that a born nerd, who seeks to become even more rational, should allow themselves to lie, and give themselves safe occasions to practice lying, so that they are not tempted to twist around the truth internally - the theory being that if you give yourself permission to lie outright, you will no longer feel the need to distort internal belief. In this view the choice is between lying consciously and lying unconsciously, and a rationalist should choose the former.
I wondered at this suggestion, and then I suddenly had a strange idea. And I asked the one, "Have you been hurt in the past by telling the truth?" "Yes", he said, or "Of course", or something like that -
(- and my brain just flashed up a small sign noting how convenient it would be if he'd said "Of course" - how much more smoothly that sentence would flow - but in fact I don't remember exactly what he said; and if I'd been speaking out loud, I might have just said, "'Of course', he said" which flows well. This is the sort of thing I'm talking about, and if you don't think it's dangerous, you don't understand at all how hard it is to find truth on real problems, where a single tiny shading can derail a human train of thought entirely -)
- and at this I suddenly realized, that what worked for me, might not work for everyone. I haven't suffered all that much from my project of speaking truth - though of course I don't know exactly how my life would have been otherwise, except that it would be utterly different. But I'm good with words. I'm a frickin' writer. If I need to soften a blow, I can do with careful phrasing what would otherwise take a lie. Not everyone scores an 800 on their verbal SAT, and I can see how that would make it a lot harder to speak truth. So when it comes to white lies, in particular, I claim no right to judge - and also it is not my primary goal to make the people around me happier.
Another counterargument that I can see to the path I've chosen - let me quote Roger Zelazny:
If detecting falsehood and discovering truth are not the same skill in practice, then practicing honesty probably makes you better at discovering truth and worse at detecting falsehood. If I thought I was going to have to detect falsehoods - if that, not discovering a certain truth, were my one purpose in life - then I'd probably apprentice myself out to a con man.
What, in your view, and in your experience, is the nature of the interaction between honesty and rationality? Between external truthtelling and internal truthseeking?