pjeby comments on It's okay to be (at least a little) irrational - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (58)
What does that even mean? Reality doesn't contain any little <OK></OK> xml tags on concrete objects, let alone ill-defined abstractions like "irrational".
Asserting that anything is "OK" or "Not OK" properly belongs to the dark arts of persuasion and motivation, not to the realm of objective reality.
This is an extraordinary claim; the scientific evidence weighs overwhelmingly against you: it is clearly more useful to be drawn to live up to an incorrect, flattering future self-image than to focus on an image of yourself that is currently correct but unflattering.
This looks like a fully general argument against all reason, against characterizing anything with any property, applied to the purpose of attacking (a connotation of?) my judgment.
What do you mean by currently correct? Correctness of a statement doesn't change over time.
I refuse to cherish a flattering self-image that is known to be incorrect. How would it be useful for me to start believing a lie? I'm quite motivated and happy as I am, thank you very much.
...one that can be trivially remedied by rephrasing your original statement in E-Prime.
I mean that one can be aware that one is currently a sinner, while nonetheless aspiring to be a saint. The people who are most successful in their fields continuously aspire to be better than anyone has ever been before... which is utterly unrealistic, until they actually achieve it. Such falsehoods are more useful to focus on than the truth about one's past.
If you win, but you were sure you'd lose, then you were no less wrong than if you had believed you could succeed where success was impossible. People are uncertain about their future and about their abilities, but this uncertainty, this limited knowledge, is about what can actually happen. If you really can succeed in achieving what has never been seen before, your aspirations are genuine. What you know about your past is about your past; what you make of your future is a different story entirely.
Have you ever heard the saying, "If you shoot for the moon and miss... you are still among the stars?" It's more useful to aim high and fail, than to aim low by being realistic.
You are repeating yourself without introducing new arguments.
I hate to harp on about (time and) relative distances in space, but if you shoot for the Moon and miss, you are barely any closer to the stars than you were when you started.
More seriously, you don't seem to be answering Vladimir_Nesov's point at all, which is that if you think that such optimism can result in winning, then the optimism isn't irrational in the first place, and it was the initial belief of impossibility that was mistaken.
Was that really his point? If so, I missed it completely; probably because that position appears to directly contradict what he said in his previous comment.
More precisely, he appeared to be arguing that making wrong predictions (in the sense of assigning incorrect probabilities) is "not OK".
However, in order to get the benefit of "shooting for the moon", you have to actually be unrealistic, at the level of your brain's action planning system, even if intellectually you assign a different set of probabilities. (Which may be why top performers are often paradoxically humble at the same time as they act as if they can achieve the impossible.)
Yes, that really was one of the things I argued in my recent comments.
Are you arguing for the absolute necessity of doublethink? Is it now impossible to get to the high levels of achievement without doublethink?
See also: Striving to Accept.
Well, it seems a little tautological to me: only in hindsight can you be sure that your optimism was rational. At the time of your initial optimism, it may be "irrational" from a strictly mathematical perspective, even after taking into account the positive effects of optimism. Note, for example, the high rate of startup failure; if anyone really believed the odds applied to them, no one would ever start one.
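To make the hindsight point concrete, here is a minimal sketch (plain Python; the 10% base rate is an illustrative number, not a real statistic): a founder who assigns herself a calibrated-but-low probability of success can still win on any given attempt, and a single outcome says almost nothing about whether the probability estimate was right. Only the long-run frequency over many attempts can judge the estimate.

```python
import random

random.seed(0)

P_SUCCESS = 0.10   # hypothetical base rate for startup success
TRIALS = 10_000    # many founders, each making one attempt

# A single outcome cannot validate or refute the probability estimate:
# a 10% shot succeeds sometimes, and a 90% shot fails sometimes.
one_outcome = random.random() < P_SUCCESS

# But the long-run frequency converges on the base rate, which is the
# only sense in which the estimate can be called "rational" in hindsight.
wins = sum(random.random() < P_SUCCESS for _ in range(TRIALS))
observed_rate = wins / TRIALS

print(f"one founder's outcome: {'success' if one_outcome else 'failure'}")
print(f"observed success rate over {TRIALS} trials: {observed_rate:.3f}")
```

This is also why judging a decision by its single outcome is unreliable: the decision quality lives in the probability, and the probability is only visible in aggregate.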
I am not claiming that success requires "doublethink", in the sense of believing contradictory things. I'm only saying that an emotional belief in success is relevant to your success. What you think of the matter intellectually is of relatively little account, just as one's intellectual disbelief in ghosts has relatively little to do with whether you'll be able to sleep soundly in a "haunted" house.
The main drivers of our actions are found in the "near" system's sensory models, not the "far" system's abstract models. However, if the "near" system is modelling failure, it is difficult for the "far" system to believe in success... which leads to people having trouble "believing" in success, because they try to convince the far mind instead of the near one. Or they succeed in wrapping the far system in doublethink, while ignoring the "triplethink" of the near system still predicting failure.
In short, the far system and your intellectual thoughts don't matter very much. Action is not abstraction.
If you have to strive to believe something -- with either the near OR far system -- you're doing it wrong. The near system in particular is ridiculously easy to change beliefs in; all you have to do is first surface all of the relevant existing beliefs.
Striving, on the other hand, is an indication that you have conflicting beliefs in play, and need to remove one or more existing ones before trying to install a new one.
(Note: I'm not an epistemic rationalist, I'm an instrumental one. Indeed, I don't believe that any non-trivial absolute truths are any more knowable than Gödel and Heisenberg have shown us they are in other sorts of systems. I therefore don't care which models or beliefs are true, only which ones are useful. To the extent that you care about the "truth" of a model, you will find conversing with me frustrating, or at least uninformative.)
Sigh.
Wrong. When you act under uncertainty, the outcome is not the judge of the propriety of your reason, although it may point out a probable problem.
I understand that the connection isn't direct, and in some cases may be hard to establish at all, but you are always better off bringing all sides of yourself to agreement.
Yet you can't help but care which claims about models being useful are true.
AFAIK, "really believe" is used to mean both "emotionally accept" and "have as a deliberative anticipation-controller". I take it you mean the first, but given the ambiguity, we should probably not use the term. Just a suggestion.
Off-topic: See The So-Called Heisenberg Uncertainty Principle.