Furcas comments on Accuracy Versus Winning - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I'd say the benefits have to outweigh the costs. If you succeed in achieving your goal despite holding a significant number of false beliefs relevant to this goal, it means you got lucky: Your success wasn't caused by your decisions, but by circumstances that just happened to be right.
That the human brain is wired in such a way that self-deception gives us an advantage in some situations may tip the balance a little bit, but it doesn't change the fact that luck only favors us a small fraction of the time, by definition.
OK, I see you don't believe me that you should sometimes accept and sometimes reject epistemic rationality for a price. So here's a simple mathematical model:
Let's say agent A accepts the offer of increased epistemic rationality for a price, and agent N has not accepted it. P is the probability A will decide differently than N. F(A or N) is the expected value of N's original course of action as a function of the agent who takes it, while S(A) is the expected value of the course of action that A might switch to. If there is a cost C associated with becoming agent A, then agent N should become agent A if and only if
(1 - P) * F(A) + P * S(A) - C >= F(N)
The left side of the inequality is not bigger than the right side "by definition"; it depends on the circumstances. Eliezer's dessert-ordering example is a situation where the above inequality does not hold.
If you complain that agent N can't possibly know all the variables in the equation, then I agree with you. He will be estimating them somewhat poorly. However, that complaint in no way supports the view that the left side is in fact bigger. Someone once said that "Anything you need to quantify can be measured in some way that is superior to not measuring it at all." Just like the difficulty of measuring utility is not a valid objection to utilitarianism, the difficulty of guessing what a better-informed self would do is not a valid objection to using this equation.
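The decision rule above can be sketched in a few lines of code. This is only an illustration of the inequality, not anything from the original thread; all the numbers below are hypothetical, chosen to show one case where paying for better information is worth it and one where it isn't.

```python
# Decision rule from the comment above: agent N should pay cost C to become
# the better-informed agent A if and only if
#   (1 - P) * F(A) + P * S(A) - C >= F(N)
# where P is the probability A decides differently than N, F(X) is the
# expected value of N's original action when taken by agent X, and S(A) is
# the expected value of the action A might switch to.

def should_become_A(P, F_A, S_A, F_N, C):
    """True iff the expected value of becoming agent A (keep the original
    action with probability 1 - P, switch with probability P, minus the
    cost C) is at least the expected value of staying agent N."""
    return (1 - P) * F_A + P * S_A - C >= F_N

# Hypothetical case where the extra information pays for itself:
print(should_become_A(P=0.3, F_A=10, S_A=20, F_N=10, C=1))
# True: 0.7*10 + 0.3*20 - 1 = 12 >= 10

# Dessert-ordering-style case: the information rarely changes the decision,
# and the cost outweighs the small expected gain:
print(should_become_A(P=0.05, F_A=10, S_A=11, F_N=10, C=0.5))
# False: 0.95*10 + 0.05*11 - 0.5 = 9.55 < 10
```

As the surrounding comments note, the hard part in practice is estimating P, F, and S at all, not evaluating the inequality once you have them.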
That's a funny definition of "luck" you're using.
Yes, the right side can be bigger, and occasionally it will be. If you get lucky.
If the information that N chooses to remain ignorant of happens to be of little relevance to any decision N will take in the future, and if his self-deception allows him to be more confident than he would have been otherwise, and if this increased confidence grants him a significant advantage, then the right side of the inequality will be bigger than the left side.
It is? Why do you think people are pleasantly surprised when they get lucky, if not because it's a rare occurrence?
Not quite.
The information could be of high relevance, but it could so happen that it won't cause him to change his mind.
He could be choosing among close alternatives, so switching to a slightly better alternative could be of limited value.
Remember also that failure to search for disconfirming evidence doesn't necessarily constitute self-deception.
Sorry, I guess your definition of luck was reasonable. But in this case, it's not necessarily true that the probability of the right side being greater is lower than 50%. In which case you wouldn't always have to "get lucky".
I've been thinking about this on and off for an hour, and I've come to the conclusion that you're right.
My mistake comes from the fact that the examples I was using to think about this were all examples where one has low certainty about whether the information is irrelevant to one's decision making. In that case, the odds are that being ignorant will yield a less than maximal chance of success. However, there are situations in which it's possible to know with great certainty that some piece of information is irrelevant to one's decision making, even if you don't know what the information is. These situations are mostly those that are limited in scope and involve a short-term goal, like making a favorable first impression, or giving a good speech. For instance, you might suspect that your audience hates your guts, and knowing that this is in fact the case would make you less confident during your speech than merely suspecting it, so you'd be better off waiting until after the speech to find out about this particular fact.
Although, if I were in that situation, and they did hate my guts, I'd rather know about it and find a way to remain confident that doesn't involve willful ignorance. That said, I have no difficulty imagining a person who is simply incapable of finding such a way.
I wonder, do all situations where instrumental rationality conflicts with epistemic rationality have to do with mental states over which we have no conscious control?
Wow, this must be like the 3rd time that someone on the internet has said that to me! Thanks!
If you think of a way, please tell me about it.
Information you have to pay money for doesn't fit into this category.
On the contrary: "luck" is a function of confidence in two ways. First, people volunteer more information and assistance to those who are confident about a goal. And second, the confident are more likely to notice useful events and information relative to their goals.
Those two things are why people think the "law of attraction" has some sort of mystical power. It just means they're confident and looking for their luck.
As the post hinted, self-deception can give you confidence which is useful in almost all real life situations, from soldier to socialite. Far from "tipping the balance a little bit", a confidence upgrade is likely to improve your life much more than any amount of rationality training (in the current state of our Art).
Too vague. It's not clear what your argument's denotation is, but the connotation (that becoming overconfident is vastly better than trying to be rational) is a strong and dubious assertion that needs more support to move outside the realm of punditry.
People who debate this often seem to argue for an all-or-nothing approach. I suspect the answer lies somewhere in the middle: be confident if you're a salesperson but not if you're a general, for instance. I might look like a member of the "always-be-confident" side to all you extreme epistemic rationalists, but I'm not.
I think a better conclusion is: be confident if you're being evaluated by other people, but cautious if you're being evaluated by reality.
A lot of the confusion here seems to be people with more epistemic than instrumental rationality having difficulty with the idea of deliberately deceiving other people.
But there is another factor: humans penalize themselves for doubt. If they (correctly) estimate their ability as low, they may decide not to try at all, and therefore fail to improve. The doubt's what I'm interested in, not tricking others.
A valid point! However, I think it is the decision to not try that should be counteracted, not the levels of doubt/confidence. That is, cultivate a healthy degree of hubris--figure out what you can probably do, then aim higher, preferably with a plan that allows a safe fallback if you don't quite make it.
If I could just tell myself to do things and then do them exactly how I told myself, my life would be fucking awesome. Planning isn't hard. It's the doing that's hard.
Someone could (correctly) estimate their ability as low and rationally give it a try anyway, but I think their effort would be significantly lower than someone who knew they could do something.
Edit: I just realized that someone reading the first paragraph might get the idea that I'm morbidly obese or something like that. I don't have any major problems in my life--just big plans that are mostly unrealized.
You may be correct, and as someone with a persistent procrastination problem I'm in no position to argue with your point.
But still, I am hesitant to accept a blatant hack (actual self-deception) over a more elegant solution (finding a way to expend optimal effort while still having a rational evaluation of the likelihood of success).
For instance, I believe that another LW commenter, pjeby, has written about the issues related to planning vs. doing on his blog.
Yeah, I've read some of pjeby's stuff, and I remember being surprised by how non-epistemically rational his tips were, given that he posts here. (If I had remembered any of the specific tips, I probably would have included them.)
If you change your mind and decide to take the self-deception route, I recommend this essay and subsequent essays as steps to indoctrinate yourself.
An excellent heuristic, indeed!
It depends on the cost of overconfidence. Nothing ventured, nothing gained. But if the expected cost of venturing wrongly is greater than the expected return, it's better to be careful what you attempt. If the potential loss is great enough, cautiousness is a virtue. If there's little investment to lose, cautiousness is a vice.
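The trade-off in the comment above can be made concrete with a tiny expected-value check. This is my own illustrative sketch, not something from the thread, and the numbers are made up: venture only when the expected gain outweighs the expected cost of venturing wrongly.

```python
# Sketch of the "nothing ventured, nothing gained" trade-off above:
# trying is worth it iff its expected value beats the guaranteed 0 of
# not trying. All figures below are hypothetical.

def worth_venturing(p_success, gain, loss):
    """True iff expected gain from trying exceeds expected loss."""
    return p_success * gain - (1 - p_success) * loss > 0

# Little investment to lose: cautiousness is a vice.
print(worth_venturing(0.2, 100, 10))
# True: 0.2*100 - 0.8*10 = 12 > 0

# Potential loss is great: cautiousness is a virtue.
print(worth_venturing(0.2, 10, 100))
# False: 0.2*10 - 0.8*100 = -78 < 0
```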
Right.
IMO John_Maxwell_IV described the benefits of confidence quite well. For the other side see my post where I explicitly asked people what benefit they derive from the OB/LW Art of Rationality in its current state. Sorry to say, there weren't many concrete answers. Comments went mostly along the lines of "well, no tangible benefits for me, but truth-seeking is so wonderful in itself". If you can provide a more convincing answer, please do.