If you're a human and you want to have correct beliefs, you must make a special effort to seek evidence that your beliefs are wrong. One of our known defects is our tendency to stick with our beliefs for too long. But if you do this successfully, you will become less certain and therefore less determined.
Normatively, seeking disconfirmation and not finding it should make you more certain. And if you do become less certain, I'm not convinced this necessarily makes you less determined – why couldn't it heighten your curiosity, or (especially if you have...
Where do you find in that link the suggestion that rationalists should be less confident?
One who sees that people generally overestimate themselves, and responds by downgrading their own self-confidence, imitates the outward form of the art without the substance.
One who seeks only to destroy their beliefs practices only half the art.
The rationalist is precisely as confident as the evidence warrants. But if he has too little evidence to vanquish his priors, he does not sit content with precisely calibrated ignorance. If the issue matters to him, he must s...
I'm not aware of studies showing that those in the upper 10% overestimate their abilities. Anyone trying to increase their rationality is probably in the upper 10% already.
My recollection is that at least one study showed some regression to the mean in confidence -- highly-skilled people tended to underestimate themselves.
Who do you think is going to be more motivated to think about math: someone who feels it is their duty to become smarter, or a naive student who believes he or she has the answer to some mathematical problem and is only lacking a proof?
Or, how about the student who believes they may have the answer, and has a burning itch to know whether this is the case?
(Really, though, it's going to be the one with something to protect.)
I'd say the benefits have to outweigh the costs. If you succeed in achieving your goal despite holding a significant number of false beliefs relevant to this goal, it means you got lucky: Your success wasn't caused by your decisions, but by circumstances that just happened to be right.
That the human brain is wired in such a way that self-deception gives us an advantage in some situations may tip the balance a little bit, but it doesn't change the fact that luck only favors us a small fraction of the time, by definition.
If there is no known way to correct for a bias, the thing to do is to find one. Swerving an arbitrary amount in the right direction will not do -- reversed stupidity etc.
I once saw a poster in a chemist's shop bluntly asserting, "We all eat too much salt." What was I supposed to do about that? No matter how little salt I take in, or how far I reduce it, that poster would still be telling me the same thing. No, the thing to do, if I think it worth attending to, would be to find out my actual salt intake and what it should actually be. Then "surrender to the truth" and confidently do what the result of that enquiry tells me.
If someone finds it hard to do what they believe they should and can do, then their belief is mistaken, or at least incomplete. They have other reasons for not doing it, reasons they are probably unaware of while they merely fret about what they ought to be doing. Compelling oneself is unnecessary when there is nothing to overcome. The root of indecision is conflict, not doubt; irrationality, not rationality.
Here's a quote about rationality in action from a short story recently mentioned on LW, a classic of SF that everyone with an interest in rationality should read. I find that a more convincing picture than one of supine doubt.
Swerving an arbitrary amount in the right direction will not do -- reversed stupidity etc.
Reversing stupidity is not the same thing as swerving an arbitrary amount in the right direction. And the amount is not arbitrary: like most of my belief changes, it is based on my intuition. This post by Robin Hanson springs to mind; see the last sentence before the edit.
Anyway, some positive thoughts I have about myself are obviously unwarranted. I'm currently in the habit of immediately doubting spontaneous positive thoughts (because of what I've read about o...
Consider the problem of an agent who is offered a chance to improve their epistemic rationality for a price. What is such an agent's optimal strategy?
A complete answer to this problem would involve a mathematical model to estimate the expected increase in utility associated with having more correct beliefs. I don't have a complete answer, but I'm pretty sure about one thing: From an instrumental rationalist's point of view, to always accept or always refuse such offers is downright irrational.
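To make the shape of the argument concrete, here's a minimal toy sketch (my own illustration, with made-up payoffs and a made-up "probability of acting correctly" parameter, not a complete model): whether paying for better beliefs is worth it depends on the stakes of the decisions those beliefs feed into.

```python
# Toy sketch (my own numbers and framing, not from the comment above): an agent
# deciding whether to pay `price` for a belief improvement that raises the
# probability of acting correctly in some upcoming decision.

def expected_value(p_correct, gain, loss):
    """Expected payoff of a decision that pays `gain` if right and `loss` if wrong."""
    return p_correct * gain + (1 - p_correct) * loss

def should_accept(p_now, p_improved, gain, loss, price):
    """Accept the offer iff the expected gain from better beliefs exceeds its price."""
    return expected_value(p_improved, gain, loss) - expected_value(p_now, gain, loss) > price

# The same offer can be worth taking or not, depending on the stakes,
# so neither "always accept" nor "always refuse" is the optimal policy:
print(should_accept(0.6, 0.9, gain=100, loss=-20, price=10))  # True: high-stakes decision
print(should_accept(0.6, 0.9, gain=1, loss=-0.2, price=10))   # False: trivial stakes
```

In this framing the answer flips with the payoffs alone, which is all the argument needs: a blanket policy of accepting or refusing can't be optimal.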
And now for the kicker: You might be such an agent.
One technique that humans can use to work towards epistemic rationality is to doubt themselves, since most people think they are above average in a wide variety of areas (and it's reasonable to assume that merit in at least some of these areas is normally distributed). But having a negative explanatory style, which is one way to doubt yourself, has been linked with sickness and depression.
And the reverse also holds. Humans also seem to be rewarded for a certain set of beliefs: those that help them maintain a somewhat good assessment of themselves. Having an optimistic explanatory style (in a nutshell, explaining good events in a way that makes you feel good, and explaining bad events in a way that doesn't make you feel bad) has been linked with success in sports, sales, and school.
If you're unswayed by my empirical arguments, here's a theoretical one. If you're a human and you want to have correct beliefs, you must make a special effort to seek evidence that your beliefs are wrong. One of our known defects is our tendency to stick with our beliefs for too long. But if you do this successfully, you will become less certain and therefore less determined.
In some circumstances, it's good to be less determined. But in others, it's not. And to say that one should always look for disconfirming evidence, or that one should always avoid looking for disconfirming evidence, is ideological according to the instrumental rationalist.
Who do you think is going to be more motivated to think about math: someone who feels it is their duty to become smarter, or a naive student who believes he or she has the answer to some mathematical problem and is only lacking a proof?
You rarely see a self-help book, entrepreneurship guide, or personal development blog telling people how to be less confident. But that's what an advocate of rationalism does. The question is, do the benefits outweigh the costs?