If you're a human and you want to have correct beliefs, you must make a special effort to seek evidence that your beliefs are wrong. One of our known defects is our tendency to stick with our beliefs for too long. But if you do this successfully, you will become less certain and therefore less determined.
Normatively, seeking disconfirmation and not finding it should make you more certain. And if you do become less certain, I'm not convinced this necessarily makes you less determined – why couldn't it heighten your curiosity, or (especially if you have...
Where do you find in that link the suggestion that rationalists should be less confident?
One who sees that people generally overestimate themselves, and responds by downgrading their own self-confidence, imitates the outward form of the art without the substance.
One who seeks only to destroy their beliefs practices only half the art.
The rationalist is precisely as confident as the evidence warrants. But if he has too little evidence to vanquish his priors, he does not sit content with precisely calibrated ignorance. If the issue matters to him, he must seek more evidence.
I'm not aware of studies showing that those in the upper 10% overestimate their abilities. Anyone trying to increase their rationality is probably in the upper 10% already.
My recollection is that at least one study showed some regression to the mean in confidence – highly-skilled people tended to underestimate themselves.
Who do you think is going to be more motivated to think about math: someone who feels it is their duty to become smarter, or a naive student who believes he or she has the answer to some mathematical problem and is only lacking a proof?
Or, how about the student who believes they may have the answer, and has a burning itch to know whether this is the case? Or the one with something to protect?
I'd say the benefits have to outweigh the costs. If you succeed in achieving your goal despite holding a significant number of false beliefs relevant to this goal, it means you got lucky: Your success wasn't caused by your decisions, but by circumstances that just happened to be right.
That the human brain is wired to make self-deception advantageous in some situations may tip the balance a little, but it doesn't change the fact that, by definition, luck favors us only a small fraction of the time.
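To make the luck claim concrete, here's a minimal sketch. Everything in it is assumed for illustration: an agent bets as if a gamble pays off 90% of the time when, in the actual world, it pays off 30% of the time. The agent still wins sometimes, but each win comes from circumstances lining up, not from the decision rule:

```python
# Toy illustration of "success despite false beliefs is luck".
# The probabilities below are hypothetical, chosen only for illustration.

import random

TRUE_P_SUCCESS = 0.3  # the world: the bet actually pays off 30% of the time
BELIEVED_P = 0.9      # the agent's false belief: it pays off 90% of the time

random.seed(0)
trials = 100_000
wins = sum(random.random() < TRUE_P_SUCCESS for _ in range(trials))

print(f"agent expects to win {BELIEVED_P:.0%} of the time")
print(f"agent actually wins about {wins / trials:.0%} of the time")
# Each individual win is luck: the long-run frequency is set by the
# world (TRUE_P_SUCCESS), not by what the agent believes.
```

The long-run frequency is fixed by the world no matter what the agent believes, which is the sense in which luck favors us only a small fraction of the time.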
Consider the problem of an agent who is offered a chance to improve their epistemic rationality for a price. What is such an agent's optimal strategy?
A complete answer to this problem would involve a mathematical model to estimate the expected increase in utility associated with having more correct beliefs. I don't have a complete answer, but I'm pretty sure about one thing: From an instrumental rationalist's point of view, to always accept or always refuse such offers is downright irrational.
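For concreteness, here's a minimal sketch of the kind of model I mean. Every name and number in it is hypothetical: the price, the probability that the corrected belief flips some future decision, and the utility at stake are illustrative parameters, not measured quantities.

```python
# Minimal sketch of the accept/refuse decision. All parameters are
# hypothetical placeholders, not results from any real model.

def expected_gain(p_flips_decision: float, utility_at_stake: float) -> float:
    """Expected utility from correcting one belief: the assumed probability
    that the correction flips a future decision, times the assumed utility
    difference between deciding rightly and wrongly when it does."""
    return p_flips_decision * utility_at_stake

def should_accept(price: float, p_flips_decision: float,
                  utility_at_stake: float) -> bool:
    """Accept the offer iff the expected gain exceeds the price."""
    return expected_gain(p_flips_decision, utility_at_stake) > price

# A belief that almost never matters isn't worth this price...
print(should_accept(price=10.0, p_flips_decision=0.01, utility_at_stake=100.0))   # False
# ...but the same price is a bargain when a lot rides on the belief.
print(should_accept(price=10.0, p_flips_decision=0.5, utility_at_stake=1000.0))   # True
```

The answer flips with the parameters, which is exactly why a policy of always accepting or always refusing can't be optimal.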
And now for the kicker: You might be such an agent.
One technique that humans can use to work towards epistemic rationality is to doubt themselves, since most people think they are above average in a wide variety of areas (and it's reasonable to assume that merit in at least some of these areas is normally distributed). But having a negative explanatory style, which is one way to doubt yourself, has been linked with sickness and depression.
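A quick sanity check on the statistics there: if merit is normally distributed, the mean equals the median, so only half of us can actually be above average in any given area. In the toy simulation below, the 80% self-rating is an assumed figure for illustration, not data from any particular study:

```python
# Toy check: in a normal distribution, only ~50% can be above average,
# so widespread above-average self-ratings imply miscalibration.
# The 80% self-rating below is assumed, not taken from a real survey.

import random

random.seed(0)
skills = [random.gauss(0, 1) for _ in range(100_000)]  # assumed N(0, 1) merit
mean_skill = sum(skills) / len(skills)
frac_above = sum(s > mean_skill for s in skills) / len(skills)

print(f"fraction actually above average: {frac_above:.2f}")  # ~0.50
print("fraction rating themselves above average (assumed): 0.80")
```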
The inverse also holds: humans seem to be rewarded for a certain set of beliefs, namely those that help them maintain a reasonably positive assessment of themselves. Having an optimistic explanatory style (in a nutshell, explaining good events in a way that makes you feel good, and bad events in a way that doesn't make you feel bad) has been linked with success in sports, sales, and school.
If you're unswayed by my empirical arguments, here's a theoretical one. If you're a human and you want to have correct beliefs, you must make a special effort to seek evidence that your beliefs are wrong. One of our known defects is our tendency to stick with our beliefs for too long. But if you do this successfully, you will become less certain and therefore less determined.
In some circumstances, it's good to be less determined. But in others, it's not. And to say that one should always look for disconfirming evidence, or that one should always avoid looking for it, is ideological, according to the instrumental rationalist.
Who do you think is going to be more motivated to think about math: someone who feels it is their duty to become smarter, or a naive student who believes he or she has the answer to some mathematical problem and is only lacking a proof?
You rarely see a self-help book, entrepreneurship guide, or personal development blog telling people how to be less confident. But that's what an advocate of rationalism does. The question is, do the benefits outweigh the costs?