I had some reasons for not picking the ones you included.

  • Willingness to change mind when presented with evidence

I could see someone rationally taking an anti-dialectical stance if they think the evidence they are being given is invalid or biased, for example an instance of Bayesian persuasion.

  • Interest in improving reasoning and decision-making skills

Someone could be committed to the goal of discovering truth yet unable to prioritize it at the moment, and so have no interest in improving these skills right now.

  • Commitment to intellectual honesty

I believe one could rationally decide that, in a given circumstance, the best way to pursue the goal of truth discovery might not involve intellectual honesty.

when you think of something that makes it seem more likely that bond prices will go up, then you feel less likely to need an excuse for bond prices going down or remaining the same

Would this require more excuses?

If listeners end up in a world where the likely thing happened, they will need fewer explanations, and those explanations can be lower quality and still convince them. If a low-probability event happens, that is your time to shine with a very convincing, and possibly hard-to-generate, explanation, which might demand more time. So if your job is on the line and you need to perform well, good performance will potentially demand more time dedicated to generating the harder explanations for the low-likelihood events.
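
To put rough numbers on that intuition, here is a minimal sketch in Python (the probabilities and effort values are my own illustrative assumptions, not figures from the discussion): it weights each outcome's explanation effort by its probability, showing how a rare but hard-to-explain outcome can claim as much of your preparation time as the common, easy one.

```python
# Minimal sketch with made-up numbers: expected time spent generating
# explanations, assuming rarer outcomes need costlier explanations.
outcomes = {
    "bond prices rise": {"prob": 0.6, "effort": 1.0},  # expected outcome, easy to explain
    "bond prices flat": {"prob": 0.3, "effort": 2.0},
    "bond prices fall": {"prob": 0.1, "effort": 6.0},  # rare outcome, hard to explain away
}

# Expected effort is the probability-weighted sum of per-outcome explanation effort.
expected_effort = sum(o["prob"] * o["effort"] for o in outcomes.values())
print(f"Expected explanation effort: {expected_effort:.2f}")

# Per-outcome contributions: with these numbers, the rare, hard-to-explain
# outcome contributes as much (0.6) as the common, easy one.
for name, o in outcomes.items():
    print(f"{name}: {o['prob'] * o['effort']:.2f}")
```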

Communism, around the middle of the last century, was a question with an objectively correct answer (no).

It was a question that seemed to be answered incorrectly more often than not.

But it required some degree of intelligence to even understand it.

Smart people got it wrong more often than others.

I wonder what other bad ideas exist, outside the scope of human comprehension, that intelligent beings are also more likely than not to get wrong.