I used to have an adage to the effect that if you walk away from an argument feeling like you've processed it before a month has passed, you're probably kidding yourself. I'm not sure I would take such a strong line nowadays, but it's a useful prompt to bear in mind. It might or might not be related to another thing I sometimes say: that it takes at least a month to even begin establishing a habit. A perfect reasoner might consider all hypotheses in advance, or be able to use past data to test new ones; in practice, it seems to me that actively being on the lookout for evidence for or against a new idea is often necessary to give it a fair shake. This feels like a very specific case of noticing, namely, noticing when incoming information bears on some new idea you've heard, and updating accordingly.
When I state a position and offer evidence for it, people sometimes complain that the evidence I've given doesn't suffice to establish my position. But in these situations I'm usually not trying to give a rigorous argument, and I don't claim that the evidence I provide suffices to establish my position.
My goal in these cases is to offer a high-level summary of my thinking, and to provide enough evidence that readers have reason to make some Bayesian update toward my view and to find it sufficiently intriguing to investigate further.
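To make the kind of partial updating I have in mind concrete, here is a minimal sketch of a single application of Bayes' rule. All of the numbers are hypothetical, chosen only to illustrate how one piece of evidence can shift a credence without coming close to establishing a claim:

```python
# Minimal Bayes-rule update: how much should a single argument move you?
# All numbers below are hypothetical, chosen only for illustration.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1 - prior))
    return p_evidence_if_true * prior / p_evidence

# A non-obvious claim starts at 10% credence. Suppose a persuasive-sounding
# argument for it is 3x as likely to be encountered if the claim is true
# than if it is false.
posterior = bayes_update(prior=0.10, p_evidence_if_true=0.6, p_evidence_if_false=0.2)
print(round(posterior, 2))  # 0.25: intriguing, but far from established
```

The point of the sketch is that a 3:1 likelihood ratio takes the claim from "probably false" to "worth investigating further", which is the most a brief presentation of evidence can usually hope to accomplish.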
In general, when a position is non-obvious, a single conversation is nowhere near enough time to convince a rational person that it's very likely to be true. As Burgundy recently wrote:
When you ask Carl Shulman a question on AI, and he starts giving you facts instead of a straight answer, he is revealing part of his book. The thing you are hearing from Carl Shulman is really only the tip of the iceberg because he cannot talk fast enough. His real answer to your question involves the totality of his knowledge of AI, or perhaps the totality of the contents of his brain.
If I were to restrict myself to making claims that I could substantiate in a mere ~2 hours, that would preclude the possibility of me sharing the vast majority of what I know.
In math, one can give rigorous proofs starting from very simple axioms, as Gauss described:
I mean the word proof not in the sense of lawyers, who set two half proofs equal to a whole one, but in the sense of mathematicians, where 1/2 proof = 0, and it is demanded for proof that every doubt becomes impossible.
Even within math, as a practical matter, proofs that appear to be right are sometimes undercut by subtle errors. But outside of math, the only reliable tool one has at one's disposal is Bayesian inference. In 2009, the charity evaluator GiveWell made very strong efforts to apply careful reasoning to identify its top-rated charity, and gave a "conservative" cost-effectiveness estimate of $545/life saved, which turned out to have been wildly optimistic. Argumentation that looks solid on the surface often breaks down under close scrutiny. This is closely related to why GiveWell emphasizes the need to look at giving opportunities from many angles, and gives more weight to robustness of evidence than to careful chains of argumentation.
Eliezer named this website Less Wrong for a reason: one can never be certain of anything, and all rational beliefs reflect degrees of confidence. I believe that discussion advances rationality most when it involves sharing perspectives and evidence, rather than argumentation.