This. I'm skeptical of almost every numerical probability estimate I hear unless the steps are outlined to me.
This may be one reason why people are reluctant to assign numbers to beliefs in the first place. People equate numbers with certainty and authority, whereas a probability is just a way of saying how uncertain you are about something.
When giving a number for a subjective probability, I often feel like it should be a two-dimensional quantity: probability and authority. The "authority" figure would be an estimate answering: "if you disagree with me now but we manage to come to an agreement in the next 5 minutes, what are the chances that I will have to update my beliefs, versus you having to update yours?"
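A minimal sketch of what such a two-dimensional belief might look like as a data structure. The class and field names here are my own invention for illustration, not anything proposed in the thread:

```python
from dataclasses import dataclass


@dataclass
class Belief:
    """A subjective belief carrying both a probability and an 'authority'.

    Illustrative sketch only; the field semantics follow the comment above.
    """
    probability: float  # how likely I think the claim is, in [0, 1]
    authority: float    # my estimated chance that, if we disagree now but
                        # reach agreement soon, *you* update rather than me

    def __post_init__(self):
        for name in ("probability", "authority"):
            value = getattr(self, name)
            if not 0.0 <= value <= 1.0:
                raise ValueError(f"{name} must be in [0, 1], got {value!r}")


# "I'd guess 70%, but I'd probably defer to you if we disagreed":
# moderately confident, low authority.
hunch = Belief(probability=0.7, authority=0.2)
```

Separating the two numbers makes the point explicit: a confident-sounding "70%" can still come with low authority, which is exactly the distinction a single probability hides.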
I'm trying to develop a large set of elevator pitches / elevator responses for the two major topics of LW: rationality and AI.
An elevator pitch lasts 20-60 seconds and is not necessarily prompted by anything specific; at most it is prompted by something very vague, like "So, I heard you talking about 'rationality'. What's that about?"
An elevator response is a 20-60 second, highly optimized response to a commonly heard sentence or idea, for example, "Science doesn't know everything."
Examples (but I hope you can improve upon them):
"So, I hear you care about rationality. What's that about?"
"Science doesn't know everything."
"But you can't expect people to act rationally. We are emotional creatures."
"But sometimes you can't wait until you have all the information you need. Sometimes you need to act right away."
"But we have to use intuition sometimes. And sometimes, my intuitions are pretty good!"
"But I'm not sure an AI can ever be conscious."
Please post your own elevator pitches and responses in the comments, and vote for your favorites!