In Never Split The Difference: Negotiating As If Your Life Depended On It, Chris Voss discusses cognitive biases:
Through decades of research with Tversky, Kahneman proved that humans all suffer from Cognitive Bias, that is, unconscious—and irrational—brain processes that literally distort the way we see the world. Kahneman and Tversky discovered more than 150 of them.
There’s the Framing Effect, which demonstrates that people respond differently to the same choice depending on how it is framed (people place greater value on moving from 90 percent to 100 percent—high probability to certainty—than from 45 percent to 55 percent, even though they’re both ten percentage points) (p. 12).
Isn’t it rational to value 90% → 100% more than 45% → 55%?
Even going from 90% to 95% means you are wrong half as often (5% of the time instead of 10%), whereas going from 45% to 55% only removes about 18% of your errors (the error rate drops from 55% to 45%).
Is my thinking and math correct? If not, how am I wrong?
Assuming I’m right, I would also really appreciate a better way to explain this.
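To make the comparison concrete, here is a quick numeric check of the arithmetic above (the function name is my own, not from the book or the question):

```python
def relative_error_reduction(p_before, p_after):
    """Fraction of errors eliminated when the chance of being right
    rises from p_before to p_after."""
    err_before = 1 - p_before
    err_after = 1 - p_after
    return (err_before - err_after) / err_before

print(relative_error_reduction(0.90, 0.95))  # ~0.5  (10% wrong -> 5% wrong)
print(relative_error_reduction(0.45, 0.55))  # ~0.18 (55% wrong -> 45% wrong)
print(relative_error_reduction(0.90, 1.00))  # 1.0   (all remaining errors gone)
```

Measured this way, the move from high probability to certainty really does eliminate a much larger share of your errors than an equal-sized move in the middle of the range.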
If you want to prove a bunch of theorems involving continuity and infinity, then treating 0 and 1 as probabilities is much more elegant; things mostly fall apart without them, yes.
If your goal is to reason under uncertainty, thinking in terms of odds ratios and decibels is a way of putting your map in close correspondence with the territory. Allowing for infinities in this use case introduces complications and weird philosophical questions about the (in)finiteness of reality.
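To show what those infinities look like in that representation, here is a small sketch of my own (not from any particular post) converting probabilities to odds and decibels; certainty and impossibility land at plus and minus infinity:

```python
import math

def prob_to_odds(p):
    """Odds in favor: p / (1 - p). Certainty (p = 1) gives infinite odds."""
    if p == 1:
        return math.inf
    return p / (1 - p)

def odds_to_decibels(odds):
    """Log-odds measured in decibels: 10 * log10(odds)."""
    if odds == 0:
        return -math.inf
    return 10 * math.log10(odds)

for p in (0.0, 0.5, 0.9, 0.99, 1.0):
    odds = prob_to_odds(p)
    print(p, odds, odds_to_decibels(odds))
# p = 0 and p = 1 sit at -inf and +inf decibels: no finite amount of
# evidence can move you onto or off of them.
```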
On Earth, most people start out by learning probability theory in terms of probabilities, for the purpose of solving math problems or proving theorems in school. Later (if they stumble across the right kinds of blogs), they learn probability as a reasoning tool, but they often forget, or don't realize, that thinking in terms of odds ratios is much more convenient for this purpose once you get used to it.
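The convenience being pointed at is that Bayesian updating becomes simple addition in log-odds; a rough sketch with made-up numbers, just to illustrate the idea:

```python
import math

def decibels(p):
    """Log-odds of p, in decibels."""
    return 10 * math.log10(p / (1 - p))

def prob_from_decibels(db):
    """Inverse: recover a probability from decibels of log-odds."""
    odds = 10 ** (db / 10)
    return odds / (1 + odds)

# Prior: 1 in 100, i.e. about -20 dB.
prior_db = decibels(0.01)
# Two independent pieces of evidence, each a 10:1 likelihood ratio,
# contribute +10 dB apiece; updating is just addition.
posterior_db = prior_db + 10 + 10
print(prob_from_decibels(posterior_db))  # ~0.5
```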
On a planet where people grew up studying probability as a reasoning tool first, and only as an afterthought studied it as a branch of math, someone might need to write a blog post pointing out that 0 and 1 are basically just ordinary probabilities, and that sometimes probabilities are more elegant and intuitive than odds and decibels, lest people start over-complicating their proofs.
I don't see anything wrong or contradictory with pointing out the difference between probability as mathematical theory and probability as reasoning method.