dspeyer comments on Rationality Quotes September 2014 - Less Wrong

Post author: jaime2000 03 September 2014 09:36PM




Comment author: dspeyer 03 September 2014 05:06:19PM 75 points

A good rule of thumb might be, “If I added a zero to this number, would the sentence containing it mean something different to me?” If the answer is “no,” maybe the number has no business being in the sentence in the first place.

Randall Munroe on communicating with humans

Comment author: Viliam_Bur 05 September 2014 01:40:21PM 10 points

Related: When (Not) To Use Probabilities:

I would advise, in most cases, against using non-numerical procedures to create what appear to be numerical probabilities. Numbers should come from numbers. (...) you shouldn't go around thinking that, if you translate your gut feeling into "one in a thousand", then, on occasions when you emit these verbal words, the corresponding event will happen around one in a thousand times. Your brain is not so well-calibrated.

This specific topic came up recently in the context of the Large Hadron Collider (...) the speaker actually purported to assign a probability of at least 1 in 1000 that the theory, model, or calculations in the LHC paper were wrong; and a probability of at least 1 in 1000 that, if the theory or model or calculations were wrong, the LHC would destroy the world.

I object to the air of authority given these numbers pulled out of thin air. (...) No matter what other physics papers had been published previously, the authors would have used the same argument and made up the same numerical probabilities

Comment author: dspeyer 05 September 2014 04:10:03PM 17 points

For the opposite claim: If It’s Worth Doing, It’s Worth Doing With Made-Up Statistics:

Remember the Bayes mammogram problem? The correct answer is 7.8%; most doctors (and others) intuitively feel like the answer should be about 80%. So doctors – who are specifically trained in having good intuitive judgment about diseases – are wrong by an order of magnitude. And it “only” being one order of magnitude is not to the doctors’ credit: by changing the numbers in the problem we can make doctors’ answers as wrong as we want.

So the doctors probably would be better off explicitly doing the Bayesian calculation. But suppose some doctor’s internet is down (you have NO IDEA how much doctors secretly rely on the Internet) and she can’t remember the prevalence of breast cancer. If the doctor thinks her guess will be off by less than an order of magnitude, then making up a number and plugging it into Bayes will be more accurate than just using a gut feeling about how likely the test is to work. Even making up numbers based on basic knowledge like “Most women do not have breast cancer at any given time” might be enough to make Bayes Theorem outperform intuitive decision-making in many cases.
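For concreteness, here is a sketch of the Bayesian calculation Yvain refers to, using the numbers standardly attached to this puzzle (the comment itself doesn't quote them, so treat them as assumptions): 1% prevalence, 80% sensitivity, and a 9.6% false positive rate.

```python
# Bayes' theorem applied to the classic mammogram problem.
# The three input numbers below are the ones usually used in this
# puzzle; they are assumptions, not quoted in the comment above.

prevalence = 0.01          # P(cancer): 1% of women have breast cancer
sensitivity = 0.80         # P(positive | cancer): true positive rate
false_positive = 0.096     # P(positive | no cancer): false positive rate

# Total probability of a positive test result
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Posterior probability of cancer given a positive test
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(f"{p_cancer_given_positive:.1%}")  # prints 7.8%
```

The point of the passage is that even rough inputs ("most women do not have breast cancer") keep this calculation closer to the true 7.8% than the ~80% most people intuit.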

I tend to side with Yvain on this one, at least so long as your argument isn't going to be judged by its appearance. Specifically on the LHC thing, I think making up the 1 in 1000 makes it possible to substantively argue about the risks in a way that "there's a chance" doesn't.

Comment author: RichardKennaway 14 September 2014 06:48:40AM 4 points

A detailed reading provides room for these to coexist. Compare:

If I added a zero to this number

with

off by less than an order of magnitude

Comment author: [deleted] 14 September 2014 03:23:15PM 2 points

I'd agree with Randall Munroe more wholeheartedly if he had said “added a couple of zeros” instead.