mattnewport comments on The Irrationality Game - Less Wrong

38 Post author: Will_Newsome 03 October 2010 02:43AM




Comment author: Vladimir_M 03 October 2010 07:45:28PM 2 points

komponisto:

Are you sure you're not just worried about poor calibration?

No, my objection is fundamental. I gave a brief explanation in the comment I linked to, but I'll restate it here.

The problem is that the algorithms that your brain uses to perform common-sense reasoning are not transparent to your conscious mind, which has access only to their final output. This output does not provide a numerical probability estimate, but only a rough and vague feeling of certainty. Yet in most situations, the output of your common sense is all you have. There are very few interesting things you can reason about by performing mathematically rigorous probability calculations (and even when you can, you still have to use common sense to establish the correspondence between the mathematical model and reality).

Therefore, there are only two ways in which you can arrive at a numerical probability estimate for a common-sense belief:

  1. Translate your vague feeling of certainty into a number in some arbitrary manner. This, however, makes the number a mere figure of speech, which adds absolutely nothing over the usual vague verbal expressions for different levels of certainty.

  2. Perform some probability calculation which has nothing to do with how your brain actually arrived at your common-sense conclusion, and then attach the resulting number to that conclusion. This is clearly fallacious.

Honestly, all this seems entirely obvious to me. I would be curious to see which points in the above reasoning are supposed to be even controversial, let alone outright false.

Comment author: mattnewport 03 October 2010 08:00:14PM 4 points

It seems plausible to me that routinely assigning numerical probabilities to testable predictions or beliefs, and tracking them over time to see how accurate those probabilities are (calibration), can lead to a better ability to reliably translate vague feelings of certainty into numerical probabilities.
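The tracking process described above can be sketched in a few lines of Python. This is only an illustration of the idea, not anything from the thread: record (stated probability, outcome) pairs, then check, for each confidence level, how often those predictions actually came true. All names and data here are hypothetical.

```python
def calibration_table(predictions, n_buckets=10):
    """Group (probability, outcome) pairs into confidence buckets and
    report, for each non-empty bucket, the mean stated probability and
    the observed fraction of predictions that came true."""
    buckets = [[] for _ in range(n_buckets)]
    for p, outcome in predictions:
        # Clamp p == 1.0 into the top bucket.
        i = min(int(p * n_buckets), n_buckets - 1)
        buckets[i].append((p, outcome))
    table = []
    for group in buckets:
        if group:
            mean_p = sum(p for p, _ in group) / len(group)
            hit_rate = sum(o for _, o in group) / len(group)
            table.append((mean_p, hit_rate, len(group)))
    return table

# A well-calibrated forecaster's 70% predictions should come true
# about 70% of the time; illustrative history below.
history = ([(0.7, True)] * 7 + [(0.7, False)] * 3 +
           [(0.9, True)] * 9 + [(0.9, False)])
for mean_p, hit_rate, n in calibration_table(history):
    print(f"stated {mean_p:.2f} -> observed {hit_rate:.2f} over {n} predictions")
```

A forecaster whose observed hit rates track their stated probabilities across buckets is well calibrated; systematic gaps (e.g. 90% claims coming true only 60% of the time) are exactly what this kind of record would expose.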

There are practical benefits to developing this ability. I would speculate, for example, that successful bookies and professional sports bettors are better at this than average, and that it is an ability they have developed through practice and experience. Anyone who has to make decisions under uncertainty could benefit from a well-developed ability to assign well-calibrated numerical probability estimates to vague feelings of certainty. Investors, managers, engineers, and others who must deal with uncertainty on a regular basis would surely find this ability useful.

I think a certain degree of skepticism is justified regarding the utility of specific methods for developing this ability (tools like predictionbook.com don't yet have hard evidence of their effectiveness), but it certainly seems like a useful ability to have, so there are good reasons to experiment with various methods that promise to improve calibration.

Comment author: Vladimir_M 03 October 2010 08:28:13PM -2 points

I addressed this point in another comment in this thread:

http://lesswrong.com/lw/2sl/the_irrationality_game/2qgm

Comment author: mattnewport 03 October 2010 08:44:50PM 3 points

I agree with most of what you're saying (in that comment and this one), but I still think that the ability to give well-calibrated probability estimates for a particular prediction is instrumentally useful, and that it is fairly likely this ability can be improved with practice. I don't take this to imply anything about humans performing actual Bayesian calculations, either implicitly or explicitly.