Jonathan_Graehl comments on PredictionBook.com - Track your calibration - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Do you have any evidence for this? I don't remember any strongly domain-specific results in Tetlock's study, in the book I read about calibration in business, or in any other studies. Nor does Wikipedia mention anything except domain experts being overconfident (as opposed to people being random outside their domain even when supposedly calibrated, as you imply), and overconfidence is fixable with calibration training.
And this is what I would expect, given that the question is not about accuracy (one would hope experts would win in their own domain) but about calibration: why couldn't one accurately assess, in general, one's own ignorance?
(I have >1100 predictions registered on PB.com and >=240 judged so far; I can't say I've noticed any especial domain-related correlations.)
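(To make "noticing domain-related correlations" concrete: the usual calibration check is to bucket judged predictions by stated confidence and compare each bucket's average stated probability to its empirical hit rate, optionally per domain. The sketch below assumes made-up `(probability, outcome)` pairs, not actual PredictionBook data, and is not PB.com's own scoring code.)

```python
# Minimal calibration check: bucket predictions by stated confidence and
# compare each bucket's mean stated probability to its empirical hit rate.
# A well-calibrated forecaster's 70% predictions should come true ~70% of
# the time. The sample data below is fabricated for illustration.

def calibration_table(predictions, n_buckets=10):
    """predictions: list of (p, outcome) with p in [0, 1], outcome in {0, 1}."""
    buckets = [[] for _ in range(n_buckets)]
    for p, outcome in predictions:
        i = min(int(p * n_buckets), n_buckets - 1)  # clamp p == 1.0 into last bucket
        buckets[i].append((p, outcome))
    table = []
    for b in buckets:
        if not b:
            continue
        mean_p = sum(p for p, _ in b) / len(b)
        hit_rate = sum(o for _, o in b) / len(b)
        table.append((mean_p, hit_rate, len(b)))
    return table

# Fabricated example: four 70% predictions (3 came true), three 30% ones (1 came true).
sample = [(0.7, 1), (0.7, 1), (0.7, 0), (0.7, 1), (0.3, 0), (0.3, 1), (0.3, 0)]
for mean_p, hit_rate, n in calibration_table(sample):
    print(f"stated {mean_p:.2f} -> observed {hit_rate:.2f} (n={n})")
```

Running the same table on per-domain subsets of one's judged predictions is one way to check whether calibration actually degrades outside one's own domain.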
p.s. that's a lot of predictions :)
How many would you have thought gwern had?
I found this question puzzling and difficult to answer (I'm sleep-deprived). Funny joke, if you were sneakily trying to get me to make a prediction.
Unfortunately I'm pretty well anchored now.
I'd expect LW-haunters who decide to make predictions at PB.com to make 15 on the first day and 10 in the next year (with a mode of 0).