Jonathan_Graehl comments on PredictionBook.com - Track your calibration - Less Wrong

Post author: Eliezer_Yudkowsky, 14 October 2009 12:08AM (34 points)


Comment author: gwern, 17 August 2011 10:36:09PM, 1 point

Do you have any evidence for this? I don't remember any strongly domain-specific results in Tetlock's study, in the book I read about calibration in business, or in any other studies. Nor does Wikipedia mention anything beyond domain experts being overconfident (as opposed to your implication that people are no better than chance outside their domain even when apparently calibrated within it), and that overconfidence is fixable with calibration training.

And this is what I would expect, given that the question is not about accuracy (one would hope experts win within their own domain) but about calibration: why shouldn't one be able to accurately assess one's ignorance in general?

(I have >1,100 predictions registered on PB.com and at least 240 judged so far; I can't say I've noticed any especial domain-related correlations.)
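
As a rough illustration of what checking one's calibration involves: bucket judged predictions by stated confidence, then compare each bucket's average confidence with the observed frequency of correct outcomes. The Python sketch below is a minimal version of that idea, not gwern's or PredictionBook's actual code, and the `judged` data is made up.

    from collections import defaultdict

    def calibration_table(predictions, n_buckets=10):
        """predictions: iterable of (stated_probability, came_true) pairs,
        with stated_probability in [0, 1] and came_true a bool."""
        buckets = defaultdict(list)
        for p, outcome in predictions:
            # Bucket by stated confidence; clamp p = 1.0 into the top bucket.
            b = min(int(p * n_buckets), n_buckets - 1)
            buckets[b].append((p, outcome))
        rows = []
        for b in sorted(buckets):
            items = buckets[b]
            mean_conf = sum(p for p, _ in items) / len(items)
            hit_rate = sum(1 for _, o in items if o) / len(items)
            rows.append((mean_conf, hit_rate, len(items)))
        return rows

    # Hypothetical judged predictions: (stated confidence, did it come true?)
    judged = [(0.9, True), (0.9, True), (0.9, False), (0.6, True), (0.6, False)]
    for conf, freq, n in calibration_table(judged):
        print(f"stated {conf:.0%} -> observed {freq:.0%} (n={n})")

A well-calibrated predictor's observed frequencies track their stated confidences across buckets; overconfidence shows up as hit rates consistently below stated confidence. Splitting the pairs by topic before running this would be one way to look for the domain-related correlations discussed above.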

Comment author: Jonathan_Graehl, 18 August 2011 06:02:25AM, 0 points

P.S. That's a lot of predictions. :)

Comment author: lessdazed, 18 August 2011 07:10:30AM, 0 points

How many would you have thought gwern had?

Comment author: Jonathan_Graehl, 18 August 2011 07:23:32AM, 1 point

I found this question puzzling and difficult to answer (I'm sleep-deprived). Funny joke, if you were sneakily trying to get me to make a prediction.

Unfortunately, I'm pretty well anchored now.

I'd expect LW-haunters who decide to make predictions on PB.com to make 15 on the first day and 10 more over the following year (with a mode of 0).