Perhaps it would be beneficial to make a game for probability calibration, in which players are asked questions and give answers along with their probability estimate of being correct. The number of points gained or lost would be a function of the player's probability estimate, chosen so that players maximize their score by reporting an unbiased confidence estimate (i.e. they are correct p proportion of the time when they say they are correct with probability p). I don't know of such a function offhand, but functions like this are used in machine learning, so one should be easy enough to find. This might already exist, but if not, it could be something CFAR could use.
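One family of functions with this property is the proper scoring rules; the logarithmic rule is a standard example. A minimal sketch (function names are my own, not from any existing game) showing that honest reporting maximizes the expected score:

```python
import math

def log_score(p, correct):
    """Logarithmic scoring rule: award log(p) points if the answer was
    correct, log(1 - p) if it was wrong.  This rule is strictly proper:
    the expected score is maximized only by reporting one's true
    probability of being correct."""
    return math.log(p) if correct else math.log(1 - p)

def expected_score(true_p, reported_p):
    """Expected score when the player's real chance of being correct is
    true_p but they report reported_p as their confidence."""
    return (true_p * log_score(reported_p, True)
            + (1 - true_p) * log_score(reported_p, False))

# A player who is actually right 70% of the time scores best,
# in expectation, by reporting exactly 0.7:
honest = expected_score(0.7, 0.7)
overconfident = expected_score(0.7, 0.95)
underconfident = expected_score(0.7, 0.5)
print(honest > overconfident and honest > underconfident)  # True
```

Scores here are negative (log of a probability), so a game would likely add a constant offset or rescale; the incentive structure is unchanged by that.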
Hey, we can deconstruct Doyle's Sherlock Holmes stories, assigning probabilities to every single inference and offering alternative explanations. Or take some other popular fiction. That might also help people who, like me, struggle with counterfactuals.
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.