Assorted projects to help people improve their epistemics, and individuals to contact:
- Pastcasting - forecasting past results so you get instant feedback
- Calibration - see how justified your own confidence is: https://www.quantifiedintuitions.org/calibration
- Historical Base rates - much thinking requires knowing roughly how often things have happened in the past
  - Seems like Our World In Data might do some work here. There is at least one other effort.
- Increasing summarisation - Prizes for summarization (by Nonlinear), making it easier to get a basic understanding of EA/LessWrong topics
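The calibration idea above comes down to a scoring rule plus a bucketed comparison of stated confidence against reality. A minimal Python sketch of that (function names are hypothetical, not taken from the Quantified Intuitions app):

```python
# Sketch of calibration scoring, assuming forecasts are probabilities
# in [0, 1] and outcomes are 0/1. Illustrative only.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and outcomes.
    Lower is better; always guessing 50% scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

def calibration_table(forecasts, outcomes):
    """Bucket forecasts to the nearest 10% and report how often events
    in each bucket actually happened. A calibrated forecaster's 70%
    bucket should resolve 'yes' about 70% of the time."""
    buckets = {}
    for p, o in zip(forecasts, outcomes):
        key = round(p, 1)  # nearest decile, e.g. 0.87 -> 0.9
        hits, total = buckets.get(key, (0, 0))
        buckets[key] = (hits + o, total + 1)
    return {k: hits / total for k, (hits, total) in sorted(buckets.items())}

forecasts = [0.9, 0.8, 0.7, 0.3, 0.2, 0.6]
outcomes  = [1,   1,   0,   0,   1,   1]
print(brier_score(forecasts, outcomes))
print(calibration_table(forecasts, outcomes))
```

This is also the mechanism that makes pastcasting useful: because the questions already resolved, the score and the calibration table are available immediately instead of months later.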
Potential projects:
- Displaying estimates - the Squiggle team seem focused on the first step of this. Nathan Young is interested in it.
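Squiggle is its own DSL, so this is not its API, but the kind of estimate display it enables can be sketched in plain Python: represent an estimate as a distribution rather than a point value, sample it, and show a 90% interval (all names and numbers here are illustrative assumptions).

```python
import math
import random

random.seed(0)

def lognormal_estimate(low, high, n=100_000):
    """Treat (low, high) as the 90% interval of a lognormal -- roughly
    what writing '2 to 10' means in Squiggle -- and draw samples."""
    mu = (math.log(low) + math.log(high)) / 2
    sigma = (math.log(high) - math.log(low)) / (2 * 1.645)  # z for 90%
    return [random.lognormvariate(mu, sigma) for _ in range(n)]

def show_interval(samples, label):
    """Display an estimate as its empirical 90% interval."""
    s = sorted(samples)
    lo, hi = s[int(0.05 * len(s))], s[int(0.95 * len(s))]
    print(f"{label}: 90% interval [{lo:.1f}, {hi:.1f}]")

hours = lognormal_estimate(2, 10)  # "2 to 10" hours
show_interval(hours, "time to write a post (hours)")
```

The design point is that the displayed object carries the uncertainty with it, instead of collapsing it to a single number the reader must mentally re-widen.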
Note:
It is hard to build epistemic infrastructure among rationalists, because anyone capable of doing it can also work on AI safety, and most do.