I want to know what good rationality exercises are.
I was just on a call with Liron and PhilH, hanging out after the weekly LessWrong weekend event, and we discussed exercises that could happen on LessWrong.
Here is the list we generated:
- Thinking Physics
- Fermi Estimates
- Project Euler
- Calibration Training
- Basic probabilistic reasoning
- Basic have-you-read-the-sequences knowledge test (e.g. "Which of the following is an example of 'belief as attire'?")
Another user on the call (whose name I forget) suggested it could be fun to have a daily Fermi Estimate on LessWrong, where everyone submits their number and the model they used to reach it. I think this would be quite exciting.
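For concreteness, here is a minimal sketch of what one such daily submission might look like, using the classic "How many piano tuners are in Chicago?" prompt; the factors and numbers below are illustrative guesses, not a claim about how the feature would actually work:

```python
# Illustrative Fermi model: "How many piano tuners are in Chicago?"
# Every factor is a rough order-of-magnitude guess, not a researched figure.

population = 3_000_000          # people in Chicago (rough)
people_per_household = 2        # average household size
households_with_piano = 1 / 20  # fraction of households owning a piano
tunings_per_piano_per_year = 1  # each piano tuned about once a year
tunings_per_tuner_per_year = 2 * 5 * 50  # 2 tunings/day, 5 days/week, 50 weeks/year

pianos = population / people_per_household * households_with_piano
tunings_demanded = pianos * tunings_per_piano_per_year   # tunings needed per year
tuners = tunings_demanded / tunings_per_tuner_per_year   # tuners needed to meet demand

print(f"Estimated pianos: {pianos:,.0f}")        # ~75,000
print(f"Estimated piano tuners: {tuners:,.0f}")  # ~150
```

A daily post could pair a prompt like this with a spoiler-tagged reference answer, so people can compare not just their final numbers but the structure of their models.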
Please write answers with other exercises that you think are or might be great for rationality training, some explanation of why you think they could be good, and a suggestion of how they could be incorporated into LessWrong. I'll probably add some of the above myself.
When I first read the Sequences, I thought "What do I know and how do I think I know it?" was pretty banal and useless -- didn't everyone know that? Philosophy 101: question your beliefs, look for hidden assumptions, etc.
The older I get, the more I come to think that no, not everyone knows this, and even the people who do know it don't practice it enough. I'm not sure, though.
I think of "What do I know and how do I think I know it?" as the "root cause" of essentially all other epistemic rationality -- i.e., if you're sufficiently good at that one skill, all the others will follow naturally from it. Conversely, that suggests it's very difficult to get truly good at it: if I'm missing any other epistemic rationality skill, it means I'm not yet good enough at "What do I know and how do I think I know it?".
I'd say the "obvious" version of the skill involves activities which look like questioning beliefs, looking for hidden assumptions...