I don't know of a full guide, but here's a sequence exploring applications for several CFAR techniques: https://www.lesswrong.com/sequences/qRxTKm7DAftSuTGvj
At risk of stating the obvious, have you considered attending a CFAR workshop in person?
I found them to be a really great experience, and now that they have started organizing events in Europe they are more accessible than ever!
Check out their page.
https://www.lesswrong.com/posts/pjGGqmtqf8vChJ9BR/unofficial-canon-on-applied-rationality
I have a few dojos published on my own site like this one about Zen koans: http://bearlamp.com.au/zen-koans/
We can identify places where we know (inductively) we tend to be led astray, and even identify tricks that help us avoid common fallacies which often afflict humans. However, it's not at all clear that this actually makes us more rational in any sense.
If you mean act-rationality, we'd have to study whether this was a good life choice. If you mean belief-rationality, you'd have to specify some measure or notion of importance to decide when it really matters that you believed the true thing. After all, if the goal is just maximizing the number of times you believe the truth, the best way to be rational is to memorize giant tables of dice rolls. If it's minimizing false beliefs, you might want to avoid forming any beliefs at all. Even if you find some more appropriate function to maximize, some beliefs obviously should count more than others: you don't want to spend your time memorizing lists of dice rolls and forget the fact that you'll be killed by buses if you walk into the street.
But once you realize this point, then who knows. It could be the most rational thing in the world to have a totally dogmatic, evidence-unresponsive belief in the existence of some beardy dude in the sky, because it's the belief that matters most, and the rule "always believe in God Beard" would thus maximize getting important beliefs right.
I know what you mean. You mean something like avoiding the kinds of fallacies that people who always talk about fallacies care about avoiding. But why should those be the most important fallacies to combat?
Hi everyone, I'm new to the community, and am currently working my way through the sequences — yes, all of them.
In the introduction to the first book of Rationality A-Z, Eliezer says:
Just wanted to ask if CFAR has got any of those reorganised materials up, and if they're linked to from anywhere on this site? Any links to other rationality-as-practice blog posts or books or sequences would also be incredibly appreciated!