Ishaan comments on Three ways CFAR has changed my view of rationality - Less Wrong
Instrumental and epistemic rationality were always kind of handwavey, IMO. For example, if you want to achieve your goals, it often helps to have money. So if I deposit $10,000 in your bank account, does that make you more instrumentally rational?
You could define instrumental rationality as "mental skills that help people better achieve their goals". Then I could argue that learning graphic design makes you more instrumentally rational, because it's a mental skill and if you learn it, you'll be able to make money from anywhere using your computer, which is often useful for achieving your goals.
You could define epistemic rationality as "mental skills that help you know what's true". Then I could argue that learning about chess makes you more epistemically rational, because you can better know the truth of statements about who's going to win chess games that are in progress.
I like the idea of thinking of rationality in terms of mental skills that are very general in the sense that they can be used by many different people in many different situations, kind of like how Paul Graham defines "philosophy". "Mental skills that are useful to many people in many situations" seems like it should have received more study as a topic by now... I guess maybe people have developed memetic antibodies towards anything that sounds too good to be true in that way? (In this case, the relevant antibodies would have been developed thanks to the self-help industry?)
I don't think they are handwavey. I maintain that they are extremely well-defined terms, at least when speaking of idealized agents. Here are some counterpoints:
No. Instrumental rationality is about choosing the optimal action, not about having nice things happen to you. Take away the element of choice, and there is no instrumental rationality. I would have to cause you to deposit the money in my account for it to count as instrumental rationality.
No, because "learning about chess" is an action. Choosing where to look for evidence is an action. You'd be instrumentally (ir)rational to (not) seek out information about chess, depending on goals and circumstance.
Epistemic rationality is what you do with evidence after acquiring it, not the process of acquiring evidence. It describes your effectiveness at learning the rules of chess given that you have the relevant information. It doesn't describe your choice to go out and acquire chess-learning information. If you were strapped to a chair and made to watch chess (or casually observed it) and failed to make rational guesses about the underlying rules, then you failed at epistemic rationality.
Same for learning about Bayes' rule.
Learning about Bayes' rule improves one's epistemic rationality; I'm arguing that learning about chess does the same.
I guess this is the point where humans and theoretical rational agents diverge. Rational agents don't learn rationality; it's simply assumed that they come pre-wired with all the correct mathematics and philosophy required to make optimal choices in all possible games.
But on the human side, I still don't think that's really a valid comparison. Being able to use Bayes' rule improves rationality in the general case. It falls under the heading of "philosophy, epistemology, mathematics".
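To make the "general case" point concrete, here is a minimal sketch of a single Bayesian update; the function name and the probabilities are invented for illustration, not taken from the discussion above. The same update rule applies to any hypothesis and any evidence, which is exactly why it belongs under "philosophy, epistemology, mathematics" rather than knowledge of one specific system:

```python
# Minimal illustration of Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers are arbitrary, chosen only for the example.

def bayes_update(prior, likelihood, false_positive_rate):
    """Posterior probability of hypothesis H after observing evidence E."""
    # P(E) by the law of total probability over H and not-H.
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# A weak prior (1%) combined with fairly diagnostic evidence:
posterior = bayes_update(prior=0.01, likelihood=0.9, false_positive_rate=0.05)
print(round(posterior, 3))  # 0.154
```

Note that nothing in the function mentions chess, medicine, or any other domain: the generality lives in the rule itself, while domain knowledge only supplies the particular prior and likelihoods.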
Chess just gives you knowledge about a specific system. It falls under the heading of "science, inference, evidence".
There's a qualitative difference between the realm of philosophy and mathematics and the realm of reality and observation.