Epistemic and Instrumental Tradeoffs
Related: What Do We Mean By "Rationality?"
Epistemic rationality and instrumental rationality are both useful. However, some things may benefit one form of rationality yet detract from another. These tradeoffs are often not obvious, but can have serious consequences.
For instance, take the example of learning debate skills. While involved in debate in high school, I learned how to argue a position quite convincingly, muster strong supporting evidence, prepare rebuttals for counterarguments, prepare deflections for counterarguments that are difficult to rebut, and so on.
I also learned how to do so regardless of what side of a topic I was assigned to.
My debate experience has made me a more convincing and more charismatic person, improved my public speaking skills, and bolstered my ability to win arguments. Instrumentally speaking, this can be a very useful skillset. Epistemically speaking, this sort of preparation is very dangerous, and I later had to unlearn many of these thought patterns in order to become better at finding the truth.
For example, when writing research papers, the type of motivated cognition used when searching for evidence to bolster a position in a debate is often counterproductive. Similarly, when discussing the best move for my business to make, the ability to argue convincingly for a position regardless of whether it is right is outright dangerous, and lessons learned from debate may actually decrease the odds of making the correct decision-- if I'm wrong but convincing and my colleagues are right but unconvincing, we could very well end up going down the wrong path!
Epistemic and instrumental goals may also conflict in other ways. For instance, Kelly (2003)[1] points out that, from an epistemic rationality perspective, learning movie spoilers is desirable, since they will improve your model of the world. Nevertheless, many people consider spoilers to be instrumentally negative, since they prefer the tension of not knowing what will happen while they watch a movie.
Bostrom (2011)[2] describes many more situations where having a more accurate model of the world can be hazardous to various instrumental objectives. For instance, knowing where the best parties are held on campus can be a very useful piece of knowledge to have in many contexts, but can become a distracting temptation when you're writing your thesis. Knowing that one of your best friends has just died can be very relevant to your model of the world, but can also cause you to become dangerously depressed. Knowing that Stalin's wife didn't die from appendicitis can be useful for understanding certain motivations, but can be extraordinarily dangerous to know if the secret police come calling.
Thus, epistemic and instrumental rationality can in some cases come into conflict. Some instrumental skillsets might be better off neglected for reasons of epistemic hygiene; similarly, some epistemic ventures might yield information that it would be instrumentally better not to know. When developing rationality practices and honing one's skills, we should take care to acknowledge these tradeoffs and plan accordingly.
[1] Kelly, T. (2003). Epistemic Rationality as Instrumental Rationality: A Critique. Philosophy and Phenomenological Research, 66(3), pp. 612-640.
[2] Bostrom, N. (2011). Information Hazards: A Typology of Harms from Knowledge. Review of Contemporary Philosophy, 10, pp. 44-79.
Don't Get Offended
Related to: Politics is the Mind-Killer, Keep Your Identity Small
Followed By: How to Not Get Offended
One oft-underestimated threat to epistemic rationality is getting offended. While getting offended by something sometimes feels good and can help you assert moral superiority, in most cases it doesn't help you figure out what the world looks like. In fact, it usually makes figuring out what the world looks like harder, since it means you won't be evaluating evidence very well. In Politics is the Mind-Killer, Eliezer writes that "people who would be level-headed about evenhandedly weighing all sides of an issue in their professional life as scientists, can suddenly turn into slogan-chanting zombies when there's a Blue or Green position on an issue." Don't let yourself become one of those zombies-- all of your skills, training, and useful habits can be shut down when your brain kicks into offended mode!
One might point out that getting offended is a two-way street and that it might be more appropriate to make a post called "Don't Be Offensive." That feels like a just thing to say-- as if you are targeting the aggressor rather than the victim. And on a certain level, it's true-- you shouldn't try to offend people, and if you do in the course of a normal conversation it's probably your fault. But you can't always rely on those around you to avoid doing this. After all, what's offensive to one person may not be so to another, and they may end up offending you by mistake. And even in those unpleasant cases when you are interacting with people who are deliberately trying to offend you, isn't staying calm desirable anyway?
The other problem I have with the concept of being offended as victimization is that, when you find yourself getting offended, you may be a victim, but you're being victimized by yourself. Again, that's not to say that offending people on purpose is acceptable-- it obviously isn't. But you're the one who gets to decide whether or not to be offended by something. If you find yourself getting offended by things as an automatic reaction, you should seriously evaluate why that is your response.
There is nothing inherent in a set of words that makes them offensive or inoffensive-- your reaction is an internal, personal process. I've seen some people stay cool in the face of others literally screaming racial slurs in their faces and I've seen other people get offended by the slightest implication or slip of the tongue. What type of reaction you have is largely up to you, and if you don't like your current reactions you can train better ones-- this is a core principle of the extremely useful philosophy known as Stoicism.
Of course, one (perhaps Robin Hanson) might also point out that getting offended can be socially useful. While true-- quickly responding in an offended fashion can be a strong signal of your commitment to group identity and values[1]-- that doesn't really relate to what this post is talking about. This post is talking about the best way to acquire correct beliefs, not the best way to manipulate people. And while getting offended can be a very effective way to manipulate people-- and hence a tactic that is unfortunately often reinforced-- it is usually actively detrimental to acquiring correct beliefs. Besides, the signalling value of offense is no excuse for never learning how not to be offended. After all, if you find it socially necessary to pretend that you are offended, doing so is not exactly difficult.
Personally, I have found that the cognitive effort required to build a habit of not getting offended pays immense dividends. Getting offended tends to shut down other mental processes and constrain you in ways that are often undesirable. In many situations, misunderstandings and arguments can be diminished or avoided completely if one is unwilling to become offended and practiced in the art of avoiding offense. Further, some of those situations are ones in which thinking clearly is very important indeed! All in all, while getting offended does often feel good (in a certain crude way), it is a reaction that I have no regrets about relinquishing.
[1] In Keep Your Identity Small, Paul Graham rightly points out that one way to prevent yourself from getting offended is to let as few things into your identity as possible.