Epistemic status: Relatively certain, given that the technique is simple and should align well with existing rationality practices. What's presented here is just a reframing of existing frameworks.

Here's a proposition: Most of epistemic rationality consists of the liberal application of trigger phrases like "not necessarily", followed by an exploration of the implications.

The majority of logical fallacies and cognitive biases boil down to our brains tricking us into being overly certain of things. Learning about these errors is a vital foundation of rationality and can't be bypassed. But that knowledge remains dormant until it's activated by noticing certainty in the environment, calling it out, and sifting through your mental library for applicable techniques.

It's this second step - activating your sense of skepticism - that can trip a lot of people up. One way to make it easier is to get in the habit of using trigger phrases whenever possible, since they're naturally associated with critically evaluating what has been said.

"Not necessarily" is my favorite, since it's applicable to so many situations, but there are others. "It depends" works with some questions, especially if they imply choosing between two options. "Are we sure?" invites us to evaluate the validity of evidence, and "So what?" examines the connection between evidence and a conclusion it may or may not support.

This is all well and good, but let's look at some examples. I've organized them by the applicable bias or fallacy.

  • False dilemma - "You're either with us or against us." Not necessarily. I could take actions that help both sides, or find a way to abstain from the conflict completely.
  • Post hoc ergo propter hoc - "I went out in the rain and got sick, so being in the rain must lead to illness." Not necessarily. It could very easily be a coincidence, especially since we've been out in the rain many times without getting sick.
  • Anecdotal evidence - "Steve Jobs and several other famous people dropped out of college." So what? Those people had resources and lots of random chance on their side. Research suggests that most dropouts don't fare as well.
  • Confirmation bias - "A small study came out that suggests that vaccines cause autism. I knew I was right." Are we sure? That study had a very small sample size, and there is a vast body of research that supports the opposite conclusion.
  • Exception fallacy - "Women must all be emotionally weak; my girlfriend cries all the time." So what? That is a sample size of one. We will need to look at additional research to see if that's really the case.
  • Improper appeal to authority - "My father says that (insert ethnicity here) is to blame for all the world's problems, so it must be true." Not necessarily. Your father isn't an authority on the subject, so his opinions shouldn't hold any special weight. We can examine more objective evidence and find that his theories don't pan out.

Be aware that sometimes, the original claim will be correct, even after applying skepticism. "Everyone says the world is round, so it must be." Not necessarily. There are many instances where the majority believes something incorrect; we can't fall prey to the bandwagon fallacy. After looking into the evidence, however, it becomes clear that the world is round after all. Stepping into your skeptic's shoes is a good exercise, but it doesn't always lead to a change in your beliefs.

What are your thoughts? Do you have any trigger phrases you would add to the list?


I like this line of reasoning and will reflect more on what I'll dub 'reflection triggers.'

Blimey, I thought it was a bug of mine.

(I kinda think it's a bug still. "Not necessarily" means nothing more than "not necessarily"; you can't use it as a "no". And usually I want to use it as a "no", to support my own point of view in some discussion. So - handy, but requires caution.)