Intuition is really good at making fairly accurate predictions without complete information, enabling us to navigate the world without having a deep understanding of it. As a result, intuition trains us to experience the feeling we understand something without examining every detail. In most situations, paying close attention to detail is unnecessary and sometimes dangerous. When learning a technical concept, every detail matters and the premature feeling of understanding stops us from examining them.
I've built a trap for myself to help mitigate this tendency:
As soon as I think I understand something, I try it.
I.e., if I'm reading a book about circuit diagrams, the moment my intuition clicks in my head and says "aha! This is how a NAND gate works!", I immediately tell that part of my brain "okay, if you're so damn smart, build one." If I'm studying linear algebra, the moment the intuition clicks in my head and says "aha! That's how an affine transformation works!", I immediately tell that part of my brain "great! let's skip to the problems section and try to answer the first 20."
Occasionally, it turns out that my intuition appears correct...
In essence, yes; but the intended effect is more psychological.
A thing I have noticed about myself is that once the intuitive "aha!" circuit activates, I simply cannot continue paying attention to details. My brain wants to gloss over any remaining information, saying "yeah yeah I GET it already!"
Jumping straight into the action satisfies my intuition's need for novelty and immediate feedback.
What's more, when it turns out my intuition was wrong, I feel genuine surprise - which snaps me back into a state where I'm ready to pay attention to details again!
So for me, it's less about "doing science" than it is about providing my brain with the right "flow" to keep me motivated towards the goal of actually understanding a phenomenon.
This post makes a lot of claims that are factual in nature. Many of them seem to make sense, but that doesn't mean they are true. In fact, some of them may be false; I recall seeing research showing that intuitive thinkers performed better at math/logic problems when they were framed as word problems involving social settings, e.g., how much soda to buy for a party or who sits next to whom. Regardless of this specific claim, the general point is that an article full of factual claims should have citations. If citations are too much trouble, the writer should provide some evidence of expertise to give us a reason to believe factual claims without citations.
Frameworks and claims that make intuitive sense but are not linked to research are risky from an epistemic hygiene perspective. So I felt I had to downvote this post despite it being well written and reasonable sounding.
I like the main essay, but have one quibble:
You can find resources with some light googling
If it's trivial to find these resources, could you not include them in the OP?
Reasons why you should:
1.) Spending the ten minutes it takes to do this will save dozens of readers a collective several hours of doing the same.
2.) The resources will receive wider use because the trivial inconvenience is gone.
3.) It's more difficult for people who are not you to know exactly what you're talking about or what you're Googling for. Googling "framework for understanding people" is not very useful.
Interesting article, but do you have any empirical evidence that people's thinking styles can be divided so neatly into intuitive vs. logical?
On its face, you seem to be taking this thinking style distinction for granted.
Reflecting on this some more, is an intuitive thinker synonymous with one who primarily uses System 1 style thinking and a logical thinker synonymous with one who primarily uses System 2 style thinking? If so, it'd clarify things quite a bit (for me at least) if you made that clear in your post.
Intuition thinkers are multi-channel learners -- all senses, thoughts and emotions are used to construct a complex database of clustered knowledge to predict and understand the world.
Citation Needed.
At a first read, several comments:
Your description of the biases of intuition-dominant thinkers neatly crystallizes several massive failure modes I've seen people fall into many, many times -- failure modes that have always made me incredibly frustrated and almost irrationally angry. (I am certainly what you would call a logic-dominant thinker.) I've mostly perceived such people's failures as just "oh god, this person is being horrifyingly stupid"; sometimes I've had an inkling of what precisely caused the stupidity ("not recognizing precise language" is a conclusion I've almost reached myself). Your explanation may enable me to better communicate with intuition-dominant thinkers in the future; thank you.
I am not convinced that it's easy, or even really possible, to change from one thinking style to the other. Everything else I've read suggests this sort of cognitive leaning is largely innate. Do you have anything other than your own experience to suggest otherwise?
I am having some difficulty understanding the "Ignoring your emotions" section, much less seeing the use of "fixing" this "failing". (I may expand on this later, when I've had some sleep and reread it a couple of times.)
Firstly, this post is awesome.
Secondly though, this post brushes on the topic of intuition as a useful tool, something I think far too many Logic-Based types throw out without considering its practicality. It's better not to think of it as a substitute for logical thinking, but rather as a quick and dirty backup, for when you don't have all the information.
Intuition can act within two seconds, operates almost completely below conscious awareness, and begins affecting your body immediately. Here are some excerpts from Blink, a book by Malcolm Gladwell, in which he explores how intuition works, what abilities and drawbacks it has, and what biases can affect its overall usefulness.
In front of you are four decks of cards, two of them red and the other two blue. Each card in those four decks either wins you a sum of money or costs you some money, and your job is to turn over cards from any of the decks, one at a time, in such a way that maximizes your winnings.*
Ah, a perfect opportunity to be a Logical Thinker, using careful observation and reasoning to find the ideal pattern. What path does intuition take though?
...What you don't know at the beginning, however, is that
Can I ask that the title be changed to "Biases of Intuitive and Logical Thinkers"? I almost didn't read this due to the very generic title.
Sorry, this essay doesn't make sense to me. I don't understand the framework underlying it, the context in which it lives. It just looks to me like a mish-mash of generic life advice along the lines of pay attention! ("overlooking crucial details") or listen to yourself! ("ignoring their emotions").
This is quite related to ignoring information that doesn't fit into a framework, but another common logical bias is forcing information into your framework when it doesn't fit.
My most obvious personal encounter with this was in my high school English classes, where my teacher frequently criticized me for having an "overbearing" interpretation of the text. For a perhaps more relatable example, I've known people who have just learned about status to interpret absolutely every behavior in the context of status, even when that doesn't quite fit.
I'm heavily intuition-dominant, in that I tend to minimize the use of "System 2" thinking whenever possible and make decisions based primarily on emotion. Some more patterns I've noticed:
Strategy. System 2/Logic-dominant thinking is much better for planning things out, especially when you're working with a novel situation. If you use System 1 when playing a game such as Settlers of Catan for the first time, you'll have a very low chance of winning. If you use System 2, you'll generally perform better (at least on the first play).
Decision speed
The confused commenter clearly comprehended the argument to some degree, but a few possible details were overlooked.
I think your example is bad. It's the first commenter who is confused, not the second one.
The correct formulation is "If, for any small positive number (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal."
The first commenter screwed things up by saying "If, for any small positive number you give me (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal." The second commenter's objection is valid.
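To make the quantifier structure explicit (standard analysis, written out in LaTeX for clarity):

```latex
% If the difference is smaller than every positive epsilon, the numbers are equal:
\[
\bigl(\forall \varepsilon > 0 : \; |A - B| < \varepsilon\bigr)
\;\Longrightarrow\; A = B
\]
% Sketch: if A \neq B, take \varepsilon = |A - B| / 2 > 0; then
% |A - B| < |A - B| / 2 is a contradiction, so A = B must hold.
```

The "you give me" version only establishes the bound for whichever numbers are actually offered, which is a weaker claim.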
I'm not sure if your first example "Ignoring information they cannot immediately fit into a framework" includes "sticking to an elegant, logical framework and considering cases where this does not occur to be exceptions or aberrations even when they are very common".
That's something you see quite a lot with some otherwise quite rational people: the 'if my system can't explain it, the world's wrong' attitude. As illustrated here: http://xkcd.com/1112/
Dredging this from a deeply buried comment:
If this [the detailed planning and execution that goes into beating a raid boss in WoW] is an intuition-based approach, then I don't know what "intuition" means.
Come to think of it, I don't know what "intuition" means. Is it anything but a label stuck on processes inaccessible to consciousness that produce thoughts? Like "free will" is a label stuck on processes inaccessible to consciousness that produce decisions?
I mostly agree with this article but object to your use of the word "intuition". What you're calling "intuition" is closer to "feeling" in the Myers-Briggs sense than to intuition (in either the MB or the common sense of that word).
In particular, logical people can also be very intuitive; they just have intuitions about, say, the distribution of primes, rather than about other people.
I'm not sure the first example is really an error on the part of the commenter, unless there was an implicit shared technical usage at play. The word 'any' in the quote you give below is not very clear. I knew what it meant, but only because I understood what the argument was getting at.
"If, for any small positive number you give me (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal."
In this case, 'any' means 'if whatever number is given, the following analysis applies, the conclusi...
I believe it's more of a spectrum.
That said, I think people should drop the notion that humans are rational. We're boundedly rational, and this is balanced with logical reasoning.
It's often said in pop culture/society that being rational is somehow "better" than being emotional. I used to believe this long ago, but now I think that's bull. Emotions exist for a perfectly valid purpose, as a guide to our environment and how to interact with and control it. The fact is many humans make decisions or process information on solely emotive rather...
Any intuition-dominant thinker who's struggled with math problems, or logic-dominant thinker who's struggled with small talk, knows how difficult and hopeless the experience can feel. For a long time I was an intuition thinker; then I developed a logical thinking style, and soon it ended up dominating -- granting me the luxury of experiencing both kinds of struggles. I eventually learned to apply the thinking style better optimized for the problem I was facing. Looking back, I realized why I kept sticking to one extreme.
I hypothesize that one-sided thinkers develop biases and tendencies that prevent them from improving their weaker mode of thinking. These biases cause a positive feedback loop that further skews thinking styles in the same direction.
The reasons why one style might be overdeveloped and the other underdeveloped vary greatly. Genes have a strong influence, but environment also plays a large part. A teacher may have inspired you to love learning science at a young age, causing you to foster a thinking style better for learning science. Or maybe you grew up very physically attractive and found socializing with your peers a lot more rewarding than studying after school, causing you to foster a thinking style better for navigating social situations. Environment can be changed to help develop certain thinking styles, but that should be supplementary to exposing and understanding the biases you already have. Entering an environment that penalizes your thinking style can be uncomfortable, stressful and frustrating if you're unprepared. (Such a painful experience is part of why these biases cause a positive feedback loop: they make us avoid environments that require the opposite thinking style.)
Despite genetic predisposition and environmental circumstances, there's room for improvement, and exposing these biases and learning to account for them is a great first step.
Below is a list of a few biases that worsen our ability to solve a certain class of problems and keep us from improving our underdeveloped thinking style.
Intuition-dominant Biases
Overlooking crucial details
Details matter when learning technical concepts. Overlooking a single word or a sentence's structure can cause complete misunderstanding -- a common blunder for intuition thinkers.
Intuition is really good at making fairly accurate predictions without complete information, enabling us to navigate the world without having a deep understanding of it. As a result, intuition trains us to experience the feeling we understand something without examining every detail. In most situations, paying close attention to detail is unnecessary and sometimes dangerous. When learning a technical concept, every detail matters and the premature feeling of understanding stops us from examining them.
This bias is one that's more likely to go away once you realize it's there. You often don't know what details you're missing after you've missed them, so merely remembering that you tend to miss important details should prompt you to take closer examinations in the future.
Expecting solutions to sound a certain way
The Internship has a great example of this bias (and a few others) in action. The movie is about two middle-aged unemployed salesmen (intuition thinkers) trying to land an internship with Google. Part of Google's selection process has the two men participate in several technical challenges. One challenge required the men and their team to find a software bug. In a flash of insight, Vince Vaughn's character, Billy, shouts "Maybe the answer is in the question! Maybe it has something to do with the word bug. A fly!" After enthusiastically making several more word associations, he turns to his team and insists they take him seriously.
Why is it believable to the audience that Billy can be so confident about his answer?
Billy's intuition made an association between the challenge question and riddle-like questions he's heard in the past. When Billy used his intuition to find a solution, his confidence in a riddle-like answer grew. Intuition recklessly uses irrelevant associations as reasons for narrowing down the space of possible solutions to technical problems. When associations pop into your mind, it's a good idea to legitimize them with supporting reasons.
Not recognizing precise language
Intuition thinkers are multi-channel learners -- all senses, thoughts and emotions are used to construct a complex database of clustered knowledge to predict and understand the world. With robust information-extracting ability, correct grammar/word-usage is, more often than not, unnecessary for meaningful communication.
Communicating technical concepts in a meaningful way requires precise language. Connotation and subtext are stripped away so words and phrases can purely represent meaningful concepts inside a logical framework. Intuition thinkers communicate with imprecise language, gathering meaning from context to compensate. This makes it hard for them to recognize when to turn off their powerful information extractors.
This bias explains part of why so many intuition thinkers dread math "word problems". Introducing words and phrases rich with meaning and connotation sends their intuition running wild. It's hard for them to find correspondences between words in the problem and variables in the theorems and formulas they've learned.
The noise intuition brings makes it hard to think clearly. It's hard for intuition thinkers to tell whether their automatic associations should be taken seriously, and without a reliable way to tell, wrong interpretations of words go undetected. For example, without any physics background, an intuition thinker may read the statement "Matter can have both wave and particle properties at once" and believe they completely understand it. Unrelated associations with what "matter", "wave" and "particle" mean blindly take precedence over technical definitions.
The slightest uncertainty about what a sentence means should raise a red flag. Going back and finding correspondence between each word and how it fits into a technical framework will eliminate any uncertainty.
Believing their level of understanding is deeper than it is
Intuition works on an unconscious level, making intuition thinkers unaware of how they know what they know. Not surprisingly, their best tool to learn what it means to understand is intuition. The concept "understanding" is a collection of associations from experience. You may have learned that part of understanding something means being able to answer questions on a test with memorized factoids, or knowing what to say to convince people you understand, or just knowing more facts than your friends. These are not good methods for gaining a deep understanding of technical concepts.
When intuition thinkers optimize for understanding, they're really optimizing for a fuzzy idea of what they think understanding means. This often leaves them believing they understand a concept when all they've done is memorize some disconnected facts. Not knowing what it feels like to have deeper understanding, they become conditioned to always expect some amount of surprise. Their feeling of maximal understanding comes with less confidence than a logical thinker's does. This lower confidence disincentivizes intuition thinkers from investing in learning technical concepts, further keeping their logical thinking style underdeveloped.
One way I overcame this tendency was to constantly ask myself "why" questions, like a curious child bothering their parents. The technique helped me uncover what used to be unknown unknowns that made me feel overconfident in my understanding.
Logic-dominant Biases
Ignoring information they cannot immediately fit into a framework
Logical thinkers have and use intuition -- the problem is they don't feed it enough. They tend to ignore valuable intuition-building information if it doesn't immediately fit into a predictive model they deeply understand. While intuition thinkers don't filter out enough noise, logical thinkers filter out too much.
For example, if a logical thinker doesn't have a good framework for understanding human behavior, they're more likely to ignore visual input like body language and fashion, or auditory input like tone of voice and intonation. Human behavior is complicated; no framework to date can make perfectly accurate predictions about it. Intuition can build powerful models despite working with many confounding variables.
Bayesian probability enables logical thinkers to build predictive models from noisy data without having to use intuition. But even then, the first step of making a Bayesian update is data collection.
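For concreteness, here's a minimal sketch of that kind of model-building (a Beta-Bernoulli update in Python; the scenario and all numbers are invented for illustration):

```python
# A minimal sketch of building a predictive model from noisy 0/1 observations
# with a Bayesian update (Beta-Bernoulli). Everything here is illustrative.

def beta_bernoulli_update(successes, failures, prior_a=1.0, prior_b=1.0):
    """Update a Beta(prior_a, prior_b) prior with observed 0/1 data and
    return the posterior mean, i.e. the predicted probability of success."""
    post_a = prior_a + successes
    post_b = prior_b + failures
    return post_a / (post_a + post_b)

# Hypothetical noisy data: did a stranger respond warmly to small talk?
observations = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 1 = warm response, 0 = not
p = beta_bernoulli_update(sum(observations),
                          len(observations) - sum(observations))
print(f"Predicted probability of a warm response: {p:.2f}")
```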
Combating this tendency requires you to pay attention to input you normally ignore. Supplement your broader attentional scope with a researched framework as a guide. Say you want to learn how storytelling works. Start by grabbing resources that teach storytelling and learning the basics. Out in the real world, pay close attention to sights, sounds, and feelings when someone starts telling a story, and try matching sensory input to the storytelling elements you've learned about. Once the basics are picked up subconsciously by habit, your conscious attention will be freed up to make new and more subtle observations.
Ignoring their emotions
Emotional input is difficult to factor in, especially because you're emotional at the time. Logical thinkers are notorious for ignoring this kind of messy data, consequently starving their intuition of emotional data. Being able to "go with your gut feelings" is a major function of intuition that logical thinkers tend to miss out on.
Your gut can predict whether you'll get along long-term with a new SO, or what kind of outfit would give you more confidence in your workplace, or whether learning tennis in your free time will make you happier, or whether you'd prefer a cheeseburger over tacos for lunch. Logical thinkers don't have enough data collected about their emotions to know what triggers them. They tend to get bogged down and misled by the objective yet trivial details they do manage to extract. A weak understanding of their own emotions also leads to a weaker understanding of others' emotions. You can become a better empathizer by better understanding yourself.
You could start from scratch and build your own framework, but self-assessment biases will impede productivity. Learning an existing framework is a more realistic solution. You can find resources with some light googling and I'm sure CFAR teaches some good ones too. You can improve your gut feelings too. One way is making sure you're always consciously aware of the circumstances you're in when experiencing an emotion.
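As a minimal sketch of that habit, you could keep a simple log of each emotion and the circumstances around it (the fields and file path below are just a suggestion, not a researched framework):

```python
# A minimal sketch of logging emotions together with their circumstances,
# so patterns in what triggers them become visible later. The fields and
# file name are illustrative suggestions only.
import csv
from datetime import datetime

LOG_FILE = "emotion_log.csv"  # hypothetical path

def log_emotion(emotion, intensity, circumstances):
    """Append one timestamped entry: what you felt, how strongly (1-10),
    and what was going on at the time."""
    with open(LOG_FILE, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(), emotion, intensity, circumstances])

log_emotion("anxious", 7, "about to present quarterly numbers to the team")
```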
Making rules too strict
Logical thinkers build frameworks in order to understand things. When adding a new rule to a framework, there's motivation to make the rule strict. The stricter the rule, the more predictive power, the better the framework. But when the domain you're trying to understand contains multivariable, chaotic phenomena, strict rules are likely to break. The result is something like the current state of macroeconomics: a bunch of logical thinkers preoccupied with elegant models and theories that hold only under assumptions that make them useless in practice.
Following rules that are too strict can have bad consequences. Imagine John the salesperson is learning how to make better first impressions and has built a rough framework so far. John has a rule that smiling always helps make people feel welcome the first time they meet him. One day he makes a business trip to Russia to meet with a prospective client. The moment he meets his Russian client, he flashes a big smile and continues to smile despite negative reactions. After a few hours of talking, his client reveals she felt he wasn't trustworthy at first and almost called off the meeting. It turns out that in Russia, smiling at strangers is a sign of insincerity. John's strict rule didn't account for cultural differences, blinding him to his client's reaction and keeping him from updating, which put him in a risky situation.
The desire to hold onto strict rules can make logical thinkers susceptible to confirmation bias too. If John made an exception to his smiling rule, he'd feel less confident about his knowledge of making first impressions, subsequently making him feel bad. He may also have to amend some other rule that relates to the smiling rule, which would further hurt his framework and his feelings.
When feeling the urge to add a new rule, take note of the circumstances in which the evidence for the rule was found. Add exceptions that limit the rule's predictive power to similar circumstances. Another option is to entertain multiple conflicting rules simultaneously, shifting weight from one to the other as you gather more evidence, as in the sketch below.
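Here's a minimal sketch of that weight-shifting idea in Python (the rules and all probabilities are invented for illustration):

```python
# A minimal sketch of entertaining two conflicting rules at once and
# shifting weight between them as evidence arrives. The rules and all
# probabilities below are invented for illustration.

# Each rule's predicted probability of a good reaction to smiling
# on a first meeting abroad:
rules = {
    "smiling always helps": 0.9,
    "smiling helps only in some cultures": 0.2,
}
weights = {name: 0.5 for name in rules}  # start undecided between the two

def update(weights, rules, good_reaction):
    """Bayesian-style reweighting: each rule is scored by how well it
    predicted the observed reaction, then weights are renormalized."""
    for name, p in rules.items():
        weights[name] *= p if good_reaction else 1 - p
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

# Hypothetical evidence: three poor reactions in a row on a trip abroad.
for good_reaction in [False, False, False]:
    weights = update(weights, rules, good_reaction)
print(weights)  # most weight has shifted to the culture-sensitive rule
```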
Anyone have more biases/tendencies to add?