Biases of Intuitive and Logical Thinkers
Any intuition-dominant thinker who's struggled with math problems, or logic-dominant thinker who's struggled with small talk, knows how difficult and hopeless the experience feels. For a long time I was an intuition thinker; then I developed a logical thinking style, and soon it ended up dominating -- granting me the luxury of experiencing both kinds of struggles. I eventually learned to apply the thinking style better optimized for the problem I was facing. Looking back, I realized why I kept sticking to one extreme.
I hypothesize that one-sided thinkers develop biases and tendencies that prevent them from improving their weaker mode of thinking. These biases cause a positive feedback loop that further skews thinking styles in the same direction.
The reasons why one style might be overdeveloped and the other underdeveloped vary greatly. Genes have a strong influence, but environment also plays a large part. A teacher may have inspired you to love learning science at a young age, causing you to foster a thinking style better for learning science. Or maybe you grew up very physically attractive and found socializing with your peers a lot more rewarding than studying after school, causing you to foster a thinking style better for navigating social situations. Environment can be changed to help develop certain thinking styles, but it should be supplementary to exposing and understanding the biases you already have. Entering an environment that penalizes your thinking style can be uncomfortable, stressful and frustrating if you're unprepared for it. (Such a painful experience is part of why these biases cause a positive feedback loop, by making us avoid environments that require the opposite thinking style.)
Despite genetic predisposition and environmental circumstances, there's room for improvement, and exposing these biases and learning to account for them is a great first step.
Below is a list of a few biases that worsen our ability to solve a certain class of problems and keep us from improving our underdeveloped thinking style.
Intuition-dominant Biases
Overlooking crucial details
Details matter when learning technical concepts. Overlooking a single word or sentence structure can cause complete misunderstanding -- a common blunder for intuition thinkers.
Intuition is really good at making fairly accurate predictions without complete information, enabling us to navigate the world without having a deep understanding of it. As a result, intuition trains us to experience the feeling of understanding something without examining every detail. In most situations, paying close attention to detail is unnecessary and sometimes dangerous. But when learning a technical concept, every detail matters, and the premature feeling of understanding stops us from examining them.
This bias is more likely to go away once you realize it's there. You often can't tell which details you've missed after the fact, so merely remembering that you tend to miss important details should prompt you to examine things more closely in the future.
Expecting solutions to sound a certain way
The Internship has a great example of this bias (and a few others) in action. The movie is about two middle-aged unemployed salesmen (intuition thinkers) trying to land an internship with Google. Part of Google's selection process has the two men participate in several technical challenges. One challenge required the men and their team to find a software bug. In a flash of insight, Vince Vaughn's character, Billy, shouts "Maybe the answer is in the question! Maybe it has something to do with the word bug. A fly!" After enthusiastically making several more word associations, he turns to his team and insists they take him seriously.
Why is it believable to the audience that Billy can be so confident about his answer?
Billy's intuition made an association between the challenge question and riddle-like questions he's heard in the past. When Billy used his intuition to find a solution, his confidence in a riddle-like answer grew. Intuition recklessly uses irrelevant associations as reasons for narrowing down the space of possible solutions to technical problems. When associations pop into your mind, it's a good idea to legitimize them with supporting reasons before trusting them.
Not recognizing precise language
Intuition thinkers are multi-channel learners -- all senses, thoughts and emotions are used to construct a complex database of clustered knowledge to predict and understand the world. With robust information-extracting ability, correct grammar/word-usage is, more often than not, unnecessary for meaningful communication.
Communicating technical concepts in a meaningful way requires precise language. Connotation and subtext are stripped away so words and phrases can purely represent meaningful concepts inside a logical framework. Intuition thinkers communicate with imprecise language, gathering meaning from context to compensate. This makes it hard for them to recognize when to turn off their powerful information extractors.
This bias explains part of why so many intuition thinkers dread math "word problems". Introducing words and phrases rich with meaning and connotation sends their intuition running wild. It's hard for them to find correspondences between words in the problem and variables in the theorems and formulas they've learned.
The noise intuition brings makes it hard to think clearly. It's hard for intuition thinkers to tell whether their automatic associations should be taken seriously. Without a reliable way to discern, wrong interpretations of words go undetected. For example, without any physics background, an intuition thinker may read the statement "Matter can have both wave and particle properties at once" and believe they completely understand it. Unrelated associations of what matter, wave and particle mean, blindly take precedence over technical definitions.
The slightest uncertainty about what a sentence means should raise a red flag. Going back and finding correspondence between each word and how it fits into a technical framework will eliminate any uncertainty.
Believing their level of understanding is deeper than it is
Intuition works on an unconscious level, making intuition thinkers unaware of how they know what they know. Not surprisingly, their best tool to learn what it means to understand is intuition. The concept "understanding" is a collection of associations from experience. You may have learned that part of understanding something means being able to answer questions on a test with memorized factoids, or knowing what to say to convince people you understand, or just knowing more facts than your friends. These are not good methods for gaining a deep understanding of technical concepts.
When intuition thinkers optimize for understanding, they're really optimizing for a fuzzy idea of what they think understanding means. This often leaves them believing they understand a concept when all they've done is memorize some disconnected facts. Not knowing what it feels like to have deeper understanding, they become conditioned to always expect some amount of surprise. Even at their maximum felt level of understanding, they have less confidence than logical thinkers do at theirs. This lower confidence disincentivizes intuition thinkers from investing in learning technical concepts, further keeping their logical thinking style underdeveloped.
One way I overcame this tendency was to constantly ask myself "why" questions, like a curious child bothering their parents. The technique helped me uncover what used to be unknown unknowns that made me feel overconfident in my understanding.
Logic-dominant Biases
Ignoring information they cannot immediately fit into a framework
Logical thinkers have and use intuition -- the problem is they don't feed it enough. They tend to ignore valuable intuition-building information if it doesn't immediately fit into a predictive model they deeply understand. While intuition thinkers don't filter out enough noise, logical thinkers filter too much.
For example, if a logical thinker doesn't have a good framework for understanding human behavior, they're more likely to ignore visual input like body language and fashion, or auditory input like tone of voice and intonation. Human behavior is complicated; no framework to date can make perfectly accurate predictions about it. Intuition can build powerful models despite working with many confounding variables.
Bayesian probability enables logical thinkers to build predictive models from noisy data without having to use intuition. But even then, the first step of making a Bayesian update is data collection.
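To make the update step concrete, here is a minimal sketch of a single Bayesian update on one noisy observation. The hypothesis, likelihoods, and prior are all invented for illustration; they are not claims about real probabilities.

```python
# A minimal sketch of a Bayesian update on noisy data.
# All numbers here are hypothetical, chosen only for illustration.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * likelihood_if_true
    evidence = numerator + (1 - prior) * likelihood_if_false
    return numerator / evidence

# Hypothesis: "this person is enjoying the conversation."
# Evidence: sustained eye contact -- exactly the kind of noisy input
# a logical thinker might filter out for lack of a clean framework.
prior = 0.5                  # no opinion before the data
p_signal_if_yes = 0.8        # assumed likelihood given the hypothesis
p_signal_if_no = 0.4         # assumed likelihood given its negation

posterior = bayes_update(prior, p_signal_if_yes, p_signal_if_no)
print(round(posterior, 3))   # belief shifts once the data is collected
```

The point of the sketch is the paragraph's last sentence: the formula is useless until you actually collect the observation that feeds it.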
Combating this tendency requires you to pay attention to input you normally ignore. Supplement your broader attentional scope with a researched framework as a guide. Say you want to learn how storytelling works. Start by grabbing resources that teach storytelling and learning the basics. Out in the real world, pay close attention to sights, sounds, and feelings when someone starts telling a story, and try matching that sensory input to the storytelling elements you've learned about. Once the basics are subconsciously picked up by habit, your conscious attention will be freed up to make new and more subtle observations.
Ignoring their emotions
Emotional input is difficult to factor in, especially because you're emotional at the time. Logical thinkers are notorious for ignoring this kind of messy data, consequently starving their intuition of emotional data. Being able to "go with your gut feelings" is a major function of intuition that logical thinkers tend to miss out on.
Your gut can predict if you'll get along long-term with a new SO, or what kind of outfit would give you more confidence in your workplace, or if learning tennis in your free time will make you happier, or whether you prefer eating a cheeseburger over tacos for lunch. Logical thinkers don't have enough data collected about their emotions to know what triggers them. They tend to get bogged down and misled by objective yet trivial details they do manage to factor in. A weak understanding of their own emotions also leads to a weaker understanding of others' emotions. You can become a better empathizer by better understanding yourself.
You could start from scratch and build your own framework, but self-assessment biases will impede productivity. Learning an existing framework is a more realistic solution. You can find resources with some light googling and I'm sure CFAR teaches some good ones too. You can improve your gut feelings too. One way is making sure you're always consciously aware of the circumstances you're in when experiencing an emotion.
Making rules too strict
Logical thinkers build frameworks in order to understand things. When adding a new rule to a framework, there's motivation to make the rule strict. The stricter the rule, the more predictive power, the better the framework. But when the domain you're trying to understand has multivariable chaotic phenomena, strict rules are likely to break. The result is something like the current state of macroeconomics: a bunch of logical thinkers preoccupied with elegant models and theories that stay elegant only by being useless in practice.
Following rules that are too strict can have bad consequences. Imagine John the salesperson is learning how to make better first impressions and has built a rough framework so far. John has a rule that smiling always helps make people feel welcome the first time they meet him. One day he makes a business trip to Russia to meet a prospective client. The moment he meets his Russian client, he flashes a big smile and continues to smile despite negative reactions. After a few hours of talking, his client reveals she felt he wasn't trustworthy at first and almost called off the meeting. It turns out that in Russia, smiling at strangers is a sign of insincerity. John's strict rule didn't account for cultural differences, blinding him to his client's reaction and putting him in a risky situation.
The desire to hold onto strict rules can make logical thinkers susceptible to confirmation bias too. If John made an exception to his smiling rule, he'd feel less confident about his knowledge of making first impressions, subsequently making him feel bad. He may also have to amend some other rule that relates to the smiling rule, which would further hurt his framework and his feelings.
When feeling the urge to add a new rule, take note of the circumstances in which the evidence for the rule was found. Add exceptions that limit the rule's predictive power to similar circumstances. Another option is to entertain multiple conflicting rules simultaneously, shifting weight from one to the other as you gather more evidence.
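The weight-shifting option can be sketched as simple multiplicative-weights bookkeeping. The rules, likelihoods, and observation below are hypothetical, made up to mirror the smiling example; this is the shape of the update, not a real model of first impressions.

```python
# A sketch of holding two conflicting rules at once and shifting
# weight between them as evidence arrives. Numbers are hypothetical.

def shift_weights(weights, likelihoods):
    """Multiply each rule's weight by how well it predicted the
    latest observation, then renormalize so the weights sum to 1."""
    updated = [w * l for w, l in zip(weights, likelihoods)]
    total = sum(updated)
    return [u / total for u in updated]

# Rule A: "smiling always helps first impressions."
# Rule B: "smiling helps, except with strangers in some cultures."
weights = [0.5, 0.5]

# Observation: a smile at a Russian stranger went badly.
# Rule A predicted this poorly; Rule B predicted it well.
weights = shift_weights(weights, [0.1, 0.7])
print([round(w, 2) for w in weights])  # weight shifts toward Rule B
```

Holding both rules with explicit weights makes the update painless: no rule has to be thrown out or amended, so there's less incentive for the confirmation bias described below.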
Learning critical thinking: a personal example
Related to: Is Rationality Teachable
“Critical care nursing isn’t about having critically ill patients,” my preceptor likes to say, “it’s about critical thinking.”
I doubt she's talking about the same kind of critical thinking that philosophers are, and I find that definition abstract anyway. There’s been a lot of talk about critical thinking during our four years of nursing school, but our profs seem to have a hard time defining it. So I’ll go with a definition from Google.
Critical thinking can be seen as having two components: 1) a set of information and belief generating and processing skills, and 2) the habit, based on intellectual commitment, of using those skills to guide behaviour. It is thus to be contrasted with: 1) the mere acquisition and retention of information alone, because it involves a particular way in which information is sought and treated; 2) the mere possession of a set of skills, because it involves the continual use of them; and 3) the mere use of those skills ("as an exercise") without acceptance of their results.1
That’s basically rationality–epistemic, i.e. generating true beliefs, and instrumental, i.e. knowing how to use them to achieve what you want. Maybe part of me expected, implicitly, to have an easier time learning this skill because of my Less Wrong knowledge. And maybe I am more consciously aware of my mistakes, and the cognitive factors that caused them, than most of my classmates. When it’s forty-five minutes past the end of my shift and I’m still charting, I’m also calling myself out on succumbing to the planning fallacy. I once went through the first half hour of a shift during my pediatrics rotation thinking that one of my patients had cerebral palsy, when he actually had cystic fibrosis–all because I misread my prof’s handwriting as ‘CP’ when she’d written ‘CF’. I was totally confused by all the enzyme supplements on his list of meds, but it still took me a while to figure it out–a combination of priming and confirmation bias, taken to the next level.
But, overall, even if I know what I'm doing wrong, it hasn’t been easier to do things right. I have a hard time with the hospital environment, possibly because I’m the kind of person who ended up reading and posting on Less Wrong. My cognitive style leans towards Type 2 reasoning, in Keith Stanovich’s taxonomy–thorough, but slow. I like to understand things, on a deep level. I like knowing why I’m doing something, and I don’t trust my intuitions, the fast-and-dirty product of Type 1 reasoning. But Type 2 reasoning requires a lot of working memory, and humans aren’t known for that, which is the source of most of my frustration and nearly all of my errors–when working memory overload forces me to be a cognitive miser.
Still, for all the frustration, I’m pretty sure I’ve ended up in the perfect environment to learn this skill called ‘critical thinking.’ I’m way out of my depth–which I expected. No fourth year student is ready to work independently in a trauma ICU, but I decided to finish my schooling here in the name of tsuyoku naritai, and for all the days when I’ve gone home crying, it’s still worth it. I’m learning.
The skills
1. A set of information and belief generating and processing skills.
Medicine, and nursing, are a bit like physics, in that you need to generate true beliefs about systems that exist outside of you, and predict how they’re going to behave. This involves knowing a lot of abstract theory, which I’m good at, and a lot of heuristics and pattern-matching for applying the right bits of theory to particular patients, which I’m less good at. That’s partly an experience thing; my brain needs patterns to match to. But in general, I have decent mental models of my patients. I’m curious and I like to understand things. If I don’t know what part of the theories applies, I ask.
2. The habit, based on intellectual commitment, of using those skills to guide behaviour.
So you’ve got your mental model of your patient, your best understanding of what’s actually going on, on a physiological and biochemical level, down under the skin where you can’t see it. You know what “normal” is for a variety of measures: vital signs, lung sounds, lab values, etc. Given that your patient is in the ICU, you know something’s abnormal, or they wouldn’t be there. Their diagnosis tells you what to expect, and you look at the results of your assessments and ask a couple of questions. One: is this what I expect, for this patient? Two: what do I need to do about it?
I’m not going to be surprised if a post-op patient has low hemoglobin. It’s information of a kind, telling the doctor whether or not the patient needs a transfusion, and how many units, but it’s not really new information, and a moderately abnormal value wouldn’t worry me or anyone else. If their hemoglobin keeps dropping, okay: they’re actively bleeding somewhere; that’s irritating, and possibly dangerous, and needs dealing with, but it’s not surprising.
But if a patient here for an abdominal surgery suddenly has decreased level of consciousness and their pupils aren’t reacting normally to light, I’m worried. There’s nothing in my mental model that says I should expect it. I notice I’m confused, and that confusion guides my behaviour; I call the doctor right away, because we need more information to update our collective mental model, information you can’t get just from observation, like a CT scan of the head. (Even this is optimistic–plenty of patients are admitted to the ICU because we have no idea what’s wrong with them, and are hoping to keep them alive long enough to find out.)
The basics of ICU nursing come down to treating numbers. Heart rate, blood pressure, oxygen saturations, urine output, etc; know the acceptable range, notice if they change, and use Treatment X to get them back where they’re supposed to be. Which doesn’t sound that hard. But implicit in ‘notice if they change’ is ‘figure out why they changed’, because that affects how you treat them, and implicit in that is a lot of background knowledge, which has to be put in context.
I’m, honestly, fairly terrible at this. It’s a compartmentalization thing. I don’t like using my knowledge as input arguments to generate new conclusions and then relying on those conclusions to treat human beings. It feels like guessing. Even though, back in high school, I never really needed to study for physics tests–if I understood what we’d learned, I could re-derive forgotten details from first principles. But hospital patients ended up in a non-overlapping magisterium in my head. In order for me to trust my knowledge, it has to have come directly from the lips of a teacher or experienced nurse.
My preceptor, who hates this. “She needs to continue to work on her critical thinking when it comes to caring for critically ill patients,” she wrote on my evaluation. “She knows the theory, and is now working to apply it to ICU nursing.” Shorthand for, she knows the theory, but getting her to apply it to ICU nursing is like pulling teeth. A number of our conversations have gone like this:
Me: “Our patient’s blood pressure dropped a bit.”
Her: “Yeah, it did. What do you want to do about it?”
Me: “I, uh, I don’t know... Should I increase the vasopressors?”
Her: “I don’t know, should you?”
Me: “Uh, maybe I should increase the phenylephrine to 40 mcg/min and see what happens. How long should I wait to see?”
Her: “You tell me.”
Me: “Well, let’s say it’ll take a few minutes for what’s in the tubing now to get pushed through, and it should take effect pretty quickly because it’s IV, like a minute... So if his blood pressure’s not up enough in five minutes, I’ll increase the phenyl to 60. Does that sound okay?”
Her: “It’s your decision to make."
Needless to say, I find this teaching method extremely stressful and scary, and I’m learning about ten times more than I would if she answered the questions I asked. Because “the mere acquisition and retention of information alone” isn’t my problem. I have a brain like an encyclopaedia. My problem, in the critical care nursing context, is the “particular way in which information is sought and treated.” I need to know the right time to notice something is wrong, the right place to look in my encyclopaedia, and the right way to take the information I just looked up and figure out what to do with it.
The mistakes
Some of my errors, unsurprisingly, boil down to a failure to override inappropriate Type 1 responses with Type 2 responses–in other words, not thinking about what I’m doing. But most of them are more of a mindware gap–I don’t yet have the “domain-specific knowledge sets” that the nurses around me have. Not just theory knowledge; I do have most of that; but the procedural habits of how to stay organized and prioritize and dump the contents of my working memory onto paper in a way that I can read them back later. Usually, when I make a mistake, I knew better, but the part of my brain that knew better was doing something else at the time, that small note of confusion getting lost in the general chaos.
Pretty much all nurses keep a “feuille de route”–I have yet to find a satisfactory English word for this, but it’s a personal sheet of paper, not legal charting, usually kept in a pocket, and used as an extended working memory. In med/surg, when I had four patients, I made a chart with four columns; name and personal information, medications, treatments/general plan for the day, and medical history; and as many rows as I had patients. If something was important, I circled it in red ink. This system doesn’t work in the ICU, so my current feuille de route has several aspects. I fold a piece of blank paper into four, and take notes from the previous shift report on one quarter of one side, or two quarters if it’s a long report. Across from that, I draw a vertical column of times, from 8:00 am to 6:00 pm (or 8:00 pm to 6:00 am). 7:00 pm and 7:00 am are shift change, so nothing else really gets done for that hour. I use this to scribble down what I need to get done during my twelve hours, and approximately when I want to do it, and I prioritize, from 1 to 5, most to least important. Once something’s done, I cross it off–then I can forget about it. On the other side of the paper, I make a cheat sheet for giving report to the next nurse, or presenting my patient to the doctors at rounds.
This might be low-tech and simple, but it takes a huge load off my working memory, and reduces my most frequent error, which is to get so overwhelmed and frazzled that my brain goes on strike. In other words, the failure to override Type 1 responses due to the lack of cognitive capacity to run a Type 2 process. It’s drastically cut down on the frequency of this mental conversation:
Me: “I turned off the sedation, and my patient isn’t waking up as fast as I expected. I notice I’m confused–”
My brain: “You’re always confused! Everything around here is intensely confusing! How am I supposed to use that as information?”
Odd as it might sound, I often don’t notice when my brain starts edging towards a meltdown. The feeling itself is quite recognizable, but the circumstances that lead to it, i.e. overloaded working memory, mean that I’m not usually paying attention to my own feelings.
“You need to stop and take a breath,” my preceptor says about fifty times a day. Easier said than done–but it’s more efficient, overall, to have a tiny part of my mind permanently on standby, keeping an eye on my emotions, noticing when the gears start to overheat. Then stop, take a breath, and let go of everything except the task at hand, trusting myself to have created enough cues in my environment to retrieve the other tasks, once I’m done. Humans don’t multitask well. Doing one thing while trying to remember a list of five others is intense multitasking, and it’s no wonder it’s exhausting.
The implications
“You can’t teach critical thinking,” my preceptor says, but I’m pretty sure that’s exactly what she’s doing right now. A great deal of what I already know is domain-specific to nursing, but most of what I’m learning right now is generally applicable. I’m learning the procedural skills to work through difficult problems, under what Keith Stanovich would call average rather than optimal conditions. Sitting in my own little bubble in front of a multiple choice exam–that’s optimal conditions. Trying to figure out if I should be surprised or worried about my patient’s increased heart rate, while simultaneously deciding whether or not I can ignore the ventilator alarm and whether I can finish giving my twelve o’clock antibiotic before I need to do twelve o’clock vitals–that’s not just average conditions, it’s under-duress conditions.
I’m hoping that after a few more weeks, or maybe a few more years, I’ll be able to perform comfortably in this intensely terrifying environment. And I’m hoping that some of the skills I learn will be general-purpose, for me at least. It’d be nice if they were teachable to others, too, but I think my preceptor might be right about one thing–you can’t teach this kind of critical thinking in the classroom. It's about moulding my brain into the right shape, and everyone's brain starts out in a different shape, so the mould has to be personalized.
But the habits are general ones. Notice when you're faced with a difficult problem, or making an important decision. Notice that you're doing this while distracted. Stop and take a breath. Get out a piece of paper. Figure out how the problem is formatted in your mind, and format it that way on the paper. (This is probably the hardest part). Dump your working memory and give yourself space to think. Prioritize from 1 to n. Keep an eye on the evolving situation, sure, but find that moment of concentration in the midst of chaos, and solve the problem.
Of course, it's far from guaranteed that this will work. I'm making an empirical prediction; that the skills I'm currently learning will be transferable to non-nursing areas, and that they'll make a difference in my life outside of work. I'll be on the lookout for examples, either of success or failure.
References
Scriven, Michael, & Paul, Richard (2011). Defining Critical Thinking. The Critical Thinking Community. http://www.criticalthinking.org/pages/defining-critical-thinking/410
Positive Thinking
If I were to take all of my friends and divide them into two groups, there are plenty of criteria I could choose, but probably the most relevant slice would be between my friends who believe in God, and my friends who don’t.
Many in the believer group know each other as well. The evangelical Christian community in my city is fairly tight-knit. Every once in a while I’ll meet someone new, I’ll mention offhand something about church, it’ll become the topic of conversation, and suddenly we discover that we share a dozen mutual friends.
My non-believer friends come from all walks of life. My old friends from high school fit in this category; so do many of the friends I’ve met through university or part-time jobs. There’s no tight-knit community here. I wouldn’t describe many of them as rationalists, particularly, but it seems that according to Less Wrong doctrine, they are above the sanity waterline while my first friend group is below.
Something about this bothers me. Maybe it’s because I find it so refreshing to be with a group of people who are relentlessly positive about life, who constantly remind one another to be positive, and who offer concrete help rather than judgement. Once, when another of our friends couldn’t pay her rent, my Christian friend and I got up at four, took out five hundred dollars in cash at a convenience store, and biked to her house to leave it anonymously in her mailbox before I left for my six am shift at work. The high lasted all day. I can’t think of any other community where this would happen, where it would even be socially acceptable.
I met people at church who had survived the worst circumstances; they had been abused, they had been addicts, they had been homeless. But aside from the concrete help they’d found at church, they’d found some kind of hope as well. They believed that they could succeed. I’ve been incredibly lucky in my life, and I’ve never had reason to doubt that I would succeed, or that people would be there to help me if I ever failed. But for people who’ve only seen evidence that they will fail and be stepped on, the benefits of being told that God loves them unconditionally seem to be non-trivial.
Now to contrast with my non-religious friends: this isn’t universally true, but I’ve seen a trend of general negativity. This attitude can be self-directed, i.e. complaining about work or school or relationships without any effort to find solutions. I know some very unhappy people, and it seems insane to me that they just sit back and take it, month after month. The negative attitude can also be directed outwards into biting sarcasm and rude, judgemental comments about others. This often comes from people who seem happy enough with their own lives. Maybe I didn’t notice this as much before I started going to church, where it became obvious in its absence.
I have the same tendencies to criticize and judge as anyone, but at least I notice them and try to keep them in check. I try to ask myself if it really helps to criticize someone. Does whatever I think they’re doing wrong really affect me? Is it my business to correct them? Would they listen to criticism? If I’m a reliable example, most people hate being criticized. It takes a conscious effort to step back and see criticism in a positive light. I try to take this step, and maybe most rationalists-in-the-making do the same, but that’s not the general population, and starting with a criticism tends to close people off and put them on the defensive.

The last question I ask myself is: do I want to help them by suggesting a change, or do I only want to vent my own frustration? Venting doesn’t help them, and it doesn’t help me, because for me, anyway, focusing on the negative side of an issue tends to flip my entire mindset into the negative. And negative attitudes are contagious. If one person at work is ranting about a bad breakup or a fight with their family, I’ll often catch myself brooding about someone or something I’m annoyed with. If I’m lucky and I’m paying attention, I notice the subliminal messaging before it really gets to me. Sometimes I feel like barking “hey, keep your problems to yourself, I’m trying to be positive here.” But again, if I’m paying attention to my own reactions, I ask myself if it’ll really help to snap at them, and the answer is no, so I’ll try to be an understanding listener instead.
These are things I do consciously, but since I stopped going to church regularly, I’ve noticed that it’s more of an effort. It feels like I’m holding up a heavy weight alone, going through my day talking to roommates and classmates and co-workers who don’t make any special effort to be positive or non-judgemental or helpful. And as soon as I let down my guard, I slip back into the trap of reacting to criticism defensively instead of constructively, of snapping back on reflex, of making excuses for why I was rude to someone or left my dirty dishes in the sink. I hate the way I act in this default mode, but it’s easy to make excuses for that too. I tell myself that I’m tired, that I’m burnt out, that I can’t be everything to everyone. I tell myself it’s not fair that I try so much harder than everyone else.
At church, there was a marked lack of excuses. The general attitude was that you could be as strong as you needed to be, because it wasn’t your strength, it was God’s strength. The way I see it, it was more the combined strength of a community united by a common ideal. It was like a self-help group, but without the stigma. (Maybe the stigma is imaginary; I just know that I have a negative emotional reaction to self-help books and websites. I know this is probably counterproductive, but I can’t seem to get rid of it.)
I talk to some of my friends, the non-religious ones, and I notice that maybe half the time they’re grumpy or upset or angry or offended, and they don’t stop to think about it, or take the step away that would allow them to question and overcome those feelings. My Christian friends aren’t perfect, and they do occasionally slip into anger and frustration, but they often notice. They often bring it up afterwards, in front of the group, as an example of something they need to work on.
This is why, even though I don’t believe in God and would probably be incapable of it at this point, the last thing I want to do is judge people who believe. A lot of the time, they’ve found something that helps them. This is why I found it instrumentally rational, for six months, to go to youth group once a week and sing songs about Jesus. Happiness is a hard thing to pin down, but I liked myself better during that time. It’s easier to be generous when everyone is being generous around you; it’s easier to be kind and helpful when everyone else is acting that way too. It feels like being held accountable.
I don’t really know what this means. It’s hard to generalize, because I’m talking about people in my age group; most of us are poor and not settled in our lives, without firmly developed social networks. Maybe later on in life, people can make their own tight-knit communities without religion as binding glue; my parents, for example, have an incredibly extensive social group. And I certainly don’t want to imply that all Christian organizations are as open and welcoming as the one I attended. I’m sure that plenty of people have had bad experiences. But what I’ve seen suggests to me that my church (a Pentecostal evangelical Christian group, by the way) served a function in our city that wasn’t being filled by anything else.
It’s limited, of course, by the fact that its founders believe the Bible is literally true, even if they don’t apply that belief thoroughly. (This occasionally involves a tricky kind of doublethink, for example a person who denounces homosexuality when asked directly but who holds nothing against their homosexual friends.) Could the principles of rationality prompt a group of people to form this kind of community? I don’t know. But until then, I’m going to keep hanging out with Christians and sharing their positive thoughts.