I just started reading Julia Galef's new book "The Scout Mindset: Why Some People See Things Clearly and Others Don't".  Here's a description:
 

When it comes to what we believe, humans see what they want to see. In other words, we have what Julia Galef calls a "soldier" mindset. From tribalism and wishful thinking, to rationalizing in our personal lives and everything in between, we are driven to defend the ideas we most want to believe—and shoot down those we don't.

But if we want to get things right more often, argues Galef, we should train ourselves to have a "scout" mindset. Unlike the soldier, a scout's goal isn't to defend one side over the other. It's to go out, survey the territory, and come back with as accurate a map as possible. Regardless of what they hope to be the case, above all, the scout wants to know what's actually true.

In The Scout Mindset, Galef shows that what makes scouts better at getting things right isn't that they're smarter or more knowledgeable than everyone else. It's a handful of emotional skills, habits, and ways of looking at the world—which anyone can learn. With fascinating examples ranging from how to survive being stranded in the middle of the ocean, to how Jeff Bezos avoids overconfidence, to how superforecasters outperform CIA operatives, to Reddit threads and modern partisan politics, Galef explores why our brains deceive us and what we can do to change the way we think.


This seems like a good book to do a read-along to, since there are probably a decent number of people reading it at the same time. 

If possible put your comments under the correct chapter parent comment as you go. If you get further ahead than me, feel free to create a new chapter parent comment.

44 comments

Chapter 15: A Scout Identity

Section: You Can Choose Your Communities Online

For all that people complain about how toxic Twitter, Facebook, and the rest of the internet can be, they don't often seem to put in much effort to crafting a better online experience for themselves. Sure, there are plenty of trolls, overconfident pundits, uncharitable talk-show hosts, and intellectually dishonest influencers, but you don't have to give them your attention. You can choose to read, follow, and engage with the exceptions to the rule instead.

Well I gotta strongly disagree with this part. While it's true that most complainers put hardly any effort in, the actual effort required to do what she suggests requires monastic dedication. Psychotoxic internet content is highly addictive for many people and our infrastructure amplifies and spreads it.

I'm pretty concerned by things like state-sponsored polarization campaigns and the apparent memetic collapse, so I can't help but feel like the quoted passage is kind of sweeping aside some pretty big stuff.

While it's true that most complainers put hardly any effort in, the actual effort required to do what she suggests requires monastic dedication.

Yes, but also monks are real and you could actually become one if you tried?

Delete your Twitter and Facebook accounts and get off social media. I recommend it strongly. It works very well to remove yourself from that sort of toxicity and such-like things. Living without social media is entirely doable, which I know because I do it. I’m not even a monk.

Oh wow. There is an example of a person who was certain she didn't want kids and changed her mind later, but felt awkward about it because older people used to be very patronizing about her desire not to have kids and would assure her that she'd change her mind when she got older.

This is me. Practically word for word how I've written about it. I would be certain this was literally me if it weren't for the fact that I'd expect Julia to have mentioned if she were using me as an example. And I know Scott Alexander has talked about issues so common among his clients, yet so personal-feeling, that he thinks if his clients read about them they'd assume he was talking about them in particular rather than describing a general pattern. And it's probably not an uncommon thought to express.

But it's the most I've ever felt that feeling of "that is literally me being quoted there" before. And I feel like maybe I'm a paranoid person.

FWIW that sentence ends with a citation of a forum post written in 2014, so unless you're saratiara2 on WeddingBee, you can probably be confident that it isn't you.

Thanks!  (Updating accordingly)

Review/Overview Thread

(I originally posted this to Goodreads)

TL;DR: A good book with mass appeal that helps people care more about being accurate. Fairly easy to read, which makes it easy to recommend to many people.

I've met Julia a few times and am friendly with her. I'd be happy if this book does well, and expect that to lead to a (slightly) more reasonable world.

That said, in the interest of having a Scout Mindset, I want to be honest about my impression.

The Scout Mindset is the sort of book I'm both happy with and frustrated by. I'm frustrated because this is a relatively casual overview of what I wish were a thorough academic specialty. I felt similarly about The Life You Can Save when that was released.

Another way of putting this is that I was sort of hoping for an academic work, but instead, think of this more as a journalistic work. It reminds me of Vice documentaries (which I like a lot) and Malcolm Gladwell (in a nice way), rather than Superforecasting or The Elephant in the Brain. That said, journalistic works make their own unique contributions to the literature; it's just a very different sort of work.

I just read through the book on Audible and don't have notes. To write a really solid review would take more time than I have now, so instead, I'll leave scattered thoughts.

1. The main theme of the book is the dichotomy of "The Scout Mindset" vs. "The Soldier Mindset", and more specifically, why the Scout Mindset is (almost always?) better than the Soldier Mindset. Put differently, we have a bunch of books about "how to think accurately", but surprisingly few on "you should even try thinking accurately." Sadly, this latter part has to be stated, but that's how things are.

2. I was expecting a lot of references to scientific studies, but there seemed to be a lot more text devoted to stories and a few specific anecdotes. The main studies I recall were a very few seemingly small psychological studies, which at this point I'm fairly suspicious of. One small note: I found it odd that Elon Musk was described multiple times as something like an exemplar of honesty. I agree with the particular examples pointed to, but I believe Elon Musk is notorious for making explicitly overconfident statements.

3. Motivated reasoning is a substantial and profound topic. I believe it already has many books detailing not only that it exists, but why it's beneficial and harmful in different settings. The Scout Mindset didn't seem to engage with much of this literature. It argued that "The Scout Mindset is better than the Soldier Mindset", but that seems like an intense simplification of the landscape. Lies are a much more integral part of society than I think they are given credit for here, and removing them would be a very radical action. If you could go back in time and strongly convince particular people to be atheistic, that could be fatal.

4. The most novel part to me was the last few chapters, on "Rethinking Identity". This section seems particularly inspired by the blog post Keep Your Identity Small by Paul Graham, but of course, goes into more detail. I found the mentioned stories to be a solid illustration of the key points and will dwell on these more.

5. People close to Julia's work have heard much of this before, but maybe half or so seemed rather new to me.

6. As a small point, if the theme of the book is the benefits of always being honest, the marketing seemed fairly conventionally deceptive. I wasn't sure what to expect from the cover and quotes. I could easily see potential readers getting the wrong impression from the marketing materials, and there seems to have been little effort to make the actual value of the book clearer. There's nothing up front that reads, "This book is aiming to achieve X, but doesn't do Y and Z, which you might have been expecting." I guess that Julia didn't have control over the marketing.

Hey Ozzie! Thanks for reading / reviewing.

I originally hoped to write a more “scholarly” book, but I spent months reading the literature on motivated reasoning and thought it was mostly pretty bad, and anyway not the actual cause of my confidence in the core claims of the book such as “You should be in scout mindset more often.” So instead I focused on the goal of giving lots of examples of scout mindset in different domains, and addressing some of the common objections to scout mindset, in hopes of inspiring people to practice it more often. 

I left in a handful of studies that I had greater-than-average confidence in (for various reasons, which I might elaborate on in a blog post – e.g. I felt they had good external validity and no obvious methodological flaws). But I tried not to make it sound like those studies were definitive, nor that they were the main cause of my belief in my claims.

Ultimately I’m pretty happy with my choice. I understand why it might be disappointing for someone expecting a lot of research... but I think it's an unfortunate reality, given the current state of the social sciences, that books which cite a lot of social science studies tend to give off an impression of rigor that is not deserved.

I am really glad about this choice, and also made similar epistemic updates over the last few years, and my guess is if I was to write a book, I would probably make a similar choice (though probably with more first-principles reasoning and a lot more fermi-estimates, though the latter sure sounds like it would cut into my sales :P).

Thanks! I do also rely to some extent on reasoning... for example, Chapter 3 is my argument for why we should expect to be better off with (on the margin) more scout mindset and less soldier mindset, compared to our default settings. I point out some basic facts about human psychology (e.g., the fact that we over-weight immediate consequences relative to delayed consequences) and explain why it seems to me those facts imply that we would have a tendency to use scout mindset less often than we should, even just for our own self interest.

The nice thing about argumentation (as compared to citing studies) is that it's pretty transparent -- the reader can evaluate my logic for themselves and decide if they buy it.

That's good to hear! I haven't yet gotten super far into the book, so can't judge for myself yet, and my guess about doing more first-principles reasoning was mostly based on priors.

Thanks so much, that makes a lot of sense.

Reviewing works can be tricky, because I'd focus on very different aspects for different audiences: describing a book to potential readers calls for a different emphasis than commenting on how good a job the author did advancing the topic.

In this case the main issue is that I wasn't sure what kind of book to expect, so I wanted to make that clear to other potential readers. It's like when a movie has really scary trailers but winds up being a nice romantic drama.

Some natural comparison books in this category are Superforecasting and Thinking Fast and Slow, where the authors basically took information from decades of their own original research. Of course, this is an insanely high bar and really demands an entire career. I'm curious how you would categorize The Scout Mindset. ("Journalistic?" Sorry if the examples I pointed to seemed negative)

I think you specifically did a really good job given the time you wanted to allocate to it (you probably didn't want to wait another 30 years to publish), but that specific question isn't particularly relevant to potential readers, so it's tricky to talk about all things at once.

I'd also note that I think there's also a lot of non-experimental work that could be done in the area, similar to The Elephant in the Brain, or many Philosophical works (I imagine habryka thinks similarly). This sort of work would probably sell much worse, but is another avenue I'm interested in for future research.

(About The Village, I just bring this up because it was particularly noted for people having different expectations from what the movie really was. I think many critics really like it at this point.)

Fwiw I basically expected the book to be more in the Malcolm Gladwell genre (and I don't say that pejoratively – it's generally seemed to me that Julia's strength and area of focus is in communicating concepts to a wider audience).

... By the way, you might've misunderstood the point of the Elon Musk examples. The point wasn't that he's some exemplar of honesty. It was that he was motivated to try to make his companies succeed despite believing that the most likely outcome was failure. (i.e., he is a counterexample to the common claim "Entrepreneurs have to believe they are going to succeed, or else they won't be motivated to try")

I guess that Julia didn’t have control over the marketing.

I'm sure she could have taken control, e.g. by self publishing. Which, if the book has a theme of pro-honesty, and if the marketing is deceptive in ways contra that theme, I do think we should look on this state of affairs with some amount of suspicion, which might then turn out to be unfounded. (I haven't read the book, and I don't know if either of those premises is true.)

TL;DR: Choir agrees preacher’s sermon was very interesting.

So yes, I read this book with no small amount of motivation to enjoy it as I like Julia’s other stuff and am often terrified by the misery that irrationality causes. This is likely not a very impartial comment.

If we assume the goal was to achieve maximum possible swing in total human rationality*, I think it was correct to write the book with a less academic tone than some would have liked. If there had been a load more Bayes’ Theorem in it, people like me would have enjoyed it slightly more, but many others would have stopped reading.

Getting fresh blood to appreciate the benefits of rationality is huge. Once they’re in, they can explore more academic/technical resources if they want.

And even if you are very familiar with the subject matter, you may still need a hand in stopping your soldier mindset barging around with his ridiculous ideas. I have a Zoom call with friends in a bit, and despite just having read The Scout Mindset and being in the middle of writing this sentence, I’ll probably still get too attached to my beliefs once we start talking politics. There’s plenty of low-hanging fruit out there when it comes to walking the talk.

*Whatever the actual goal was, I don't think this is a terrible proxy.

Chapter 1: Two Types of Thinking

  • Motivated Reasoning
    • "Can I believe this?" (searching for evidence something is true) v "Must I believe this?" (searching for evidence something is false)

"Can I believe this?" (searching for evidence something is true) v "Must I believe this?" (searching for evidence something is false)

Tim Urban makes the same point in The Thinking Ladder. In particular, his description of how a Sports Fan thinks, which was inspired by Jonathan Haidt.

Introduction: 
 

  • Scout Mindset is the ability to see things as they are, not as you wish they were. 
  • "Was I in the wrong in that argument?"
  • How do we NOT self-deceive?
    • Realize that truth isn't in conflict with your other goals
    • Learn tools that make it easier to see clearly, e.g. the Outsider Test
    • Appreciate the emotional rewards of Scout Mindset

Chapter 13: How Beliefs Become Identities

This chapter claims that identities tend to be things that people feel embattled about. Is this just a fact about contemporary English-speaking rich Western culture, or universal? You'd think that people could derive an identity about being in a group of powerful elites or something.

Feeling embattled is one of two sources of identity that the book mentions, the other being pride. Re-reading just now, I see that her examples of identity through pride were also embattled ones (formula/breast-feeding activists, cryptocurrency proponents, polyamorists), but it doesn't seem necessary, so patriot and gymbro identities fit in the pride category.

Good point!

A possible counterexample: I gather that many people identify as being part of the USA, which is the most powerful country on the planet. Do they think of themselves as beset by iniquity from all sides?

Another possible counterexample: being a gym bro is sort of an identity, but being a weak man isn't really. I imagine gym bros don't feel embattled?

There's the counter-identity of scorning people who "pick stuff up and put it down again" and calling all sports "sportsball", etc. 

I think it's related to what Julia mentioned about having an identity that's just against some other group. 

OK there's something important here I think. To some degree, I 'identify' as being an Australian, due in part to the fact that I am an Australian (but also in part to the fact that I don't live in Australia). But I don't think of Australians as an embattled group, and I also don't think this identity hinders my ability to reason about Australian affairs. So maybe there's a thing where there are different ways people can have identities that have different impacts on rationality.

It's hard to know what patriots "really think," but I loosely gather that the answer is yes? People in the militia movement are preparing to battle a tyrannical government. Many patriots seem to worry a lot about crime, terrorism, the threat of other powerful nations. The culture war makes people afraid of cultural as well as military threats.

Like, identities often feel 'morally powerful'. As the book quotes Megan McArdle saying: "The messages that make you feel great about yourself... are the ones that suggest you're a moral giant striding boldly across the landscape, wielding your inescapable ethical logic". What's so different about feeling like you're a literal giant striding boldly across the landscape, wielding your inescapable power?

Chapter 6: How Sure Are You?

You should never see a well-calibrated person call something impossible and then watch it happen. But you also shouldn't expect Spock's predictions in the episodes to be calibrated: when he predicts well and things go normally, that's less interesting and therefore not likely to be the sort of thing an episode gets made about! (assuming that Star Trek doesn't purport to show everything that happens to those people, which may or may not be right)

But they'd probably have to have years and years of correctly predicted boring missions to make up for the number of incorrect 99% predictions, right?

Maybe the Star Trek universe has low key solved aging, so even though it doesn't seem like years between episodes, it really is. :P

Chapter 8: Motivation Without Deception

Is it too early to calculate Elon Musk's calibration? Tesla seems like a success by now, and you could argue that SpaceX is as well. That's at least two 10% predictions that came out true, so he'll need to fail with 18 similar companies if he wants Scouts to take his opinion seriously...

EDIT: This was a joke.
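For anyone who wants to play with the arithmetic behind the joke: a minimal calibration check (with made-up predictions, not Musk's actual ones) just groups predictions by stated confidence and compares the claimed probability to the observed success rate.

```python
from collections import defaultdict

def calibration_table(predictions):
    """Group (stated_probability, outcome) pairs by stated probability
    and return {probability: (num_predictions, observed_success_rate)}."""
    buckets = defaultdict(list)
    for prob, succeeded in predictions:
        buckets[prob].append(succeeded)
    return {
        prob: (len(outcomes), sum(outcomes) / len(outcomes))
        for prob, outcomes in buckets.items()
    }

# Hypothetical example: two 10%-confidence predictions that both came true.
# To look calibrated in that bucket, you'd need roughly 18 similar failures
# so the observed rate comes back down to 10% (2 / 20 = 0.1).
preds = [(0.10, True), (0.10, True)] + [(0.10, False)] * 18
table = calibration_table(preds)
print(table[0.10])  # (20, 0.1) -- 20 predictions, 10% came true
```

Of course, per the selection-bias point, you'd also need to make sure you're counting all the boring predictions nobody writes episodes (or news stories) about.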

It could also be due to selection bias.

I think this is a reason why focusing on 'calibration' is sort of a mistake? Like, look, the thing it's doing is making the probabilities you say useful to other people / explicit EV calculations that you do yourself. It's one of the skills in the toolbox; you also want good concepts, you also want accuracy, and so on. 

Looking at the early section on motivational advice, I was reminded of Antifragile (my review, Scott's review). Motivational advice which assures success if one believes hard enough and encourages people to try for things despite long odds doesn't look like it helps those individuals. If this advice is widely spread and followed, who benefits? Possibly society as a whole. If individuals in general overestimate their chances of success, try, and largely fail, then there's a much larger pool to select from, and hopefully the best successes are better than they otherwise would be. Deceptive advice transfers antifragility from individuals to the system. 

On the same subject, I've long felt a disdain for that sort of motivational rhetoric as trite, but I'm still not sure why. The connection to self deception provided by Galef is one possible explanation. Has anyone else experienced something similar, or have an explanation for why that might be the case?

Meta Thread

Thank you for doing this! 

I was planning to listen to the book, and I hope I can get around to leaving some comments with thoughts here.

I'm considering buying this book. I would appreciate opinions from anybody who cares to answer.

How does this book compare to the sequences; does it explore topics which aren't covered by the sequences?

Does it function well as an introduction to rationality, as something which I could lend out to friends?

Does it function well as an introduction to rationality, as something which I could lend out to friends?

I think it's pretty good at this, but I note that it seems to me to be very much targeted at 'the general population' instead of 'rationalists' or 'rationalist-adjacents.' Many of the examples are political, and much of the motivation of the book is closer to "here's how you can shift a deep emotional orientation towards being right and wrong" than "given that you already have scout mindset, here's how to skill up at it."

Pre-Reading Thoughts

I expect this book to be well-written and have interesting examples, but I expect it will mostly cover ground I'm already familiar with. That's okay with me: the more I go over things, the more they get into my head, and new examples help internalize ideas in a way factual knowledge doesn't.

I expect I will learn at least one new thing that isn't just an example. 

I expect that after listening to this book these ideas will be more in my head for the next week or month and so I will notice relevant issues and opportunities in my own life, which will help further internalize the ideas. But I also expect that big "in my head"-ness will diminish after a week or two. 

Meta - I often relate to things with personal examples, so that might be a lot of my commentary. 

The title and cover image reminds me of my grandfather who had been a Soviet scout in WW2. His job was to go far ahead closer to enemy lines, and radio back how to adjust their artillery fire to hit more accurately. He got shot out of a tree when the Germans saw the glint off his binoculars. Being a scout can be dangerous!

Julia Galef is a top-notch communicator. I hope to learn how she gets these ideas across to an audience less, um, obsessed with them than I am. I also hope that I can lend the book to a few people and get these ideas across in somebody else's words rather than my own.

Chapter 3: Why Truth Is More Valuable Than We Realize

Early in the chapter, Galef lays out examples of tradeoffs between Soldier and Scout mindset, most vivid for me in the anecdote of the charity president, who convinces himself that the budget is well spent, helping to gain donations but reducing actual effectiveness. 

Two questions which occurred to me reading this: First, is it possible to compartmentalize the Soldier and Scout mindsets to a significant degree, such that one can be used when soliciting donations and the other when deciding which projects to cut?

Second, if it is possible, is it desirable? What consequences might come about from trying to separate these two processes? Maybe doing so requires an extreme psychology or ability to self-deceive, or the effort to separate them is just too tiring to maintain. 

Chapter 2: What the Soldier Mindset Protects

  • Comfort, Self-Esteem, Morale, Persuasion, Belonging