I'm really glad to hear that! Thanks for letting me know it's something you've appreciated; knowing that makes me happy :)
Ok yeah, I think this is making sense to me now. Thanks!
As has been said, "Hesitation is always easy, rarely useful."
I think there were a couple extra "s"s ;)
That makes sense, although succeeding in that way at eradicating mosquitoes requires a lot more than agency! Agency does help, though, so I see why it would matter. The OP sounds to me like it's implying that agency is enough, not just that it can help, but I guess there are a lot of situations where it is enough, like donating to a charity. Am I thinking about this correctly?
I'm confused - how does being agenty help one get utility from compassion? I think part of my confusion is because these ideas are all pretty abstract; a concrete example would help.
There is a concept related to scout mindset and soldier mindset (helpful outline) that I'd like to explore. Let's call it an "adversarial mindset".
From what I gather, both scout mindset and soldier mindset are about beliefs. They apply to people who are looking at the world through some belief-oriented frame. Someone in a soldier mindset engages in directionally motivated reasoning, asking "Can I believe it?" of claims they want to accept and "Must I believe it?" of claims they'd rather reject, whereas a scout asks "Is it true?".
On the other hand, someone who is in an adversarial mindset is looking through some sort of "combat-oriented frame". If you say "I think your belief that X is true is wrong" to someone in an adversarial mindset, they might infer subtext of something like "You're dumb".
But despite being in this frame, they likely won't respond by saying "Hey, that was mean of you to say I'm dumb. I'm not dumb, I'm smart!" Instead, they'll likely respond by saying something closer to the object level like "Well I'm pretty sure it's right", but the subtext will be something more combative like "I won't let you push me around like that!".
Adversarial mindset isn't about beliefs; it's about self-esteem. Maybe?
There are various phenomena that make me think that a person is in an adversarial mindset. One such phenomenon is when someone is more likely to update their belief when you phrase your critique softly.
For example, imagine that instead of saying "I think your belief that X is true is wrong" you said "Hey, I could totally be off here, but do you think there's a chance that your belief about X might not be completely accurate?". And imagine that the person updated their belief in response to the soft phrasing but not the "hard" phrasing. If so, it seems to me that it isn't the belief in X that they are defending. It's their identity as someone who isn't dumb (or something).
A related possibility is that instead of inferring subtext aimed at attacking them ("You're dumb"), they might adopt a dominance-oriented frame and infer subtext of "I'm dominant over you. Submit to me by conceding.", or something. I once got into an unproductive argument with a therapist of mine, and I suspect the reason it wasn't productive is that she adopted a dominance-oriented frame.
It began with me mentioning that I think beliefs influence feelings. She said something along the lines of "if that were true my job would be a whole lot easier". I clarified that I don't think beliefs are the only thing that influences feelings -- at least not conscious, verbal beliefs as opposed to "emotional learnings" -- but that, in some pragmatic sense I can't fully articulate, they play a role.
I came up with an example where Alice thinks it's a sunny day, is excited to go for a walk, opens the blinds, sees that it's raining, and then feels disappointed. And I explained that in this example I think moving from "belief that it is sunny" to "belief that it is raining" heavily influenced Alice's emotions to shift from excited to disappointed. I expected that the therapist would nod and agree, and then proceed to add nuance to her position that beliefs don't influence feelings. Instead, she dug her heels in and doubled down. I think there are many potential explanations for this other than a dominance-oriented mindset, but a dominance-oriented mindset feels pretty plausible to me here.
Coming back to Julia Galef's book The Scout Mindset, and to the art of rationality more broadly, I suspect that the adversarial mindset and other soldier-adjacent mindsets lead to a lot of "stuckness" and just generally get in our way. And I'm not just referring to "normies" here; I'm including the "rats", myself among them!
The claim I'm trying to gesture at here seems pretty "important if true". One of my favorite quotes:
My path to this book began in 2009, after I quit graduate school and threw myself into a passion project that became a new career: helping people reason out tough questions in their personal and professional lives. At first I imagined that this would involve teaching people about things like probability, logic, and cognitive biases, and showing them how those subjects applied to everyday life. But after several years of running workshops, reading studies, doing consulting, and interviewing people, I finally came to accept that knowing how to reason wasn't the cure-all I thought it was.
Knowing that you should test your assumptions doesn't automatically improve your judgment, any more than knowing you should exercise automatically improves your health. Being able to rattle off a list of biases and fallacies doesn't help you unless you're willing to acknowledge those biases and fallacies in your own thinking. The biggest lesson I learned is something that's since been corroborated by researchers, as we'll see in this book: our judgment isn't limited by knowledge nearly as much as it's limited by attitude.
- The Scout Mindset
I don't get the sense that we have a great understanding of why people adopt adversarial mindsets, though, or of how to keep an adversarial mindset from slipping in. Seems like a topic worthy of more attention.
Thanks for the feedback!
I feel like this structure could be improved, as I don't think it would press the right psychological buttons. If I'm enthusiastic about the charity, it's strange to be (effectively) playing against it -- especially if there's a relatively large amount of money at stake, most of it conditionally donated by people other than me, so the altruistic benefit of losing the game clearly outweighs the selfish benefit of winning it.
That's a good point. Maybe it'd be better if it were one team of humans against another. Although I could also see people being motivated to win regardless: competitiveness, social pressure, or the stakes being small enough that they aren't particularly driven to have the money donated to the charity.
find someone (or offer yourself) to match (or at least significantly augment) the donation if the human team wins
That sounds difficult.
A friend of mine from the local LessWrong meetup created Democracy Chess: a game where human players team up against an AI, everyone votes on a move, and the move with the most votes is played. Yesterday a bunch of us gave it a try.
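To make the mechanic concrete, here's a minimal sketch of the vote-tallying step in Python. The move-string format and the tie-breaking behavior are my assumptions for illustration, not necessarily how the actual game works:

```python
from collections import Counter

def winning_move(votes):
    """Pick the move with the most votes.

    `votes` is a list of move strings (e.g. SAN like "e4").
    Counter breaks ties arbitrarily; a real implementation
    would want an explicit tie-break rule.
    """
    move, _count = Counter(votes).most_common(1)[0]
    return move

# Five players vote; "e4" wins 3-2 and gets played.
print(winning_move(["e4", "d4", "e4", "d4", "e4"]))  # -> e4
```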
In brainstorming how it could be better I had an interesting thought: what if there was a $10 buy in where if we win we get our money back but if we lose the money is donated to some pre-agreed upon charity?
I could see something like that being fun. Having something at stake can make games more engaging. And along the lines of charity runs, it can be fun to get together with a group of people and raise/donate money for a cause.
My angle here is that if something like this is genuinely fun and appealing, it could be a good way to generate money for charities. Along the lines of 1,000 True Fans, if 1,000 people spent $200/year on this (~$17/month) and won half the time, that'd be $100k in charitable donations a year. A meaningful sum.
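Spelling the math out (the player count, annual spend, and win rate are all illustrative assumptions, not data):

```python
# Back-of-the-envelope estimate with assumed numbers.
players = 1_000        # "1,000 True Fans"
spend_per_year = 200   # dollars each player buys in per year
loss_rate = 0.5        # fraction of buy-ins the human team loses

# Money only goes to charity when the human team loses.
donated_per_year = players * spend_per_year * loss_rate
print(f"${donated_per_year:,.0f}/year to charity")  # $100,000/year
```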
I don't think Democracy Chess would be the ideal game. I think something like trivia would make more sense: something that has more mass appeal and is more social. Trivia is particularly appealing since it'd be easy to build a prototype for.
One thing I like about this idea is that customer acquisition seems straightforward. I think direct outreach to EA and rationalist groups would make sense as an initial target audience. From there it seems plausible that other groups might be interested. For example, a local bike advocacy group might want to organize a trivia game where donations would go to a pro-biking organization. And direct outreach to such groups is very doable.
Longer term I could see celebrities promoting it. Maybe motivated by altruism, maybe motivated by virtue signaling, maybe a mix of both.
But I'm not feeling optimistic about this idea. The 6+ people I asked from the Portland rationality meetup weren't excited about it and wouldn't be motivated enough to organize a game. Maybe they'd join in if someone else organized it. I'm in the same boat: I'd tag along if someone else organized a game, but I wouldn't be very motivated to organize one myself. And I think for this to work it'd require people who are excited and motivated enough to organize games on a recurring basis.
Maybe those people are out there though! Does anyone reading this fall into that camp?
That makes sense. Yeah, from what I understand, spices vary in how long they can be sauteed before burning. Ground spices have more surface area exposed and so will burn faster.
I've heard and experienced that freshly ground spices are notably better than pre-ground. It's relatively easy to do in a mortar and pestle or with a spice grinder. And sometimes it's good to dry roast them before grinding in order to "wake them up".
Yes, that visual diagram is very helpful. Thank you! I think I mostly get it now.
Ah yeah, the phenomena you mention resonate with me and seem like evidence for this idea that there's a distinction between soldier-oriented mindsets that fight against new ideas and ones that fight against something more social.