For sure! It's a devilishly hard problem. Despite dipping in and out of the topic, I don't feel confident in even forming a problem statement about it. I feel more like one of the blind men touching different parts of an elephant.
But it seems like having many projects like the Verified Voting Foundation should hedge the risk--if each such project focuses on a small part, then the blast radius of unfortunate mistakes should be limited. I would just hope that, on average, we would be trending in the right direction.
When I last looked a couple of months back, I found very little discussion of this topic in the rationalist communities. The most interesting post was probably this one from 2021: https://forum.effectivealtruism.org/posts/8cr7godn8qN9wjQYj/decreasing-populism-and-improving-democracy-evidence-based
I suppose it's not a popular topic because it rubs up against politics. But I do think that liberal democracy is the operating system for running things like LW, EA, and other communities we all love. It's worth defending--though what that means exactly is vague to me.
Did you find anything interesting in 2018? Did you use it, and, if yes, how'd it go?
How would you label Metz's approach in the dialogue with Zack? To me it's clear that Zack is engaging in truth-seeking--questioning maps, seeking where they differ, trying to make sense of noisy data.
But Metz is definitely not doing that. Plenty of Dark Arts techniques there, and his immediate goal is pretty clear (defend position & extract information), but I can't quite put a finger on his larger goal.
If Zack is doing truth-seeking, then Metz is doing...?
Seconding "Style: Ten Lessons in Clarity and Grace". Amazing explanation of effective written communication.
I would only add this, for the original poster: when you read what the book suggests, reflect on why it's doing so.
When I read "Style" the second time around, it occurred to me how hard reading really is, and that all this advice is really for building a sturdy boat to launch your ideas at the distant shores of other minds.
Like, you can have some really bright people working for you, but if you add even a little more nuance, like an "and" and a second clause, you've lost them. So the trick appears to be finding a shared language with the people you can think together with.
I felt a jolt of excitement when I overheard a person who (at least by appearance) wasn't a Rat casually drop "Slack" during a conversation.
I work at a mid-sized software company based in the SF Bay Area. The person talking was a director in my organization. The context was setting aside time to chew over problems--not trying to solve them, just looking at them to see the broader context in which they exist.
I experienced it firsthand not too long ago at the NYC Megameetup: dialogues where both (or more) parties actively tried to explore each other's maps, seeking points where there was overlap and where there were gaps. More concretely, everyone was asking a lot more questions than usual. These questions were relevant and clarifying. They helped make the discussion feel speedy, as in, like we were running from room to room, trying to find interesting bits of knowledge, especially where views diverged.
The best way I can describe it is that it felt like thinking together--like having more people in your head.
I don't think this was because of a large amount of shared references, like in a subculture. I think it was because the culture of LW and LW-adjacent spaces emphasizes curiosity, openness, and respect.
Or a world in which the median dialogue is much more productive?
For me, it would be a world where much less time is wasted producing arguments-as-soldiers. Whether it's in small, day-to-day interactions or in bigger discussions, like around geopolitical conflicts.
Does this explain it better? It still feels a little airy.
I’m still hopeful that there’s some way to make progress if we get enough good minds churning out ideas on how to enroll people into their own personal development.
Me too! I hope my comment didn't come across as cynical or thought-stopping. I think this is one of the highest goods people can produce. It just seems like one of those cases where even defining the problem is itself a wicked problem--but falling into analysis paralysis is bad too.
Please do write more on this topic. I'll try to make a post around the same themes this weekend :)
The ending hints at the true problem: how do we go about implementing such change?
We already have more than enough tools: de Bono's thinking hats, 5-whys, CBT & a dozen other forms of therapy (+ drugs!), CFAR, gratitude journaling, meditation, anger management, post-mortems, pre-mortems, legitimate self-help, etc.
But the problems in deploying them are legion:
(I'll stop for the sake of time).
This is something I'm personally interested in, too. However, I've dialed down my dreams, so to speak, and focus on two very local activities that blend in with my personal life:
Now that I think of it, it seems like The Guild of the Rose is a good vehicle for spreading good thinking techniques because it ties them directly to personal gain, in the sense of this Adam Smith quote:
It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own self-interest.
This is very interesting work in the Rationalist "I want to be stronger!" sense. Thank you for putting it together and sharing--I'll be on the lookout for workshop dates to sign up for!
Also, I just bumped into this game: https://talktomehuman.com/. It's a simple role-playing game that puts you into awkward situations at home or work and has you talk your way out of them. The NPCs are LLM-powered. You have to use your voice, and there's a time limit. I haven't tried it yet, but it seems similar in spirit to what you want to build, albeit focused on a different problem.