There have been several cults in LW history, from the Zizians to Leverage Research.
Today I want to talk about these cults and the tradeoffs LW makes in tolerating them.
Zizians
The Zizians were a cult focused on animal welfare positions that were extreme even by EA standards. They subscribed to a variant of Timeless/Updateless decision theory under which being aggressive and escalatory was justified so long as it helped other world branches, or acausally traded with other worlds, to solve the animal welfare crisis.
They apparently induced a new personality, called Maia, in Pasek, and this resulted in Pasek's suicide.
They also made heavy use of violence, or the threat of it, to achieve their goals. This caused many problems for Ziz, who is now in police custody.
Edit: Removed the Vassarites due to updated information that they didn't use psychedelics to jailbreak people's minds.
Now I'll talk about what I see as the tradeoffs around how many cults a community has.
And I think one tradeoff is that the fewer cults you have, the more you miss out on the weird opportunities that generate most of the value.
EDIT: I have removed the takeaways section because we don't have base rates for people entering cults.
I like something about this formulation? No idea if you have time, but I'd be interested to see you expand on it.
I'm not convinced "high-energy" is the right phrasing, since the attributes (as I see them) seem to be:
I would add that some people seem to have a tendency to take what is, in most hands, a low-energy meme, and turn it into a high-energy form? I think this attribute characterizes some varieties of charisma, and is common among "reality warpers."
(Awkwardly, I think "mapping high-minded ideas to practical behaviors" is also an incredibly useful attribute of highly practical, highly effective people? Good leaders are often talented at this subskill, not just bad ones. Discernment in which ideas you take seriously can make a really big difference in the outcomes here.)
Some varieties of couching or argumentation push extreme changes in behavior and action harder than others, for the same idea. Some varieties of receptivity and listening seem more likely to take up ideas as high-energy memes.
I feel like Pascal's Mugging is related, but it's not the only case. Ex: under utilitarianism, you can also justify a costly behavior by arguing from very high certainty of a moderate benefit. However, this is usually not as sticky, and it is more likely to right itself rapidly if future data disputes the benefit.
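To make that contrast concrete, here's a toy expected-value sketch; the numbers are purely illustrative assumptions, not anything from the thread. A costly behavior is justified when

$$\mathrm{EV} = p \cdot B - C > 0,$$

where $p$ is the probability of the benefit, $B$ is its size, and $C$ is the cost. In a mugging-style case ($p = 10^{-9}$, $B = 10^{12}$, $C = 10$), the break-even probability is $C/B = 10^{-11}$, far below anything evidence can actually measure, so new data barely moves the conclusion. In the high-certainty, moderate-benefit case ($p = 0.9$, $B = 10$, $C = 5$), break-even is $C/B = 0.5$, well within the range evidence can reach, so a handful of disconfirming observations pushes $p$ below it and the costly behavior corrects.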