Epistemic Status: Wisdom gleaned from years of meditation.
There are many ways we could bring about societal collapse: engineering a pandemic, triggering a nuclear catastrophe, developing a computer virus to permanently crash the global economy, and so on.
Now you may be thinking about trying one of these out. Most value is in the far future and, in a collapse scenario, we would probably be more likely to sustain moral progress than technological progress (because 'moral infrastructure' is more likely to stay intact than technological and global trade infrastructure). A moderate collapse, with a high level of depopulation, would definitely give us more time, and probably more societal wisdom, to get past the age of perils, in particular to solve AI alignment. In the short term it might seem morally desirable as well, because the net harm caused by humans, such as animal suffering on factory farms, would be greatly reduced. You might also be an antinatalist, and (reasonably) believe that most of our lives are net negative.
But there are a few reasons that this strategy might not be a good idea:
1. You might get it wrong and kill everybody. Fine, so developing a contagious virus that kills 95% of the population might seem like a win-win: it would slow everything down, end most factory farming, and give us another century to rebuild and focus on alignment. But what if the virus mutates and accidentally kills 99.9% of the population, leaving only the most remote and uncontacted tribes (who aren't party to our moral progress) alive? Or what if it kills too few, and just makes certain actors angrier or more desperate to develop violent, weaponised AGI?
2. It might lock in worse institutions or forms of governance. We would probably be able to build on our moral progress after a collapse, but institutions are sticky. A collapse scenario is likely to lead to varied outcomes across today's political units, and if any of these survive, it is hard to predict which will come out of the collapse stronger. If your planned collapse ends up destroying every political unit except an elite bunch of the CCP with bunker access, this could be even riskier than the current predicament!
3. Reputational damage. If you engineer a pandemic, trigger a nuclear catastrophe or otherwise strive for societal collapse and succeed, it's not too much of an issue: your reputation will be the least of your worries. But if you fail, the reputational costs to yourself and your movement could be serious enough to undo much of the good work you have done.
4. We might have locked in low population growth. Birth rates have dropped as a result of attitudinal changes (greater female labour force participation, weaker preferences for large family sizes, etc.). Unless collapse really takes us back to pre-modern values, it's likely that we won't have the societal incentives to repopulate and rebuild a more enlightened society.
5. Do-no-harm principle. All else equal, if we can manage to stop misaligned TAI and end factory farming without causing extreme pain and suffering around the world through deliberate collapse, we probably should. It might be more difficult, but we should be very confident that solving the problem any other way is unlikely before we choose the nuclear option. Some forms of collapse could also lead to worse forms of human and animal suffering, and the outcomes are highly unpredictable.
So, all in all, I think deliberately striving for societal collapse is probably (60%) not a good idea, and I don't recommend it as a plan or career path.
Thanks! I'm actually a more serious EA type in everyday life, but my LessWrong alter ego is proudly Kakistocurious.