We talk about a wide variety of topics on LW, but we don't spend much time trying to identify the very highest-utility things to discuss and then promoting further discussion of them. This thread is a stab at that. Since it's just comments, you can feel more comfortable bringing up ideas that might be wrong or unoriginal (but that nevertheless have relatively high expected value, given how important existential risks are).
One candidate would be trying to increase the number of people who think about and work on x-risk and see it as a high priority. Efforts to raise general rationality would be another. MIRI sort of represents a general strategy against existential risk, since if they succeed, the problem will likely be taken care of.