I guess I'd recommend the AGI safety fundamentals course: https://www.eacambridge.org/technical-alignment-curriculum
On Stuart's list: I think this list might be suitable for some types of conceptual alignment research, but you'd certainly want to read more ML for other types of alignment research.
This is nice from a "what do I need to study" perspective, but it helps less with the "how do I pay the bills" perspective. Do you have pointers there too?
I also found this thread of math topics on AI safety helpful.
Recently, someone from a math background asked me how they could get into AI safety research. Based on my own path from mathematics to AI alignment, I recommended the following sources, which may prove useful to others contemplating a similar career change:
Your mileage may vary, but these are the sources I would recommend. I encourage you to post any sources you'd recommend in the comments.