I don't really know of any AI researchers in our extended network (out of some dozens) who've managed to be taken very seriously without being colocated with other top researchers. So without knowing more, it still seems moderately likely to me that the best plan involves something like earning while practising math, or doing a PhD, with the intent to move in 2-3 years, depending on how long you're unable to move.
Otherwise, it seems like you're doing the right things, but until you put out some papers or similar work, I think I'd sooner direct funding to projects among the FLI grantees. I'd note that most of the credible LW/EA researchers are doing PhDs and postdocs or taking on AI safety research roles in industry, and receive funds through those avenues; it seems to me those would also be the next steps for you in your career.
If you had a very new idea that you had an extraordinary comparative advantage at exploring, then it's not inconceivable that you could be among the most eligible GCR-reduction researchers for funding, but you'd have to say a lot more.
(I'm re-posting my question from the Welcome thread, because nobody answered there.)
I care about the current and future state of humanity, so I think it's good to work on existential or global catastrophic risk. Since I studied computer science at university until last year, I decided to work on AI safety. Currently I'm a research student at Kagoshima University doing exactly that. Before April this year I had only a little experience with AI or ML, so I'm slowly digging through books and articles in order to be able to do research.
I'm living off my savings. My time as a research student will end in March 2017, and my savings will run out some time after that. Nevertheless, I want to continue AI safety research, or at least work on X or GC risk.
I see three ways of doing this:
Oh, and I need to be location-independent or based in Kagoshima.
I know http://futureoflife.org/job-postings/, but all of the job postings fall short for me in two ways: they're not location-independent, and they require more or different experience than I have.
Can anyone here help me? If yes, I would be happy to provide more information about myself.
(Note that I don't think I'm in a precarious situation, because I would be able to get a remote software development job fairly easily. Just not in AI safety or X or GC risk.)