How can I maximise my chances of having a decent life, given the very high likelihood that GAI will make all our intellectual labour useless in the next few years?
For example, I graduated from a good university a few years ago and am working as a software engineer at a multinational company, but my capabilities are middling at best. I am distressed that I will likely not be able to afford a house in the few years left before GAI renders me unable to earn a living. I am not a genius; it is very unlikely that I could join an AI research company and contribute meaningfully to AI research.
Assuming I have a small amount of money (100-200k) I can set aside, should I attempt to, for example, invest in companies that will likely be able to monetise GAI?
Or is there something else I should be doing to prepare for the time I have basically zero human capital?
Should I attempt to move to (and get citizenship from) a country with a larger amount of natural resources, assuming that human capital will become worthless quickly?
Is it reasonable to find potential outs (e.g. physician-assisted death) in case we cannot earn a living (and if unfriendly AI is basically confirmed)?
The actions you suggest might represent a laudable contribution to a public good, but they don't directly answer the (self-concerned) question the OP raises. Given the size of the world and the public-goods nature of the projects you mention, his own actions will only marginally change the probability of a better structure of society in general. That may still be worth it from a fully altruistic standpoint, but it has essentially zero probability of improving his personal material welfare.
(If I may, an analogy: it is comparable to a situation where I live in Delhi and wonder what I can do to protect myself from the effects of climate change that will make the summers even more unbearable in the future, and you tell me "consider not flying; plant trees in the city or in the country...". People and small countries do receive this type of advice in reality, but it does not really address the question.)