In this post you can make several predictions about how different factors affect the probability that the creation of AGI leads to an extinction-level catastrophe. This might be useful for planning.
Please let me know if you have other ideas for questions that could be valuable to ask.
Predictions based on who develops AGI:
Predictions based on the technology used to develop AGI:
Predictions based on the approach used to create AGI:
Predictions on how money affects the probability of AGI X-risk:
Excellent point.
I do think that the first AGI developed will have a big effect on the probability of doom, so hopefully some value can be derived from the question. But it would be interesting to control for what other AIs do, in order to get better-calibrated statistics.