If AI starts to end the world, is suicide a good idea?
For a while I’ve thought that if AI began obviously ending the world, I would commit suicide, mainly to avoid any potential s-risk. But recently I’ve become far less certain that would be a good idea. Between the possibility of resurrection, quantum immortality, and weird...
Have your probabilities for AGI arriving in given years changed at all since the breakdown you gave 7 months ago? I, and I’m sure many others, defer quite a lot to your views on timelines, so it would be good to have an updated breakdown.
15% - 2024
15% - 2025
15% - 2026
10% - 2027
5% - 2028
5% - 2029
3% - 2030
2% - 2031
2% - 2032
2% - 2033
2% - 2034
2% - 2035
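For reference, those yearly figures imply cumulative probabilities, which is a quick sketch to compute (this assumes each figure is the probability of AGI arriving in exactly that year, with the years mutually exclusive; the remaining mass presumably falls on 2036 or later, or never):

```python
# Yearly P(AGI arrives in that year), copied from the breakdown above.
yearly = {
    2024: 0.15, 2025: 0.15, 2026: 0.15, 2027: 0.10,
    2028: 0.05, 2029: 0.05, 2030: 0.03, 2031: 0.02,
    2032: 0.02, 2033: 0.02, 2034: 0.02, 2035: 0.02,
}

# Running sum gives P(AGI by end of each year), assuming disjoint events.
cumulative = {}
running = 0.0
for year in sorted(yearly):
    running += yearly[year]
    cumulative[year] = round(running, 2)

print(cumulative[2027])  # 0.55 — more likely than not by end of 2027
print(cumulative[2035])  # 0.78 — the listed years leave 22% unassigned
```

So the breakdown puts a coin flip's worth of probability on AGI before 2028, and 78% on the listed window as a whole.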