Vladimir_Nesov comments on The mind-killer - Less Wrong

23 Post author: ciphergoth 02 May 2009 04:49PM

Comment author: Vladimir_Nesov, 15 May 2009 08:02:47AM

So, where did you get those numbers from? 10^-6? 10^-2? Why not, say, 1-10^-6 instead? Gut feeling again, and that's inevitable. You either name a number, or make decisions without the help of even this feeble model, choosing directly. Given what people on this site know, they arrive at beliefs different from yours.

I have one of the lowest estimates: 30% that we don't kill off 90% of the population by 2100. Most of that risk comes from Unfriendly AI. I estimate a 50% chance of an AGI foom by 2070, or 70% by 2100 (I expect relatively low-hanging fruit, so the probability levels off as time goes on), assuming nothing else goes wrong with the world first. I assign 3/4 of that to Unfriendly AI, given my understanding of how hard it is to find the right answer among the many efficient world-eating possibilities, and given human irrationality, which makes it likely that whoever invents the first mind won't think through the consequences. That's already about 53% total extinction risk (0.70 × 0.75 ≈ 0.53); add some more for biological (at least, human-inhabiting) weapons such as an engineered pandemic (not total extinction, but easily 90%), and for whatever new goodies the future has to offer. It will only get worse until it gets better. On second thought, I should lower my confidence in these explicit models; they feel too much like planning. Make that 50%.
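To make the arithmetic explicit, here is a minimal sketch of how the numbers above combine. The variable names, the specific placeholder value for the non-AI risks, and the assumption that separate risks combine as independent events are illustrative choices, not part of the original comment.

```python
# A minimal sketch of the probability arithmetic in the comment above.
# Variable names and the independence assumption are illustrative, not
# taken from the original comment.

p_agi_by_2100 = 0.70            # stated chance of an AGI foom by 2100
p_unfriendly_given_foom = 0.75  # 3/4 of foom scenarios assumed to go Unfriendly

p_ufai = p_agi_by_2100 * p_unfriendly_given_foom
print(f"UFAI extinction risk: {p_ufai:.3f}")  # 0.525, i.e. roughly 53%

def combine_independent(*risks):
    """P(at least one catastrophe occurs) if the risks were independent."""
    survival = 1.0
    for r in risks:
        survival *= 1 - r
    return 1 - survival

# Example: adding a hypothetical 35% risk from engineered pandemics and other
# future technologies (the comment names these sources but gives no number)
# brings the total near the comment's overall ~70% figure, before the final
# downward adjustment to 50%.
print(f"Combined risk: {combine_independent(p_ufai, 0.35):.3f}")  # ~0.69
```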