homunq comments on The mind-killer - Less Wrong
I myself would be disappointed if over half of LW put the probability that a single biological human (not an upload, not a reconstruction - an actual descendant with the appropriate number of ancestors alive today) will be alive in 100 years under 95%. I would consider that a gross instance of all kinds of biases. I'm not going to argue about scenarios here; I'll just point out that any scenario which tends inevitably to wipe out humanity within one lifetime is totally unprecedented. That doesn't mean implausible, but it does mean improbable.
Personally, I do not believe that any person, group of people, or human-built model to date can consistently predict the probability of defined classes of black-swan events ("something that's never happened before causes X", where X is a defined consequence such as humanity's extinction) to within even an order of magnitude in odds, p/(1-p). I doubt anybody can consistently get even to within two orders of magnitude. (I also doubt that this hypothesis of mine will be clearly decidable within the next 20 years, so I'm not particularly inclined to listen to philosophical arguments from people who'd like to discard it.)
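To make "an order of magnitude in odds" concrete, here's a toy sketch in Python (the nominal p = 10^-6 is a placeholder of mine, not a claim about the actual risk):

    # Toy illustration: what "within an order of magnitude of p/(1-p)" buys you.
    def odds(p):
        return p / (1.0 - p)

    def p_from_odds(o):
        return o / (1.0 + o)

    p = 1e-6      # placeholder probability of an extinction-class event
    o = odds(p)   # for small p, the odds are approximately p itself
    for shift in (-2, -1, 0, 1, 2):
        print(shift, p_from_odds(o * 10.0**shift))
    # One order of magnitude of slack already spans p ~ 1e-7 to 1e-5;
    # two orders span p ~ 1e-8 to 1e-4.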
What I'm saying is, we should stop trying to put numbers on this without big error bars. And I've yet to see anybody propose an intelligent way to deal with probabilities like 10^(-6 +/- 4); simply meta-averaging over the distribution of possible probabilities to come up with something like 10^-3 seems to discard data and lead to problems. Still, that's the kind of probability I'd put on this lemma. ("Earth made uninhabitable by a normal cosmic event, and rescue plans fail" would probably put a floor somewhere above 10^-22 per year.)
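For the curious, the arithmetic behind "meta-averages to something like 10^-3" can be checked, under my own (purely illustrative) reading that 10^(-6 +/- 4) means the exponent is uniform between -10 and -2:

    import math

    # Assumption (mine): the exponent x is uniform on [lo, hi] = [-10, -2].
    # The mean of 10^x over that interval is (10^hi - 10^lo) / ((hi - lo) * ln 10),
    # since the antiderivative of 10^x is 10^x / ln 10.
    lo, hi = -10.0, -2.0
    mean_p = (10.0**hi - 10.0**lo) / ((hi - lo) * math.log(10.0))
    print(mean_p)  # ~5.4e-4, i.e. within a factor of two of 10^-3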
"The chance we're all wrong about something totally unprecedented has got to be less than 99.9%" is total hubris. Yes, totally unprecedented things happen every day. But telling yourselves stories about AGI and foom does not make these stories likely.
This is not, by the way, an argument for ignoring existential risk. Even at the 10^-6 level I estimated (or ~10^-3, averaged over meta-probabilities), it is clearly worth thinking about, given the consequences. But if you're all getting that carried away, then Less Wrong should just be renamed More Wrong.
Oh, also, I'd accept that the risk of humanity being seriously hosed within 100 years, or extinct within 1000 years, is significant - say, 10^(-3 +/- 4), which meta-averages to something like 15%.
("Seriously hosed" means gigadeath events, total enslavement, or the like. Note that we're already moderately hosed and always have been, but that seriously hosed is still distinguishable.)