CarlShulman comments on Q&A with Richard Carrier on risks from AI - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (22)
Sorry, I misread your question. I don't think we have rigid, uncontroversial frequentist estimates for any man-made extinction event. There are estimates I would say are unreasonably low, but there will be a step along the lines of "really?!? You seriously assign less than a 1-in-1-billion probability to there being a way for bioweapons programs of the next 50 years to create a set of overlapping long-latency, high-virulence pathogens that would get all of humanity, in light of the mousepox and H5N1 experiments, the capabilities of synthetic biology, the results of these expert elicitations, etc.?"