timtyler comments on Q&A with Richard Carrier on risks from AI - Less Wrong Discussion
Hmm. Information-theoretic extinction seems pretty unlikely to me. Humanity will live on in the history "books" about major transitions - and the "books" at that stage will no doubt be pretty fancy - with multiple "instantiated" humans.
I don't think that's very likely either, but 10^-20 seems to be an overconfident probability for it.
A rather bizarre view, IMHO. I think only a few people would agree with this.