I wonder what we will hear from non-LW rationalists about the SIAI when it gains enough prominence. I think it's pretty easy to predict...
I don't really want to watch that live, though. Some day a genuine technological danger, AI-related or not, may actually be foreseen; at that point, the actions of boys who read fiction about the Chupacabra and then cry wolf (and get candy on request for the clarity of their cries) will, if widely rebutted, ever so slightly raise existential risk.
Note that cryonics is pretty incidental to rationality. If anything, having people live forever is likely to slow progress due to increased odds of gerontocracy, and thus be detrimental to the survival of the human species. EY reflects on this in his story "Three Worlds Collide."