In a comment on his skeptical post about Ray Kurzweil, he writes,
Unfortunately, [Kurzweil's] technological forecasting is naive, and I believe it will also prove erroneous (and in that, he is in excellent company). That would be of no consequence to me, or to others in cryonics, were it not for the fact that it has had, and continues to have, a corrosive effect on cryonics and immortalist activists and activism. His idea of the Singularity has created an expectation of entitlement and inevitability that is wholly unjustified, both on the basis of history, and on the basis of events that are playing out now in the world markets, and on the geopolitical stage....
The IEET poll [link; Sep 7, 2011] found that the majority of their readers aged 35 or older said that they expect to “die within a normal human lifespan;” no surprises there.
This was in contrast to an overwhelming majority (69%) of their readers under the age of 35 who believe that radical life extension will enable them to stay alive indefinitely, or “for centuries, at least.”
Where the data gets really interesting is when you look at the breakdown of just how these folks think they are going to be GIVEN practical immortality:
- 36% believe they will stay alive for centuries (at least) in their own (biological) bodies
- 26% expect that they will continue to survive by having their “minds uploaded to a computer”
- 7% expect to “die” but to eventually be resurrected by cryonics.
Only 7% think cryonics will be necessary? That is simply delusional, and it is a huge problem....
Nor are the 7% who anticipate survival via cryonics likely to be signed up. In fact, I’d wager not more than one or two of them are. And why should they bestir themselves in any way to this end? After all, the Singularity is coming, it is INEVITABLE, and all they have to do is sit back and wait for it to arrive – presumably wrapped up in pretty paper and with bows on.
Young people anticipating practical immortality look at me like I'm some kind of raving mad Luddite when I try to convince them that if they are to have any meaningful chance at truly long-term survival, they are going to have to act, work very hard, and have a hell of a lot of luck in the bargain....
Kurzweil has been, without doubt or argument, THE great enabler of this madness by providing a scenario and a narrative that is far more credible than Santa Claus, and orders of magnitude more appealing.
I wonder how people on Less Wrong would respond to that poll?
Edit: (Tried to) fix formatting and typo in title.
If I say I am confused, then I mean that I am confused.
I mean that I take it as an axiomatic principle that my conversants are honest and rational actors until such time as they demonstrate otherwise.
Internally inconsistent means that the statements contradict themselves. Externally inconsistent means the statements contradict the known facts outside the statements.
You and I have the same datasets available for this conversation. You claim that you have read Stipp's book, and yet you still claim that there has been strong historical interest in anti-agathics research within the mainstream community. Stipp's book contradicts this claim.
This is an internally inconsistent claim. You then go on to make many externally inconsistent statements, such as claiming that the question of whether humans operate on a calorie-restricted metabolism is still in question, or the claim that gerontology's studies of human longevity have sufficiently little to do with determining the maximal human lifespan that you are confused by why I would even bring it up.
These are all points that tend to lead towards the conclusion of dishonesty or irrationality on your part. I'm not trying to claim that I have made that conclusion, just that I am confused as to how it is possible that you are not being dishonest or irrational -- because I am continuing to operate on the axiomatic assumption that you are in fact honest and rational.
No, it doesn't. No pharmaceutical will receive widespread adoption until such time as it has been rigorously studied for how it behaves and what its contraindications and/or interactions are. That includes diets. These are all things that are normally controlled for. There is nothing "remotely plausible" about your proposed scenario: the entire pharmaceutical approvals process would have to be abandoned for it to occur.
There isn't any one single thing, is my point. Science and medicine are converging significantly. There is now a concerted effort toward solving this particular problem. The scale of extension is, compared to what is conceivably possible, very minor.
My claim is that there are so many things which can be stated to be very likely to work that the idea of all of them failing would require a total overhaul of several fundamental models that I hold to be true based on their history of providing valid conclusions.
I don't believe that the problem we have here is one of inferential distance. I am very familiar with what that problem looks like. What we have instead is the fact that somehow we are both operating with the same sets of available data yet reaching different conclusions.
Aumann's Agreement Theorem has something to say about that -- and with that I suppose I am now verging into the territory of claiming dishonesty/irrationality on your part. (I admit it could conceivably be on my part as well, but I have as yet no indications of inconsistency in any of my statements, aside from the assumption of rational honesty on your part.)
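As an aside, the intuition behind invoking Aumann here can be illustrated with a toy model (a minimal sketch, not a proof of the theorem; all names and numbers are hypothetical): two honest Bayesian agents who share a common prior and condition on the same evidence must arrive at the same posterior, so persistent disagreement suggests the assumptions fail somewhere.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Hypothetical numbers: both agents start from the same prior and
# observe the same evidence with the same likelihoods.
common_prior = 0.5       # shared prior on the hypothesis
likelihood_h = 0.8       # P(evidence | hypothesis)
likelihood_not_h = 0.3   # P(evidence | not hypothesis)

alice = posterior(common_prior, likelihood_h, likelihood_not_h)
bob = posterior(common_prior, likelihood_h, likelihood_not_h)

# Same data, same prior -> same conclusion.
assert alice == bob
```

If the agents instead reach different conclusions from the same data, at least one of the model's premises (common prior, honest reporting, correct conditioning) must not hold, which is the point being pressed above.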
I don't know precisely what you mean by rational in this context. Given your invocation of Aumann's theorem below, I presume you mean something like "close to perfect Bayesians." This is a really bad idea. Humans can try to be more rational, but they are far from rational. This is not only a bad assumption about people around you, it is a bad assumption about yourself. Even weaker assumptions of ratio...