Luke Muehlhauser writes:
Over the years, my colleagues and I have spoken to many machine learning researchers who, perhaps after some discussion and argument, claim to think there’s a moderate chance — a 5%, or 15%, or even a 40% chance — that AI systems will destroy human civilization in the next few decades. However, I often detect what Bryan Caplan has called a “missing mood”; a mood they would predictably exhibit if they really thought such a dire future was plausible, but which they don’t seem to exhibit. In many cases, the researcher who claims to think that medium-term existential catastrophe from AI is plausible doesn’t seem too upset or worried or sad about it, and doesn’t seem to be taking any specific actions as a result.
Not so with Elon Musk. Consider his reaction (here and here) when podcaster Joe Rogan asks about his AI doomsaying. Musk stares at the table and takes a deep breath. He looks sad. Dejected. Fatalistic. Then he says:
I'll make an analogy here, so as to get around the AI-worship-induced gut reactions:
I think most people are fairly convinced there isn't a moral imperative beyond their own life. That is, even if behaving as though your own life is the ultimate driver of moral value is wrong and ineffective, from a logical standpoint it holds: once your conscious experience ends, everything ends.
I'm not saying this is certain. It may be that the line between conscious states is so blurry that continuity between sleep and wakefulness is basically zero, or no greater than the continuity between you and other, completely different humans (who will be alive even once you die, and will keep on flourishing). It may be that there is a ghost in the machine under whatever metaphysical framework you want... but, if I had to take a bet, I'd say there's something like a 15%, 40%, or 60% chance that once you close your eyes for the last time, it's over; the universe is done for.
I think many people accept this viewpoint, but most of them don't spend even a moment thinking about anti-aging, and even those like myself who do aren't too concerned about death in a "mood" sense. Why would you be? It's inevitable. Sure, your actions might contribute to averting death by 0.x% if you're very lucky, and so you should pursue that area because... well, nothing better to do, right? But it makes no sense to concern oneself with death in an emotional way, since it's likely coming anyway.
After all, the purpose of life is living, and if you're not living because you're worrying about death, you lost. Even in the case where you were able to defeat death, you still lost: you didn't live, or, less metaphorically, you lived a life of suffering, or of unmet potential.
Nor does it help to be paralyzed by the fear of death every waking moment of one's life. It will likely make you less able to destroy the very evil you are opposing.
Such is the case with every potentially horrible inevitability in life. Even if it is "absolute" in its badness, being afraid of it will not make it easier to avoid, and fear might ultimately defeat the purpose of avoiding it, which is the happiness of you and the people you care about, since all of them will be more miserable if you are paralyzed by fear.
So even if whatever fake model you've assumed gives a 99.9% chance of our being destroyed by HAL or whatever 10 years from now, the most sensible course of action would still be not to get too emotional about the whole thing.