Strange7 comments on Cryonics Questions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Some of these questions, like the one about running away from a fire, ignore the role of irrational motivation.
People, when confronted with an immediate threat to their lives, gain a strong desire to protect themselves. This has nothing to do with a rational evaluation of whether or not death is better than life. Even people who genuinely want to commit suicide have this problem, which is one reason so many of them try methods that are less effective but don't activate the self-defense system (like overdosing on pills instead of shooting themselves in the head). Perhaps even a suicidal person who'd entered the burning building because they planned to jump off the roof would still try to run out of the fire. So running away from a fire, or trying to stop a man threatening you with a sword, cannot be taken as proof of a genuine desire to live, only that any desire to die one might have is not as strong as one's self-protection instincts.
It is normal for people to have different motivations in different situations. When I see and smell pizza, I get a strong desire to eat the pizza; right now, not seeing or smelling pizza, I have no particular desire to eat pizza. The argument "If your life was in immediate danger, you would want it to be preserved; therefore, right now you should seek out ways to preserve your life in the future, whether you feel like it or not" is similar to the argument "If you were in front of a sizzling piece of pizza, you would want to eat it; therefore, right now you should seek out pizza and eat it, whether you feel like it or not".
Neither argument is inevitably wrong. But first you would have to prove that the urge comes from a reflectively stable value - something you "want to want", and not just from an impulse that you "want" but don't "want to want".
The empirical reason I haven't signed up for cryonics yet is that the idea of avoiding death doesn't have any immediate motivational impact on me, and the negatives of cryonics - weirdness, costs in time and money, negative affect of being trapped in a dystopia - do have motivational impact on me. I admit this is weird and not what I would have predicted about my motivations if I were considering them in the third person, but empirically, that's how things are.
I can use my willpower to overcome an irrational motivation or lack of motivation. But I only feel the need to do that in two cases. One, where I want to help other people (eg giving to charity even when I don't feel motivated to do so). And two, when I predict I will regret my decision later (eg I may overcome akrasia to do a difficult task now when I would prefer to procrastinate). The first reason doesn't really apply here, but the second is often brought out to support cryonics signup.
Many people who signal acceptance of death appear to genuinely go peacefully and happily - that is, even at the moment of dying they don't seem motivated to avoid death. If this is typical, then I can expect to go my entire life without regretting the choice not to sign up for cryonics at any moment. After I die, I will be dead, and not regretting anything. So I expect to go all of eternity without regretting a decision not to sign up for cryonics. This leaves me little reason to overcome my inherent lack of motivation to sign up.
Some have argued that, when I am dead, it will be a pity, because I would be having so much more fun if I were still alive, so I ought to be regretful even though I'm not physically capable of feeling the actual emotion. But this sounds too much like the arguments for a moral obligation to create all potential people, which lead to the Repugnant Conclusion and which I oppose in just about all other circumstances.
That's just what I've introspected as the empirical reasons I haven't signed up for cryonics. I'm still trying to decide if I should accept the argument. And I'm guessing that as I get older I might start feeling more motivation to cheat death, at which point I'd sign up. And there's a financial argument that if I'm going to sign up later, I might as well sign up now, though I haven't yet calculated the benefits.
But analogies to running away from a burning building shouldn't have anything to do with it.
By the way, I'm not here to troll, and I do have a serious question that doesn't necessarily have to do with cryonics. The goal of SIAI (LessWrong, etc.) is to learn about and possibly avoid a dystopian future. If you truly are worried about a dystopian future, then doesn't that worry amount to a vote of "no confidence" in these initiatives?
Admittedly, I haven't looked into your history, so that may be a "Well, duh" answer :)
Let's say you're about to walk into a room that contains an unknown number of hostile people who possibly have guns. You don't have much of a choice about which way you're going, given that the "room" you're currently in is really more of an active garbage compactor, but you do have a lot of military-grade garbage to pick through. Do you don some armor, grab a knife, or try to assemble a working gun of your own?
Trick question. Given adequate time and resources, you do all three. In this metaphor, the room outside is the future, enemy soldiers are the prospect of a dystopia or other bad end, AGI is the gun (least likely to succeed, given how many moving parts there are and the fact that you're putting it together from garbage without real tools, but if you get it right it might solve a whole room full of problems very quickly), general sanity-improving stuff is the knife (a simple and reliable way to deal with whatever problem is right in front of you), and cryonics is the armor (so if one of those problems becomes lethally personal before you can solve it, you might be able to get back up and try again).
No. AI isn't a gun; it's a bomb. If you don't know what you're doing, or even just make a mistake, you blow yourself up. But if it works, you lob it out the door and completely solve your problem.
A poorly put together gun is perfectly capable of crippling the wielder, and most bombs light enough to throw won't reliably kill everyone in a room, especially a large room. Also, guns are harder to get right than bombs. That's why, in military history, hand grenades and land mines came first, then muskets, then rifles, instead of just better and better grenades. That's why the saying is "every Marine is a rifleman" and not "every Marine is a grenadier."
A well-made Friendly AI would translate human knowledge and intent into precise, mechanical solutions to problems. You just look through the scope and decide when to pull the trigger, then it handles the details of implementation.
Also, you seem to have lost track of the positional aspect of the metaphor. The room outside represents the future; are you planning to stay behind in the garbage compactor?
That's the iffy part.
So start with a quick sweep for functional-looking knives, followed by pieces of armor that look like they'd cover your skull or torso without falling off. No point to armor if it fails to protect you, or hampers your movements enough that you'll be taking more hits from lost capacity to dodge than the armor can soak up.
If the walls don't seem to have closed in much by the time you've got all that located and equipped, think about the junk you've already searched through. Optimistically, you may by this time have located several instances of the same model of gun with only one core problem each, in which case grab all of them and swap parts around (being careful not to drop otherwise good parts into the mud) until you've got at least one functional gun. Or, you may not have found anything that looks remotely like it could be converted into a useful approximation of a gun in the time available, in which case forget it and gather up whatever else you think could justify the effort of carrying it on your back.
Extending the metaphor, load-bearing gear is anything that lets you carry more of everything else with less discomfort. By its very nature, that kind of thing needs to be fitted individually for best results, so don't just settle for a backpack or 'supportive community' that looks nice at arm's length but aggravates your spine when you actually try it on, especially if it isn't adjustable. If you've only found one or two useful items anyway, don't even bother.
Medical supplies would be investments in maintaining your literal health as well as non-crisis-averting skills and resources, so you're less likely to burn yourself out if one of those problems gets a grazing hit in. You should be especially careful to make sure that medical supplies you're picking out of the garbage aren't contaminated somehow.
Finally, a grenade would be any sort of clever political stratagem which could avert a range of related bad ends without much further work on your part, or else blow up in your face.