wedrifid comments on Just Lose Hope Already - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
No. If everything else we believe about the universe stays true, and humanity survives the next century, cryonics should work by default. Are there a number of things that could go wrong? Yes. Is the disjunction of all those possibilities a large probability? Quite. But by default, it should simply work. Despite various what-ifs, ceteris paribus, adding carbon dioxide to the atmosphere would be expected to produce global warming and you would need specific evidence to contradict that. In the same way, ceteris paribus, vitrification at liquid nitrogen temperatures ought to preserve your brain and preserving your brain ought to preserve your you, and despite various what-ifs, you would need specific evidence to contradict it, because it is implied by the generalizations we already believe about the universe.
Everything you say after the "No." is true, but it doesn't support your contradiction of:
There is no need to defend cryonics here. Just relax the generalisation. I'm surprised you 'can't think of a single case in your experience' anyway. It took me 10 seconds to think of three in mine. Hardly surprising - such cases turn up whenever the payoffs multiply out right.
I think the kind of small probability Eliezer was talking about here (not that he was specific) is small in the sense that there is a small probability that evolution is wrong, there is a small probability that God exists, etc.
The other interpretation is something like: there is a small probability you will hit your open-ended straight draw (about 31%). If there are at least two players other than you calling, though, it is almost always correct to call (excepting tournament and all-in considerations). So it depends on what interpretation you have of the word 'small'.
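The 31% figure and the "two callers" claim are just pot-odds arithmetic, and can be sketched in a few lines of Python. This is my own illustration, not anything from the comment: the function names and the example pot sizes are assumptions. An open-ended straight draw has 8 outs out of 47 unseen cards after the flop, with two cards to come.

```python
def hit_probability(outs, unseen=47, cards_to_come=2):
    """Chance of hitting at least one out over the remaining cards."""
    miss = 1.0
    for i in range(cards_to_come):
        miss *= (unseen - outs - i) / (unseen - i)
    return 1 - miss

def call_is_profitable(p_win, pot, call):
    """EV of calling = p_win * pot - (1 - p_win) * call; positive
    when the pot odds (pot / call) exceed (1 - p_win) / p_win."""
    return p_win * pot > (1 - p_win) * call

p = hit_probability(8)  # open-ended straight draw, roughly 0.31

# With two other callers the pot lays better than the ~2.2 : 1
# you need at p ~ 0.31, so the "small probability" bet is correct.
# (Pot sizes here are illustrative, not from the comment.)
heads_up = call_is_profitable(p, pot=10, call=10)   # 1 : 1 -- fold
multiway = call_is_profitable(p, pot=30, call=10)   # 3 : 1 -- call
```

The point being made: a 31% shot is "small" in one sense, yet taking it is routinely correct once the payoffs multiply out right.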
By the first definition of small (vanishing), I can't think of a single argument that was a good idea. By the second, I can think of thousands. So, the generalisation is leaky because of that word 'small'. Instead of relaxing it, just tighten up the 'small' part.
Redefinition not supported by the context.
I already noted that Eliezer was not specific enough to support that redefinition. I was offering an alternate course of action for Eliezer to take.
That would certainly be a more reasonable position. (Except, obviously, where the payoffs were commensurably large. That obviously doesn't happen often. Situations like "3 weeks to live, can't afford cryonics" are the only kind of exception that springs to mind.)
Name one? We might be thinking of different generalizations here.
Almost certainly. I am specifically referring to the generalisation quoted by David. It is, in fact, exactly the reasoning I used when I donated to the SIAI. Specifically, I estimate the probability of me or even humanity surviving for the long term if we don't pull off FAI to be vanishingly small (like that of winning the lottery by mistake without buying a ticket), so I donated to support FAI research even though I think it to be, well, "impossible".
More straightforward examples crop up all the time when playing games. Just last week I bid open misère when I had a 10% chance of winning; the alternatives of either passing or making a 9 call were guaranteed losses of the 500 game.
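The open misère decision has the same shape as the poker example: a long shot dominates a certainty once you compare expected values. A minimal sketch, with the caveat that the payoff numbers below are illustrative assumptions on my part, not the actual Five Hundred scoring table or the stakes of that hand:

```python
# Payoffs are illustrative: +500 for making the bid, -500 for the
# guaranteed loss of the game. Only the comparison matters.
def expected_value(p_win, win_payoff, lose_payoff):
    return p_win * win_payoff + (1 - p_win) * lose_payoff

long_shot_bid = expected_value(0.10, 500, -500)   # open misère attempt
certain_loss = expected_value(0.0, 0, -500)       # pass or 9 call

# The 10% bid has the higher expected value despite being
# "probably" a losing move.
assert long_shot_bid > certain_loss
```

When the only alternative is a guaranteed loss, any positive chance of winning makes the "small probability" line the correct one.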