wedrifid comments on Just Lose Hope Already - Less Wrong

47 Post author: Eliezer_Yudkowsky 25 February 2007 12:39AM


Comment author: Eliezer_Yudkowsky 12 January 2011 11:43:32AM 6 points [-]

No. If everything else we believe about the universe stays true, and humanity survives the next century, cryonics should work by default. Are there a number of things that could go wrong? Yes. Is the disjunction of all those possibilities a large probability? Quite. But by default, it should simply work. Despite various what-ifs, ceteris paribus, adding carbon dioxide to the atmosphere would be expected to produce global warming, and you would need specific evidence to contradict that. In the same way, ceteris paribus, vitrification at liquid nitrogen temperatures ought to preserve your brain, and preserving your brain ought to preserve your you; despite various what-ifs, you would need specific evidence to contradict that, because it is implied by the generalizations we already believe about the universe.

Comment author: wedrifid 12 January 2011 12:42:19PM 7 points [-]

Everything you say after the 'No' is true, but it doesn't support your contradiction of:

I can't think of one single case in my experience when the argument "It has a small probability of success, but we should pursue it, because the probability if we don't try is zero" turned out to be a good idea.

Er ... isn't that the argument for cryonics?

There is no need to defend cryonics here. Just relax the generalisation. I'm surprised you 'can't think of a single case in your experience' anyway. It took me 10 seconds to think of three in mine. Hardly surprising: such cases turn up whenever the payoffs multiply out right.

Comment author: shokwave 12 January 2011 12:55:06PM 3 points [-]

I think the kind of small probability Eliezer was talking about here (not that he was specific) is small in the sense that there is a small probability that evolution is wrong, a small probability that God exists, etc.

The other interpretation is something like there is a small probability you will hit your open-ended straight draw (31%). If there are at least two players other than you calling, though, it is always a good idea to call (excepting tournament and all-in considerations). So it depends on what interpretation you have of the word 'small'.

By the first definition of small (vanishing), I can't think of a single such argument that was a good idea. By the second, I can think of thousands. So the generalisation is leaky because of that word 'small'. Instead of relaxing it, just tighten up the 'small' part.
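[Editor's note: the pot-odds arithmetic behind the straight-draw example above can be sketched as follows. The bet sizes and prior pot are hypothetical; the 31% figure is the one given in the comment.]

```python
# Expected value of calling with a drawing hand, in simplified terms:
# with probability p_win you take the pot; otherwise you lose your call.
def call_ev(p_win, pot, call):
    """EV of a call: p_win * pot - (1 - p_win) * call."""
    return p_win * pot - (1 - p_win) * call

p = 0.31          # open-ended straight draw, two cards to come
bet = 10          # hypothetical bet size
pot_before = 15   # hypothetical pot from earlier betting
pot = pot_before + 2 * bet  # two other players calling ahead of you

print(call_ev(p, pot, bet) > 0)  # the call is +EV in this scenario
```

The "small" 31% probability is a good bet precisely because the pot pays out more than 2:1 on the call, which is the sense in which the second interpretation of 'small' differs from the first.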

Comment author: wedrifid 12 January 2011 01:46:52PM 2 points [-]

Redefinition not supported by the context.

Comment author: shokwave 12 January 2011 01:53:32PM 0 points [-]

I already noted that Eliezer was not specific enough to support that redefinition. I was offering an alternate course of action for Eliezer to take.

Comment author: wedrifid 13 January 2011 06:49:26AM 1 point [-]

That would certainly be a more reasonable position. (Except, obviously, where the payoffs were commensurately large. That doesn't happen often; situations like "3 weeks to live, can't afford cryonics" are the only kind of exception that springs to mind.)

Comment author: Eliezer_Yudkowsky 12 January 2011 07:56:15PM 1 point [-]

Name one? We might be thinking of different generalizations here.

Comment author: wedrifid 13 January 2011 07:09:09AM 6 points [-]

We might be thinking of different generalizations here.

Almost certainly. I am specifically referring to the generalisation quoted by David. It is, in fact, exactly the reasoning I used when I donated to the SIAI. Specifically, I estimate the probability of me, or even humanity, surviving for the long term if we don't pull off FAI to be vanishingly small (like that of winning the lottery by mistake, without buying a ticket), so I donated to support FAI research even though I think it to be, well, "impossible".

More straightforward examples crop up all the time when playing games. Just last week I bid open misere when I had a 10% chance of winning; the alternatives of either passing or making a 9 call were guaranteed losses of the 500 game.
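[Editor's note: the decision logic in the 500 example reduces to a simple comparison, sketched below. The probabilities are the ones stated in the comment; the point is only that any positive chance of winning dominates a guaranteed loss.]

```python
# Choose the option with the highest chance of eventually winning the game.
def best_option(options):
    """Return the option name whose win probability is highest."""
    return max(options, key=options.get)

options = {
    "pass": 0.0,         # guaranteed loss of the game
    "bid 9": 0.0,        # also a guaranteed loss
    "open misere": 0.10, # ~10% chance of winning outright
}

print(best_option(options))  # -> open misere
```

This is the same shape of argument as the quoted generalisation: a 10% chance is 'small', but when every alternative has probability zero, pursuing it is trivially correct.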