
Comment author: Mark_Neznansky 28 April 2009 02:47:27PM 0 points [-]

It seems to me you've used the wrong wording. Unlike the epistemic rationalist, the instrumental rationalist does not "gain" any "utility" from changing his beliefs; he gains utility from changing his actions. Since he can either prepare or not prepare for a meteorite catastrophe, and not "half prepare", I think the numbers you should choose are 0 and 1, not 0 and 0.5. I'm not entirely sure what different results that will yield, but I think it's worth mentioning.
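The binary-action point being argued here can be made concrete with a toy expected-utility calculation (all numbers hypothetical — the probability, the preparation cost, and the loss are made up purely for illustration):

```python
# Toy expected-utility comparison (all numbers hypothetical).
# The agent believes a meteorite strike has probability p, and must
# choose a binary action: prepare (1) or don't prepare (0) --
# there is no "half prepare".

def expected_utility(action, p, cost_of_prep=0.1, loss_if_unprepared=1.0):
    """Expected utility of a binary prepare/don't-prepare choice."""
    if action == 1:     # prepare: pay the preparation cost, avoid the loss
        return -cost_of_prep
    else:               # don't prepare: risk the full loss with probability p
        return -p * loss_if_unprepared

p = 0.5
best = max((0, 1), key=lambda a: expected_utility(a, p))
print(best)  # 1: at p = 0.5, preparing maximizes expected utility
```

The point of modeling the action as 0 or 1 (rather than 0.5) is that the belief `p` enters only through the expected-utility comparison; the action itself stays all-or-nothing.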

Comment author: Mulciber 28 April 2009 08:29:27PM 0 points [-]

Why does it sound more like 1 than .5? If I believed the probability of my home getting struck by a meteorite was as high as .5, I would definitely make preparations.

Comment author: badger 24 April 2009 11:54:03PM 2 points [-]

Rationalist

Comment author: Mulciber 25 April 2009 12:14:36AM *  -2 points [-]

That sounds like someone who rationalizes, which is something we should be avoiding.

It's weird that trying to rationalize something can go against rationality, but that's English for you.

Edit: I assume this was downvoted so heavily because I failed to do the constructive thing by providing a suggestion of my own. Sorry about that.

How about something based on the name of the site? LWite, LWer, LWan? Or maybe a more pronounceable version like Lewite, etc.

Comment author: Jack 24 April 2009 08:48:42PM *  0 points [-]

That is a good point. But progress matters because there is a non-zero chance that some disaster strikes, or the cryonics firm dissolves, and you never get revived. I also think the farther into the future you get the less interested future people will be in reviving (by comparison) the mentally inferior. Plus I'd much rather wake up sooner than later, since I'd rather not be so far behind my new contemporaries. So confidence that revival will be possible sooner rather than later increases the incentive to pay for the procedure.

Edit: Also, the longer revivification technology takes, the more likely one of Alicorn's dystopian scenarios becomes. Plus the far future might be thoroughly repugnant to the values of the present day, even if it isn't a dystopia.

Comment author: Mulciber 24 April 2009 08:54:59PM 0 points [-]

I also think the farther into the future you get the less interested future people will be in reviving (by comparison) the mentally inferior.

This sounds possible but not at all obvious. It seems to me that so far, interest in historical people and compassion for the mentally inferior have if anything increased over time. This certainly doesn't mean they'll continue to do so out into the far future, but it does mean I'd need some really good reasons to support expecting them to.

Comment author: Psy-Kosh 24 April 2009 08:34:35PM 0 points [-]

For everyone? Well, there'd also be the cost of building the facilities... Anyways, maybe we really should try to push something like that? (Yeah yeah, I know, unlikely.)

Anyways, did you get the PM I sent? (About talking me through some of the specifics of actually signing up?)

Comment author: Mulciber 24 April 2009 08:38:39PM 0 points [-]

Cost of facilities per person should go down significantly as the number of people gets large, right?
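The intuition in this reply is just fixed costs amortizing over more people, which a toy sketch can show (all figures hypothetical):

```python
# Toy model of economies of scale (all figures hypothetical):
# a facility has a fixed construction cost shared by everyone,
# plus a per-person variable cost.

def cost_per_person(n, fixed_cost=1_000_000, variable_cost=1_000):
    """Average cost per person when n people share one facility."""
    return fixed_cost / n + variable_cost

print(cost_per_person(10))      # 101000.0
print(cost_per_person(10_000))  # 1100.0
```

As `n` grows, the average cost falls toward the per-person variable cost, since the fixed cost's share of each person's bill shrinks.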

Comment author: jhuffman 24 April 2009 07:36:37PM -1 points [-]

I don't really see any commentary here on the underlying assumptions about the badness of being dead. In summary: for a physicalist, being dead has no value; it is a null state. Null states cannot be compared with non-null states, so being dead is not worse than being alive.

To put that another way, I cannot be worse off by being dead because there won't be an I at that point. An argument can be made that I have no personal interest in my being dead - only other living people have a stake in that. That doesn't change the fact that I want to live. There is an I here that wants this, and wants it indefinitely. But once I'm gone, it's not a problem for me.

So I tend to favor arguments related to organ-donation since future living people are unlikely to get more benefit from me than current living people in need of organ transplants.

Also, there is a real but small chance that cryo-preservation could lead to a sort of hell: what if I'm only thawed to be a permanent exhibit in a zoo, or to be experimented upon, or to be subjected to conversations with classical-language enthusiasts?

So there is a non-zero chance of being consigned to hell if I'm cryo-preserved; whereas once I'm dead it's a null state, and can be considered an even break if you really must try to attach a value to it.

Comment author: Mulciber 24 April 2009 08:35:19PM 3 points [-]

It's counterintuitive to say that being dead is basically null value. If I'm choosing between two courses of action, and one difference is that one of them will involve me dying, that's a strong factor in making me prefer the other option.

I can think of possible explanations for this that preserve the claim that being dead has value zero, but I'm not seeing a way that would do so only in non-cryonics cases.

Comment author: Cameron_Taylor 24 April 2009 04:52:35PM 0 points [-]

Shouldn't we instead be attempting to make ourselves more useful to the community?

That's the thing. Controlling things with 'shoulds' is unstable without the presence of real consequences, social or otherwise. Anonymous internet forums do not have these real consequences naturally, which is what gives karma a purpose. It is a way to allow social control and influence with a minimum of overhead and perceived oppression.

Comment author: Mulciber 24 April 2009 08:27:04PM 0 points [-]

Right. The whole point is that what karma really controls for is appearing useful to the community, not being useful to the community.

I agree that it has a purpose, and that we're better off with it. I don't think it's sufficient on its own, and we shouldn't fool ourselves into thinking that obsessing over it is the same as focusing on improving the community. At best, it improves only a small aspect of the community; at worst, the subgoal "think about karma and get points" takes over at the expense of all else.

Comment author: Alicorn 24 April 2009 05:31:10PM *  1 point [-]

Sadists exist even in the present. Unethical research programs are not unheard of in history. This is a little like saying that I shouldn't worry about walking alone in a city at night in an area of uncertain crime rate, because if someone benevolent happens by they'll buy me ice cream, and anyone who doesn't wish me well will just ignore me.

Comment author: Mulciber 24 April 2009 08:24:45PM 1 point [-]

But you wouldn't choose to die rather than walk through the city, would you?

It's hard for me to take the nightmare science fiction scenarios too seriously when the default action comes with a well-established, nonfictional nightmare: you don't sign up for cryonics, you die, and that's the end.

Comment author: Alicorn 24 April 2009 03:27:13AM *  0 points [-]

I call it risk aversion because if cryonics works at all, it ups the stakes. The money dropped on signing up for it is a sure thing, so it doesn't factor into risk, and if I get frozen and just stay dead indefinitely (for whatever reason) then all I've lost compared to not signing up is that money and possibly some psychological closure for my loved ones. But the scenarios in which cryonics results in me being around for longer - possibly indefinitely - are ones which could be very extreme, in either direction. I'm not comfortable with such extreme stakes: I prefer everything I have to deal with to be within my finite lifespan, in the absence of having a near-certainty about a longer lifespan being awesome.

I don't doubt that there are some "nightmare" situations in which I'd prefer cryonics - I'd rather be frozen than spend the next seventy years being tortured, for example - but I don't live in one of those situations.

Comment author: Mulciber 24 April 2009 05:30:17AM 5 points [-]

That's starting to sound like a general argument for shorter lifetimes over longer ones. Is there a reason this wouldn't apply just as well to living for five more years versus fifty? There's more room for extreme positive or negative experiences in the extra 45 years.

Comment author: AndySimpson 24 April 2009 04:22:36AM 1 point [-]

I really do think we're all getting too worked up over the minutia of the karma system.

Agreed, but:

This isn't a game.

We must admit that to a great extent, it is. We are all attempting to make ourselves appear more useful to the community, and karma is the only quantitative way to tell if we're making progress. Like so many things, it feels like it trivializes but it is there for a purpose.

Comment author: Mulciber 24 April 2009 05:27:23AM 0 points [-]

We are all attempting to make ourselves appear more useful to the community

That gets to the heart of why I don't think the karma system is worth too much emphasis. Shouldn't we instead be attempting to make ourselves more useful to the community?

Like so many things, it feels like it trivializes but it is there for a purpose.

That's true. I do think we're better off with it than we would be without it, but it shouldn't get attention disproportionate to its purpose. It's a means to an end, nothing more.

Comment author: Alicorn 24 April 2009 12:51:19AM *  0 points [-]

I'm obviously not being very clear. I'm not making a case that it's irrational to sign up for cryonics - I'm just saying it's not appropriate for someone with a very high risk aversion, such as myself. I'm informed by the same person who taught me about levels of risk aversion in the first place that no given level of risk aversion is necessarily rational or irrational; it's just a personal characteristic. It's quite possible that by making these choices you'll be around, enjoying a great quality of life, in four thousand years, and I won't. That would be awesome for you and less awesome for me. I'm just not willing to take the bet.
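The notion of risk aversion invoked here is the standard decision-theoretic one, which a small sketch can make concrete (the utility function and numbers are purely illustrative): a risk-averse agent has a concave utility function, so it prefers a sure outcome to a gamble with the same expected value.

```python
import math

# Illustration of risk aversion (hypothetical numbers). With a concave
# utility function, a certain 50 beats a 50-50 gamble between 100 and 0,
# even though both have the same expected monetary value.

def concave_utility(wealth):
    return math.sqrt(wealth)

sure_thing = concave_utility(50)                                 # 50 for certain
gamble = 0.5 * concave_utility(100) + 0.5 * concave_utility(0)   # 50-50 on 100 or 0

print(sure_thing > gamble)  # True: the certain outcome is preferred
```

This is the sense in which "high risk aversion" favors lower-variance outcomes: it penalizes the extreme stakes, in either direction, rather than the expected value.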

Comment author: Mulciber 24 April 2009 02:36:43AM 2 points [-]

Describing this as being averse to risks doesn't make much sense to me. Couldn't a pro-cryonics person equally well justify her decision as being motivated by risk aversion? By choosing not to be preserved in the event of death, you risk missing out on futures that are worth living in. If you want to take this into bizarre and unlikely science fiction ideas, as with your dystopian cannon fodder speculation, you could easily construct nightmare scenarios where cryonics is the better choice. Simply declaring yourself to have "high risk aversion" doesn't really support one side over the other here.

This reminds me of a similar trope concerning wills: someone could avoid even thinking about setting up a will, because that would be "tempting fate," or take the opposite position: that not having a will is tempting fate, and makes it dramatically more likely that you'll get hit by a bus the next day. Of course, neither side there is very reasonable.
