AndrewWilcox comments on Open Thread: January 2010 - Less Wrong

Post author: Kaj_Sotala 01 January 2010 05:02PM


Comment author: AndrewWilcox 08 January 2010 01:43:44AM * 1 point

Hmm, I wonder if you could leave instructions, kind of like a living will except in reverse, so to speak... e.g., "only unfreeze me if you know I'll be able to make good friends and will be happy". Perhaps with a bit more detail explaining what "good friends" and "being happy" means to you :-)

If I were in charge of defrosting people, I'd certainly respect their wishes to the best of my ability.

And, if your life does turn out to be miserable, you can, um, always commit suicide then... you don't have to commit passive suicide now just in case... :-)

But it certainly is a huge leap in the dark, isn't it? With most decisions, we have some idea of the possible outcomes and a sense of likelihoods...

Comment author: Alicorn 08 January 2010 01:45:20AM 0 points

Why would they be in a position to know that I'd be able to make good friends and be happy?

Comment author: SoullessAutomaton 08 January 2010 02:52:04AM 1 point

Well, if everyone else they've revived so far has ended up a miserable outcast in an alien society, or some other consistent outcome, they might be able to take a guess at it.

Comment author: Alicorn 08 January 2010 03:00:40AM 0 points

Bit of a gap between "not a miserable outcast in an alien society" and "has good close friends".

Comment author: AndrewWilcox 08 January 2010 03:34:48AM 0 points

I can think of three possibilities...

One: if I'm in charge of unfreezing people, and I'm intelligent enough, it becomes a simple statistical analysis. I look at the totality of historical information available about the past lives of frozen people: forum posts, blog postings, emails, YouTube videos... and find out what correlates with the happiness or unhappiness of people who have been unfrozen. Then the decision depends on what confidence level you're looking for: do you want to be unfrozen if there's an 80% chance that you'll be happy? 90%? 95%? 99%? 99.9%?
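The threshold rule in that paragraph could be sketched roughly like this. This is a hypothetical toy, not anyone's actual procedure: it assumes prior revival outcomes are available as a simple list of happy/unhappy flags, and it uses a bare frequency estimate where a real analyst would use a proper statistical model over all those historical correlates.

```python
# Toy sketch of the "unfreeze only above a chosen confidence level" rule.
# All names here (estimated_happiness_prob, should_revive) are invented
# for illustration.

def estimated_happiness_prob(prior_outcomes):
    """Fraction of previously revived people who ended up happy.

    `prior_outcomes` is a list of booleans (True = happy), standing in
    for whatever richer historical data a future analyst would use.
    Returns None when there is no data to estimate from.
    """
    if not prior_outcomes:
        return None
    return sum(prior_outcomes) / len(prior_outcomes)

def should_revive(prior_outcomes, threshold=0.95):
    """Revive only if the estimated probability of happiness meets the
    person's chosen confidence level; with no data, default to waiting
    (matching the "if we don't know, we don't unfreeze you" instruction).
    """
    p = estimated_happiness_prob(prior_outcomes)
    return p is not None and p >= threshold

# With 9 of 10 prior revivals happy (estimate 0.9), an 80% threshold
# says revive, while a 95% threshold says keep waiting.
outcomes = [True] * 9 + [False]
print(should_revive(outcomes, threshold=0.80))  # True
print(should_revive(outcomes, threshold=0.95))  # False
```

Note the asymmetry the thread is circling around: the stricter the threshold you leave in your instructions, the more likely you are to stay frozen even in futures where you would in fact have been happy.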

Two: I might not be intelligent enough, there might not be enough data available, or we might fail to find useful statistical correlates. Then, if your instructions are not to unfreeze you when we don't know, we don't unfreeze you.

Three: I might be incompetent or mistaken, so that I unfreeze you even though there isn't any good evidence that you're going to be happy with your new situation.