Comment author: RichardKennaway 23 December 2015 10:49:00AM 4 points [-]

Revivees wake up with the memories they went to sleep with, but a great many of them have a growing conviction that they are the wrong person. For some this "dysidentity disorder" is acute and distressing, for others, merely a curiosity that they live with. All seem to have it to some extent. Nobody knows why.

Comment author: RowanE 29 December 2015 08:49:10PM 0 points [-]

That would seem to imply that memories don't make up who you are - what I'm inclined to read into it is "there are souls and they got moved around", though it could be anything - in which case, if there's a way to cause myself amnesia (and with this level of tech, why wouldn't there be?), I should just wipe out my memories and find out who I am. Ideally it'll also be possible to save the memories in backups somehow, or I'll have "external memory" like diaries and such, in case I start regretting the decision.

Comment author: Kaj_Sotala 24 December 2015 12:36:53PM *  3 points [-]

I've seen people consider the Warren Ellis take plausible. Excerpt:

Looking at her new charity-donated clothes, still bearing the ammonia spoor of the man who wore them last, Mary's shocked brain started to a new understanding.

She wasn't wanted here.

She was Revived out of a sense of begrudged duty. She'd been foisted upon a future already busy enough with its own problems by a past that couldn't have cared less.

She could have told the future what it'd been like to meet Che Guevara in that old Cuban schoolhouse. She could've told them about the last Queen and Albert Einstein and a million other true stories besides.

But the future didn't want to know. It honored the contracts with the past; revived them, gave them their money back (even adjusted the sum in their favor against revaluation and inflation), gave them the Hostels.

Put them away with a new, unspoken contract: Don't bother us. We're not interested.

Comment author: RowanE 29 December 2015 08:44:39PM 1 point [-]

That scenario still sounds awesome, as long as I'm comparing it to "no cryonics" instead of "best-case cryonics scenarios". I get to be dropped into a completely unfamiliar world with just my mind, a small sum of money, and a young healthy body? Sounds like a fun challenge - I mean, I died once; what have I got to lose?

Comment author: devas 23 December 2015 01:15:37PM 6 points [-]

You are one of the first to be revived.

The technique is imperfect, and causes you massive neurological damage (think late stage Alzheimer's), trapping you in a nonverbal yet incredibly painful and horrifying state.

Due to advances in gerontology, you have a nearly infinite lifespan ahead of you, cognizant only of what you have lost.

When neuroscience finally advances to the point where you can be fixed, it's still not yet advanced enough to give you back your memories.

You're effectively a completely different person, and you know that.

Comment author: RowanE 29 December 2015 08:06:53PM *  0 points [-]

Well, my current self, with its associated memories and opinions, is fine with the second part; this is basically just a Buddhist hell, after which I get reincarnated into the post-singularity future.

ETA: it's also highly unlikely, since it happening to me is conditional on the scenario happening to anyone.

Comment author: qmotus 20 October 2015 08:24:12AM 0 points [-]

So did I understand correctly, believing in big world immortality doesn't cause you an existential crisis, but not believing in it does?

Comment author: RowanE 22 October 2015 10:11:01PM 0 points [-]

Yes - I mean "existential crisis" in the sense of dread and terror from letting my mind dwell on my eventual death; convincing myself I'm immortal is a decisive solution to that, insofar as I can actually convince myself. I don't mind existence being meaningless - it is that either way - I care much more about whether it ends.

Comment author: qmotus 19 October 2015 08:17:13AM *  7 points [-]

It's often entertained on LessWrong that if we live in some sort of a big world, then conscious observers will necessarily be immortal in a subjective sense. The most familiar form of this idea is quantum immortality in the context of MWI, but arguably a similar sort of what I would call 'big world immortality' is also implied if, for example, we live in another sort of multiverse or in a simulation.

It seems to me that big world scenarios are well accepted here, but that a lot of people don't take big world immortality very seriously. This confuses me, and I wonder if I'm missing something. I suppose that there are good counterarguments that I haven't come across or that haven't actually been presented yet because people haven't spent that much time thinking about stuff like this. The ones I have read are from Max Tegmark, who's stated that he doesn't believe quantum immortality to be true because death is a gradual, not a binary process, and (in Our Mathematical Universe) because he doesn't expect the necessary infinities to actually occur in nature. I'm not sure how credible I find these.

So, should we take big world immortality seriously? I'd appreciate any input, as this has been bothering me quite a bit as of late and had a rather detrimental effect on my life. Note that I'm not exactly very thrilled about this; to me, this kind of involuntary immortality, that nevertheless doesn't guarantee that anyone else will survive from an observers point of view, sounds pretty horrible. David Lewis presented a very pessimistic scenario in 'How Many Lives Has Schrödinger's Cat' as well.

Comment author: RowanE 20 October 2015 05:40:49AM 2 points [-]

I consciously will myself to believe in big world immortality, as a response to existential crises, although I don't seem to have actual reasons not to believe such besides intuitions about consciousness/the self that I've seen debated enough to distrust.

Comment author: turchin 03 October 2015 07:46:17PM 0 points [-]

Basically we do this all the time when we communicate with a person and create a model of them in our head. I think it is moral to return everybody to life, except those who were explicitly and rationally against it.

We don't know how to solve the identity problem now, but maybe we will do some kind of practical research and find a solution in the future. Or maybe AI will help us. Until then, I suggest a conservative approach to identity: try to preserve as much as possible, and accept copy creation only if the alternative is death.

Maybe we could build a mechanism of identity transfer that is independent from information. If identity has any substance, like a soul or causal links, we could build machines that find it and preserve it.

There are also two types of immortality. Immortality-for-me, that is, immortality from the point of view of the observer, which is the most interesting; but also immortality-for-others, that is, immortality for your friends. Big world immortality from the link may work, but only for immortality-for-me, not for my friends, who may want to see me alive in 20 years here on Earth.

Big world immortality also helps cryonics and DI, because resurrected DI and cryonics clients will dominate the big-world resurrection landscape, and some of these resurrections will be exact copies of the originals. So big world immortality helps to fill in what is lost during cryopreservation.

Comment author: RowanE 03 October 2015 09:31:25PM 0 points [-]

Storing data that might be used to reconstruct someone in the future isn't really objectionable, but that seems separate from actually using that data to create the resurrection. And it probably works out fine in the utilitarian calculus, unless you count the sunk cost versus creating a "better" new person or a utility monster; but bringing someone back to life just because they didn't mention that they didn't want it, or because you thought the reason they gave for not wanting it was irrational, sounds really skeevy. We have rules about consent for interacting with other people's bodies; I think that includes implanting their consciousness in new bodies.

Comment author: skeptical_lurker 28 September 2015 08:36:22PM 3 points [-]

I've heard some people worry that even sufficiently good vibrators and fleshlights will reduce the amount of actual sex people have. Making a robot that a normal person can fall in love with might be AGI-complete (although some people do get attached to robot pets or form imaginary relationships with anime characters or 'waifu') but even a robot which isn't as good as a human still decreases the need for human company.

Also, is the creation of sex robots a matter of AI, or of creating realistic synthetic skin? If it's the latter, and AGI turns out to be really hard, then sex robots could easily come a long time before the singularity.

Comment author: RowanE 03 October 2015 08:04:05PM 0 points [-]

I believe the accepted plural of "waifu" is "waifus".

Comment author: RowanE 03 October 2015 01:18:21PM 1 point [-]

I know that at least in our specific community, we'd rather be resurrected than not - especially into a techno-utopian future, that almost goes without saying - but it still worries me that you don't seem to mention consent. The top paragraph, at least, suggests a third party collecting information about someone else so that they can be resurrected after their death, and even if we skip over the more normal issues with doing that, resurrecting someone without their permission seems like a violation.

Among the problems you've listed under 1. is whether this kind of resurrection is even necessary. Personally, I doubt those identity problems can be conclusively solved even in principle, at least to a level where people's intuitions don't dominate, although I'm inclined to give up on what's actually factual there and try to convince myself of the weakest notion of identity I can find believable. I can't do much pushing there, but the notion I default to based on my intuitions (sleeping doesn't kill you, uploading does kill you) is hard to justify, so I don't mind trying to push away from it. I should really be stepping up my DI efforts.

Comment author: RowanE 16 July 2015 10:30:35AM 2 points [-]

I think the word "reasonable" is used enough as an applause light rather than an actual descriptor that it should probably be put in "scare quotes" to defuse it through most of this essay.

In response to comment by RowanE on Crazy Ideas Thread
Comment author: polymathwannabe 09 July 2015 09:04:31PM 1 point [-]

Right now, yours is the first comment people see in this thread. Put yourself in the readers' shoes.

Comment author: RowanE 09 July 2015 09:22:22PM 0 points [-]

Well, I'll admit I only read the title before diving into the comments, and in that context it's sufficiently obvious. Although I'm twisted enough that adding descriptions like "ideas that spontaneously come to mind (and feel great)" just makes it funnier.
