Roughly you

4 JDR 21 April 2016 03:28PM

Since, like everyone, I generalise from single examples, I expect most people have some older relative or friend who they feel has added some wisdom to their life - some small pieces of information which seem to have pervasively wormed their way into more of their cognitive algorithms than one would expect, coloring and informing perceptions and decisions. For me, this person would be my grandfather. Over his now 92 years he has given me gems such as "always cut a pear before you peel it" (make quick checks of the value of success before committing to time-consuming projects) and, whenever someone says "that's never happened before", finishing their sentence with "said the old man when his donkey died" (just because something hasn't happened before doesn't mean it wasn't totally predictable).

Recently, though, I've been thinking about something else he has said, admittedly in mock seriousness: "If I lose my mind, you should take me out back and shoot me". We wouldn't, and he wouldn't expect us to, but it's what he has said.

The reason I've been thinking of this darker quotation is that I've been spending a lot of time with people who have "lost their minds" in the way that he means. I am a medical student on a rotation in old age psychiatry, so I have been talking to patients most of whom have some level of dementia, often layered with psychotic conditions such as intractable schizophrenia, some of whom increasingly can't remember their own pasts, let alone their recent present. They can become fixed in untrue beliefs, their emotional range can become limited, or they can lose the motivation to complete even simple tasks.

It can be scary. In some ways, such illness represents death by degrees. These people can remain happy and have a good quality of life, but it's certain that they are not entirely the people they once were. In fact, this is a question we have asked relatives when deciding whether someone is suffering from early dementia: "Overall, in the way she behaves, does this seem like your mother to you? Is this how your mother acts?" Sometimes, the answer is "No, it's like she is a different person", or "Only some of the time". It's a process of personality-approximation: blurring, abridging and changing the mind to create something not quite the same. What my grandfather fears is becoming a rough estimate of himself - though again, for some, that re-drawn person might be perfectly happy with who they are when they arrive.

Why is this of interest to LessWrong? I think it is because quite a few people here (me included) have at least thought about bidding to live forever using things like cryonics and maybe brain-download. These things could work at some point; but what if they don't work perfectly? What if the people of the future can recover some of the information from a frozen brain, but not all of it? What if we had to miss off a few crucial memories, a few talents, maybe 60 points of IQ? Or even more subtle things - it's been written a few times that the entirety of who a person is resides in their brain, but that's probably not entirely true. The brain is influenced by the body, and aspects of your personality are probably influenced by how sensitive your adrenals are, the amount of fat you have, and even the community of bacteria in your intestines. Even a perfect neural computer-you wouldn't have these things; the difference would be subtle, but the created immortal agent wouldn't completely be you, as you are now. Somehow, though, missing my precise levels of testosterone would seem an acceptable compromise for the rest of my personality living forever, but missing the memory of my childhood, half my intelligence or my ability to change my opinion would leave me a lot less sure.

So here's the question I want to ask, to see what people think: If I offered you partial immortality - immortality for just part of you - how rough an approximation of "you" would you be willing to accept?

Comment author: VoiceOfRa 12 November 2015 06:18:25AM 2 points [-]

God gave humans free will. Yes, He commands people to act morally, but He doesn't compel people to do so.

Comment author: JDR 12 November 2015 12:29:51PM 2 points [-]

He doesn't compel people to do so.

The threat of severe punishment if one goes against the commands seems pretty similar to compulsion to me. If I commanded you to do something on pain of being thrown in an eternal pit of snakes, you could reasonably say I was forcing you to do it.

I would also be interested to know how C.S. Lewis separated righteous divine intervention from omnipotent busybodiness, if anyone has the knowledge and a few minutes to save me from the terrible trials of actually looking this up myself!

Comment author: Lumifer 07 October 2015 05:10:47PM *  1 point [-]

If we could run the experiment so that

Most of my point is that you can not. Among other things, I change over time.

As a practical example, I drink beer. Various kinds of. My beer preferences do not converge over time. Instead, they wander over different styles, different hoppiness/maltiness/etc., even different breweries. I have no idea what kind of beer I will like in, say, a year, but it probably will be different from what I like now.

Showing that something works in a toy model does not show that the same thing works in actual reality.

Comment author: JDR 07 October 2015 05:58:09PM 0 points [-]

Sure, I totally agree with you - in real life, we can't really put a person in exactly the same situation twice. If we could, this whole free will argument would be a lot easier to solve.

That said, I do think the toy models are useful. Pretending we can do this experiment gives an answer to the problem that I've never managed to pick a hole in (and, to be honest, getting other people's input on it is the hidden motivation for entering this discussion):

If we could let you choose a beer, then rewind the universe - including all particles, forces, and known and unknown elements of cognition anyone might postulate, such as souls and deities - back to their starting position, then let it go again, there are only really two things that could happen: 1) you choose the same beer, because that's what the universe was leading up to, or 2) you choose a different beer, despite the fact that all parameters of the universe, known and unknown, are the same.

The first outcome would suggest determinism; the second randomness, or at least independence from all the variables which we consider "self", such as personality, memory and perhaps souls and suchlike, since they were all rewound with the universe. I'd be really interested to hear of any third option anyone can think of!

As you say, showing this in a toy model isn't the same as showing it in actual reality; but when the actual experiment is impossible, one is arguing about abstract concepts anyway, and when one has a lot of difficulty imagining outcomes not encompassed by the model, I'm not sure we can do much better.

Comment author: Lumifer 07 October 2015 03:53:00PM 1 point [-]

It's clear that if you put someone in very similar situations and ask them to make a choice, over time they will converge to making a certain choice a certain percentage of the time.

No, it's not clear at all. If you ask me to make choices in similar situations, first I might humor you, then I'll get bored and start fucking around with the system, and then I'll get really bored and stop cooperating with you. There won't be much of a convergence over time.

The abstraction is not the territory.

Comment author: JDR 07 October 2015 04:56:54PM 1 point [-]

then I'll get bored and start fucking around with the system, and then I'll get really bored and stop cooperating with you. There won't be much of a convergence over time.

The problem with that model seems to be that as time goes on, the situation you are put in becomes increasingly dissimilar to the original one, just because we've added memories of having had to make this choice some number of times before. If we could run the experiment so that you always felt like it was the first time you were in this situation - perhaps by putting the same kind of decision in different contexts, spreading the trials out over time, and adding various distractions - do you think you'd still deviate in the same way?

I know I'm going back from territory to less practical abstraction here, but I think this kind of difficult-to-collect data would be more revealing for this question.