Comment author: Kyre 04 December 2015 04:55:42AM *  1 point [-]

Here is a second Simulation Trilemma.

If we are living in a simulation, at least one of the following is true:

1) we are running on a computer with unbounded computational resources, or

2) we will not launch more than one simulation similar to our world, or

3) the simulation we are in will terminate shortly after we launch our own simulations.

Here 'shortly' means on the order of the time between the era at which we start our simulation and the point at which that simulation reaches our current stage.

Comment author: woodchopper 27 April 2016 03:55:23PM 0 points [-]

Why would us launching a simulation use more processing power? It seems more likely that the universe does a set amount of information processing and all we are doing is manipulating that in constructive ways. Running a computer doesn't process more information than the wind blowing against a tree does; in fact, it processes far less.

Comment author: Kyre 26 April 2016 05:42:13AM *  0 points [-]

Heh, that was really just me trying to come up with a justification for shoe-horning a theory of identity into a graph formalism so that Konig's Lemma applied :-)

If I were to try to make a more serious argument it would go something like this.

Defining identity - whether two entities are 'the same person' - is hard. People have different intuitions. But most people would say that 'your mind now' and 'your mind a few moments later' do constitute the same person. So we can define a directed graph with vertices as mind states ('mind states' would probably have been a better term than 'observer moments'), with outgoing edges leading to the mind states a few moments later.

That is kind of what I meant by "moment-by-moment" identity. By itself it is a local but not global definition of identity. The transitive closure of that relation gives you a global definition of identity. I haven't thought about whether it's a good one.

In the ordinary course of events these graphs aren't very interesting; they're just chains coming to a halt upon death. But if you were to clone a mind-state and put it into two different environments, that would give you a vertex with out-degree greater than one.

So mind-uploading would not break such a thing, and in fact without being able to clone a mind-state, the whole graph-based model is not very interesting.

Also, you could have two mind states that lead to the same successor mind state - for example where two different mind states only differ on a few memories, which are then forgotten. The possibility of splitting and merging gives you a general (directed) graph structured identity.

(On a side-note, I think people generally treat splitting and merging of mind states in a way that is far too symmetrical. Splitting seems far easier - trivial once you can digitize a mind-state. Merging would be like a complex software version-control problem, and you'd need to very carefully apply selective amnesia to achieve it.)
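The directed-graph model described above can be sketched in code. Everything here (the tiny example graph, the function names) is illustrative, not taken from the comment; it models vertices as mind states, edges as the local moment-by-moment relation, and global identity as the transitive closure of that relation:

```python
# Toy model of the identity graph: vertices are mind states, directed
# edges are the local "moment-by-moment identity" relation.
from collections import deque

# Adjacency list. 'a' splits into two branches (out-degree 2, e.g. a
# cloned mind-state placed in two environments); 'b1' and 'b2' merge
# back into 'c' (in-degree 2, e.g. the differing memories are
# forgotten); 'c' has no successors (the chain halts on death).
identity_graph = {
    "a":  ["b1", "b2"],   # split: out-degree > 1
    "b1": ["c"],
    "b2": ["c"],          # merge: 'c' has in-degree > 1
    "c":  [],
}

def same_person(graph, u, v):
    """Global identity via the transitive closure of the local
    relation: u and v count as 'the same person' if one is reachable
    from the other along moment-by-moment edges."""
    def reachable(src, dst):
        seen, frontier = {src}, deque([src])
        while frontier:
            node = frontier.popleft()
            if node == dst:
                return True
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return False
    return reachable(u, v) or reachable(v, u)

print(same_person(identity_graph, "a", "c"))    # True: both branches lead to c
print(same_person(identity_graph, "b1", "b2"))  # False: siblings aren't linked
```

Note the wrinkle this surfaces: under plain transitive closure, each branch of a split is the same person as its ancestor, yet the two branches are not the same person as each other, which illustrates the caveat above about whether this global definition is a good one.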

So, if we say "immortality" is having an identity graph with an infinite number of mind-states all connected through the "moment-by-moment identity" relation (stay with me here), and mind states only have a finite number of successor states, then there must be at least one infinite path, and therefore "eternal existence in linear time".

Rather contrived, I know.
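For reference, the König's Lemma step in the argument above can be written out. This is the standard textbook argument, not something from the comment; the only assumption imported from the identity model is that each mind state has finitely many successors:

```latex
% König's Lemma: if $G$ is an infinite connected graph in which every
% vertex has finite degree, then $G$ contains an infinite simple path.
%
% Proof sketch: build the path $v_0, v_1, v_2, \dots$ greedily.
Pick any vertex $v_0$. Since $G$ is connected and infinite, infinitely
many vertices are reachable from $v_0$. Suppose $v_n$ has been chosen so
that infinitely many vertices are reachable from $v_n$ without revisiting
$v_0, \dots, v_{n-1}$. Because $v_n$ has only finitely many neighbours,
the pigeonhole principle gives at least one neighbour $v_{n+1}$ through
which infinitely many of those vertices are still reachable. Iterating
forever yields an infinite simple path
\[
  v_0 \to v_1 \to v_2 \to \cdots
\]
Applied to the identity graph: an infinite graph of mind states in which
each state has finitely many successors must contain an infinite path,
i.e. "eternal existence in linear time".
```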

Comment author: woodchopper 27 April 2016 03:41:44PM 1 point [-]

So, the graph model of identity sort of works, but I feel it doesn't quite get to the real meat of identity. I think the key is in how two vertices of the identity graph are linked and what it means for them to be linked, because I don't think the premise that a person is the same person they were a few moments ago is necessarily justified, and in some situations it doesn't match intuition. For example, a person's brain is a complex machine, and it is being modified all the time as one learns new information, has new experiences, takes new substances, and so on. But imagine it were (using some extremely advanced technology) modified dramatically while the person was still conscious - so much so that over the course of a few minutes, a person who once had the personality and memories of, say, you ended up with roughly the personality and memories of Barack Obama. Could it really be said that it's still the same identity?

Why is an uploaded mind necessarily linked by an edge to the original mind? If the uploaded mind is less than perfect (and it probably will be; even if it's off by one neuron, one bit, one atom) and you can still link that with an edge to the original mind, what's to say you couldn't link a very, very dodgy 'clone' mind, like for example the mind of a completely different human, via an edge, to the original mind/vertex?

Some other notes: firstly, an exact clone of a mind is the same mind. This pretty much makes sense. So you can get away from issues like 'if I clone your mind, but then torture the clone, do you feel it?' Well, if you've modified the state of the cloned mind by torturing it, it can no longer be said to be the same mind, and we would both presumably agree that me cloning your mind in a far away world and then torturing the clone does not make you experience anything.

Comment author: qmotus 25 April 2016 05:17:05PM 0 points [-]
  1. Maybe; it would probably think so, at least if it wasn't told otherwise.

  2. Both would probably think so.

  3. All three might think so.

  4. I find that a bit scary.

  5. Wouldn't there, then, be some copies of me not being tortured and one that is being tortured?

Comment author: woodchopper 27 April 2016 01:30:58PM 0 points [-]

Wouldn't there, then, be some copies of me not being tortured and one that is being tortured?

If I copied your brain right now, but left you alive, and tortured the copy, you would not feel any pain (I assume). I could even torture it secretly and you would be none the wiser.

So go back to the scenario - you're killed, there are some exact copies made of your brain and some inexact copies. It has been shown that it is possible to torture an exact copy of your brain while not torturing 'you', so surely you could torture one or all of these reconstructed brains and you would have no reason to fear?

Comment author: qmotus 25 April 2016 11:27:36AM *  0 points [-]

Let's suppose that the contents of a brain are uploaded to a computer, or that a person is anesthetized and a single atom in their brain is replaced. What exactly would it mean to say that personal identity doesn't persist in such situations?

Comment author: woodchopper 25 April 2016 02:59:17PM 0 points [-]

So, let's say you die, but a super intelligence reconstructs your brain (using new atoms, but almost exactly to specification), but misplaces a couple of atoms. Is that 'you'?

If it is, let's say the computer then realises what it did wrong and reconstructs your brain again (leaving its first prototype intact), this time exactly. Which one is 'you'?

Let's say the second one is 'you', and the first one isn't. What happens when the computer reconstructs yet another exact copy of your brain?

If the computer told you it was going to torture the slightly-wrong copy of you (the one with a few atoms missing), would that scare you?

What if it was going to torture the exact copy of you, but only one of the exact copies? There's a version of you not being tortured, what's to say that won't be the real 'you'?

In response to Roughly you
Comment author: woodchopper 25 April 2016 11:13:56AM 0 points [-]

Why would something that is not atom to atom exactly what you are now be 'you'?

Comment author: qmotus 25 April 2016 10:18:41AM 0 points [-]

If there's no objective right answer, you can just decide for yourself. If you want immortality and decide that a simulation of 'you' is not actually 'you', I guess you ('you'?) will indeed need to find a way to extend your biological life. If you're happy with just the simulation existing, then maybe brain uploading or FAI is the way to go. But we're not going to "find out" the right answer to those questions if there is no right answer.

But I think the concept of personal identity is inextricably linked to the question of how separate consciousnesses, each feeling their own qualia, can arise.

Are you talking about the hard problem of consciousness? I'm mostly with Daniel Dennett here and think that the hard problem probably doesn't actually exist (but I wouldn't say that I'm absolutely certain about this), but if you think that the hard problem needs to be solved, then I guess this identity business also becomes somewhat more problematic.

Comment author: woodchopper 25 April 2016 10:48:07AM 0 points [-]

I think consciousness arises from physical processes (as Dennett says), but that's not really solving the hard problem or proving it doesn't exist.

Anyway, I think you are right that if you decide being mind-uploaded does or does not constitute continuing your personal identity, it's hard to say you are wrong. However, what if I don't actually know whether it does, yet I want to be immortal? Then we have to study the question to figure out which things keep the real 'us' existing and which don't.

What if the persistence of personal identity is a meaningless pursuit?

Comment author: qmotus 24 April 2016 06:15:42PM 0 points [-]

Isn't it purely a matter of definition? You can say that a version of you that differs by one atom is you, or that it isn't; or that a simulation of you either is or isn't you; but there's no objective right answer. It is worth noting, though, that if you don't tell the different-by-one-atom version, or the simulated version, of the fact, they would probably never question being you.

Comment author: woodchopper 25 April 2016 07:35:49AM 0 points [-]

If there's no objective right answer, then what does it mean to seek immortality? For example, if we found out that a simulation of 'you' is not actually 'you', would seeking immortality mean we can't upload our minds to machines and have to somehow figure out a way to keep the pink fleshy stuff that is our current brains around?

If we found out that there's a new 'you' every time you go to sleep and wake up, wouldn't it make sense to abandon the quest for immortality as we already die every night?

(Note, I don't actually think this happens. But I think the concept of personal identity is inextricably linked to the question of how separate consciousnesses, each feeling their own qualia, can arise.)

Comment author: Kyre 20 April 2016 04:46:08AM *  1 point [-]

If we take "immortality" to mean "infinitely many distinct observer moments that are connected to me through moment-to-moment identity", then yes, by Konig's Lemma.

(Every infinite connected graph whose vertices all have finite degree contains an infinite path.)

(edit: hmmm, does many-worlds give you infinite-branching into distinct observer moments ?)

Comment author: woodchopper 24 April 2016 05:38:30PM 0 points [-]

Can you elaborate on the concept of a connection through "moment-to-moment identity"? Would for example "mind uploading" break such a thing?

Comment author: turchin 23 April 2016 01:14:34PM 0 points [-]

It is a good question. The problem of personal identity is one of the most complex, like aging. I am working on a map of identity solutions, and it is very large.

If we decide that identity has some definition I, then death is the abrupt disappearance of I, and immortality is the idea that death never happens. It seems that this definition of immortality doesn't depend on the definition of identity.

But practically, the more fragile identity is, the more probable death is.

Comment author: woodchopper 24 April 2016 05:31:58PM 0 points [-]

The thing is, I'm just not sure if it's even a reasonable thing to talk about 'immortality' because I don't know what it means for one personal identity ('soul') to persist. I couldn't be sure if a computer simulated my mind it would be 'me', for example. Immortality will likely involve serious changes to the physical form our mind takes, and once you start talking about that you get into the realm of thought experiments like the idea that if you put someone under a general anaesthetic, take out one atom from their brain, then wake them up, you have a similar person but not the one who originally went under the anaesthetic. So from the perspective of the original person, undergoing their operation was pointless, because they are dead anyway. The person who wakes from the operation is someone else entirely.

I guess I'm just trying to say that immortality makes heaps of sense if we can somehow solve the question of personal identity, but if we can't, then 'immortality' may be pretty nonsensical to talk about, simply because if we cannot say what it takes for a single 'soul' to persist over time, the very concept of 'immortality' may be ill-defined.

I like your post about the heat death of the universe, if you ever figure anything out regarding the persistence of a personal identity, I'd like you to message me or something.

Comment author: Yosarian2 02 July 2015 02:20:21PM 1 point [-]

I think the "Use surviving particles for ever slower calculations" option is probably the most likely solution, assuming an empty-universe / heat-death scenario. It was shown, I believe, that based on the expected rate of the expansion of the universe, a thinking being could have a subjectively infinite period of time that way.

The converse is also possible; in a "big crunch" scenario, you would have a finite period of time, but the amount of energy available in any given volume of space would increase at an accelerating rate and approach infinity, so a being would (in theory) be able to think more and more quickly as the amount of energy available increases, and you could also experience an infinite amount of subjective time within an objectively finite time period.

(Of course, a "big crunch" seems very unlikely now, based on what we know of dark energy.)
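The first scenario (ever slower thought on a fixed energy budget) can be illustrated with a toy geometric-series calculation. The numbers are made up purely for illustration and are not from the comment:

```python
# Toy illustration of "ever slower calculations": if each successive
# thought uses half of the remaining energy budget, total energy spent
# never exceeds the budget, yet the number of thoughts is unbounded in
# the limit (here we stop only where floating point runs out).
energy = 1.0    # total energy budget (arbitrary units)
spent = 0.0     # energy consumed so far
thoughts = 0    # subjective moments experienced

while energy * 0.5 > 1e-12:
    step = energy * 0.5   # each thought costs half the remaining budget
    spent += step
    energy -= step
    thoughts += 1

# Dozens of thoughts occur while total spend stays strictly below 1.0,
# sketching how infinite subjective time could fit in finite energy.
print(thoughts, spent)
```

The same trade runs in reverse in the "big crunch" case described above: there the energy per unit volume grows without bound, so thinking can speed up rather than slow down, packing infinite subjective time into finite objective time.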

Comment author: woodchopper 23 April 2016 12:01:42PM *  0 points [-]

Currently it's pretty commonly believed that the end state of the universe is decayed particles, each moving away from every other particle faster than the speed of light, and therefore existing in an eternal and inescapable void. If you only have one particle you can't do calculations.
