I don't believe any of the various purely computational definitions of personhood and survival, so just preserving the shapes of neurons, etc., doesn't mean much to me. My best bet is that the self is a single physical thing, a specific physical phenomenon, which forms at a definite moment in the life of the organism, persists through time even during unconsciousness, and ceases to exist when its biological matrix becomes inhospitable. For example, it might be an intricate topological vortex that forms in a (completely hypothetical) condensate of phonons and/or biophotons, somewhere in the cortex.
That is just a wild speculation, made for the sake of concreteness. But what is really unlikely is that I am just a virtual machine, in the sense of computer science - a state machine whose states are coarse-grainings of the actual microphysical states, and which can survive to run on another, physically distinct computer, so long as it reproduces the rough causal structure of the original.
Physically, what is a computer? Nuclei and electrons. And physically, what is a computer program? It is an extreme abstraction of what some of those nuclei and electrons are doing. Computers are designed so that these abstractions remain valid - so that the dynamics of the virtual machine will match the dynamics of the physical object, unless something physically disruptive occurs.
The physical object is the reality; the virtual machine is just a concept. But the information-centric theory of what minds and persons are is that they are virtual machines - a reification of a conceptual construct. It is false, above all, to the robust reality of consciousness, which is why I insist on a theory of the self that is physical and not merely computational.
I don't want to belabor this point, but just want to make clear again why I dissent from the hundred protean ideas out there, about mind uploading, copies, conscious simulations, platonic programs, personal resurrection from digital brain-maps, and so on, in favor of speculations about a physical self within the brain. Such a self would surely have unconscious coprocessors, other brain regions that would be more like virtual machines, functional adjuncts to the conscious part, such as the immediate suppliers of the boundary conditions which show up in experience as sensory perceptions. But you can't regard the whole of the mind as nothing but virtual machines. Some part of it has to be objectively real.
What would be the implications of this "physical" theory of identity, for cryonics? I will answer as if the topological vortex theory is the correct one, and not just a placeholder speculation.
The idea is that you begin to exist when the vortex begins to exist, and you end when it ends. By this criterion, the odds look bad for the proposition that survival through cryonics is possible. I could invent a further line of speculation as to how the web of quantum entanglement underlying the vortex is not destroyed by the freezing process, but instead gets locked into the ground state of the frozen brain. Such a thing is certainly thinkable, but that's all; it is equally thinkable that the condensate hosting the vortex depends for its existence on a steady expenditure of energy provided by cellular metabolism, and must therefore disintegrate when the cells freeze. From this perspective cryonics looks like an unlikely gamble, a stab in the dark. So an advocate would have to fall back on the old argument that even if the probability of survival through cryonics is close to zero, the probability of survival without cryonics is even closer to zero.
What about the idea of surviving by preserving your information? The vortex version of this concept is, OK, during this life you are a quantum vortex in your brain, and that vortex must cease to exist in a cryonically preserved brain; but in the future we can create a new vortex in a new brain, or in some other appropriate physical medium, and then we can seed it with information from the old brain. And thereby, you can live again - or perhaps just approximate-you, if only some of the information got through.
To say anything concrete here requires even more speculation. One might say that the nature of such resurrection schemes would depend a great deal on the extent to which the details of a person depend on information in the vortex itself, versus information in the vortex's virtual coprocessors. Is the chief locus of memory a virtual machine outside of and separate from the conscious part of the brain, coupled to consciousness so that memories simply appear there as needed, or are there aspects of memory embedded in the vortex-self itself? Reproducing the latter would require not just the recreation of memory banks adjoining the vortex-self, but the shaping and seeding of the inner dynamics of the vortex.
Either way, personally I find no appeal in the idea of "survival" via such construction of a future copy. I'm a particular "vortex" already; when that definitively sputters out, that's it for me. But I know many others feel differently, and such divergent attitudes might still exist, even if a vortex revolution in philosophy of mind replaced the program paradigm.
I somewhat regret the extremely speculative character of these remarks. They read as if I'm a vortex true believer. The point is to suggest what a future alternative to digital crypto-dualism might look like.
How much do you want to bet on the conjunction of yours?
Just for exercise, let's estimate the probability of the conjunction of my claims.
claim A: I think the idea of a single "self" in the brain is provably untrue according to currently understood neuroscience. I do honestly think so, therefore P(A) is as close to 1.0 as makes no difference. Whether I'm right is another matter.
claim B: I think a wildly speculative vague idea thrown into a discussion and then repeatedly disclaimed does little to clarify anything. P(B) approx 0.998 - I might change my mind before the day is out.
claim C: The thing I claim to think in claim B is in fact "usually" true. P(C) maybe 0.97, because I haven't really thought it through, but I reckon a random sample of 20 such instances would be unlikely to reveal 10 exceptions, which would defeat the "usually".
claim D: A running virtual machine is a physical process happening in a physical object. P(D) very close to 1, because I have no evidence of non-physical processes, and sticking close to the usual definition of a virtual machine, we definitely have never built and run a non-physical one.
claim E: You too are a physical process happening in a physical object. P(E) also close to 1. Never seen a non-physical person either, and if they exist, how do they type comments on lesswrong?
claim F: Nobody knows enough about the reality of consciousness to make legitimate claims that human minds are not information-processing physical processes. P(F) = 0.99. I'm pretty sure I'd have heard something if that problem had been so conclusively solved, but maybe they were disappeared by the CIA or it was announced last week and I've been busy or something.
P(A ∧ B ∧ C ∧ D ∧ E ∧ F) ≈ 0.998 × 0.97 × 0.99 ≈ 0.96.
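Just to show my working: treating the six claims as independent (which is itself an approximation - they plausibly correlate), the conjunction's probability is just the product of the individual estimates. A few lines of Python, with the probabilities above hard-coded, confirm the rounding:

```python
from math import prod

# My estimated probabilities for claims A through F
# (A, D, and E are "as close to 1 as makes no difference")
p = {"A": 1.0, "B": 0.998, "C": 0.97, "D": 1.0, "E": 1.0, "F": 0.99}

# Assuming independence, the conjunction's probability is the product
conjunction = prod(p.values())
print(round(conjunction, 2))  # → 0.96
```

If the claims were strongly correlated the true conjunction probability would be higher than this product, so 0.96 is, if anything, a lower bound under these estimates.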
The amount of money I'd bet would depend on the odds on offer.
I fear I may be being rude by actually answering the question you put to me instead of engaging with your intended point, whatever it was. Sorry if so.