It has been well over a year since I first read Permutation City and related writings on the internet about Greg Egan's Dust Theory. It still haunts me. The theory has been discussed tangentially in this community, but I haven't found an article that directly addresses the rationality of Egan's own dismissal of it.
In the FAQ, Egan says things like:
I wrote the ending as a way of dramatising[sic] a dissatisfaction I had with the “pure” Dust Theory that I never could (and still haven't) made precise (see Q5): the universe we live in is more coherent than the Dust Theory demands, so there must be something else going on.
and:
I have yet to hear a convincing refutation of it on purely logical grounds...
However, I think the universe we live in provides strong empirical evidence against the “pure” Dust Theory, because it is far too orderly and obeys far simpler and more homogeneous physical laws than it would need to, merely in order to contain observers with an enduring sense of their own existence. If every arrangement of the dust that contained such observers was realised, then there would be billions of times more arrangements in which the observers were surrounded by chaotic events, than arrangements in which there were uniform physical laws.
Isn't this, along with so many other problems, a candidate for our sometime friend the anthropic principle? That is: only in a conscious configuration field with memories of perceiving an orderly universe is the Dust Theory controversial or doubted. In the vastly more numerous conscious configuration fields with memories of perceiving a chaotic and disorderly universe, one lacking any rational way to support the observer, the Dust Theory could be accepted a priori, or at least be a favored theory.
It is fine to dismiss the Dust Theory because it simply isn't very helpful and because it makes no predictions, testable or otherwise. I suppose it is also fine never to question the nature of consciousness, as the answers don't seem to lead anywhere helpful either; though the question will continue to vex some instances of these configuration states.
Objection 1: many difficulties (Dust theory being one) are avoided if you simply do not use the term 'subjective experience'. Don't try to define it. Don't assume something exists that should be called that.
What is the discussion of 'subjective experience' needed for? What is the problem with discarding the entire concept? (I'm aware there are some problems, but I'm interested in your take on it because I think most of them can be explained away.)
Objection 2, to your item 3: the mapping of a 'mental state' to the configuration of some physical system is purely a matter of interpretation. The problem here is that you ascribe to physical configurations some properties that are normally reserved for causal sequences of physical states, i.e. outright simulations.
Suppose I have a model of your 'mentality' - that thing whose states are your mental states. Since it's embodied in a physical system, I can enumerate all possible mental states. Suppose there are countably many states (I don't know physics that well, but at the very least this is valid if you accept arbitrarily precise descriptions of the physical system as mapping to arbitrarily precise specifications of your corresponding mental state).
Now, I can write down a few (very large) numbers that map to some mental states of yours. Do you think my act of writing down these numbers literally brings into existence a subjective experience that did not otherwise exist?
(You may object because the actual numbers involved are so big they probably can't be literally written down even on a Universe-sized piece of paper. But I can use any physical system, not much more complex than the one I'm modelling (that's you), to encode the numbers. Pen and paper aren't privileged media.)
Suppose I find a number, or a series of numbers, that corresponds to a state of extreme suffering on your part. How many real-life resources would you invest to prevent me, an AI, from storing that number in my memory where no-one's looking? (If your answer is 'none at all', then what else does this theory make you do differently, ever?) Would you react differently if I stored a sequence of very similar numbers, which correspond to almost-indistinguishable successive states of a real brain? (And that's without taking into account the problems raised by rearranging the states in time.)
But that's not the real problem. Remember that the mapping of mental states to numbers is purely arbitrary: any ordering of the natural numbers will do. What makes a given number invoke a given mental state? Is it just my own mental intention in using it? What if I build a non-sentient AI proxy to do it for me? What if I proclaim that I use the number 1 to encode a state of suffering in my simulation of you - will you try to stop everyone in the universe from writing down '1'? Will you counter-proclaim that no, the number 1 actually encodes your state of supreme happiness?
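A minimal sketch of the point about arbitrariness, using an invented toy state space: the same integer names entirely different configurations under two equally legitimate decoders, so the number by itself picks out no state at all. (The 3-bit "mental states" and both decoders here are made up purely for illustration.)

```python
# Hypothetical enumeration of "mental states" as 3-bit configurations.
states = [format(i, "03b") for i in range(8)]  # '000' .. '111'

# Two equally arbitrary decoders: one uses the natural ordering,
# the other the reversed ordering. Neither is privileged.
decode_a = dict(enumerate(states))
decode_b = dict(enumerate(reversed(states)))

n = 1  # the act of "writing down" the number 1

# The integer alone determines nothing; only number + decoder does.
print(decode_a[n])  # '001' under the first interpretation
print(decode_b[n])  # '110' under the second
```

The same integer, two interpretations: whatever work the number is supposed to do in "invoking" a mental state is done entirely by the decoding convention, which is exactly the objection.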
Objection 3, to your C2: your logic is invalid. Compare: "somewhere in the universe are mental states which correspond to someone mentally identical to yourself experiencing eternal torture. Therefore you will experience eternal torture, starting a moment from now."
The problem is that you have not defined what 'you' means in this context. If there are many similar or identical states to "yours", embodied at different points in the universe, which one are "you"? If there are several identical ones that diverge, and some experience eternal torture and some experience eternal happiness, which one is then "you"? If somewhere there is a sequence of mental states that starts out with the same memories you (Jack) have right now, but its actual experiences are of being on Mars, then do you expect to be on Mars?
This would be the Subjective Dust Theory, except that it's wrong. It's empirically wrong: my experiences have been highly ordered in the past, so I expect them to be ordered in the future, and not to jump randomly around the universe just because there exist embodiments of every possible future state I might experience. Of course you could say I just happen to be a state that remembers an ordered past - that's the Boltzmann brain (timelessness) postulate - but you can't really conclude anything based on that, so I think it's better to assume our memories are real and we really live in an ordered universe.
It sounds like I should clarify that I don't actually endorse the argument. I'm just trying to make the argument explicit so that we can stop all the hand-waving.
I successfully referred to something with the phrase. I know I did because your response wasn't "Huh? What does that word mean?"...