Another problem with quantum measure
Let's play around with the quantum measure some more. Specifically, let's posit a theory T which claims that the quantum measure of our universe is increasing - say by 50% each day. Why could this be happening? Well, here's a quasi-justification: imagine there are lots and lots of universes, most of them in chaotic random states, jumping around to other chaotic random states in accordance with the usual laws of quantum mechanics. Occasionally, one of them will partially tunnel, by chance, into the same state our universe is in - and will then evolve forwards in time exactly as our universe does. Over time, our universe accumulates an ever-growing measure.
That theory sounds pretty unlikely, no matter what feeble attempts are made to justify it. But T is observationally indistinguishable from our own universe, and has a non-zero probability of being true. It's the reverse of the (more likely) theory presented here, in which the quantum measure was being constantly diminished. And it's very bad news for theories that treat the quantum measure (squared) as akin to a probability, without ever renormalising. It implies that one must continually sacrifice for the long-term: any pleasure today is wasted, as that pleasure will be weighted so much more tomorrow, next week, next year, next century... A slight fleeting smile on the face of the last human is worth more than all the ecstasy of the previous trillions.
One solution to the "quantum measure is continually diminishing" problem was to note that as the measure of the universe diminished, it would eventually get so low that any alternative, non-measure-diminishing theory, no matter how initially unlikely, would predominate. But that solution is not available here - indeed, that argument runs in reverse, and makes the situation worse. No matter how initially unlikely the "quantum measure is continually increasing" theory is, eventually, the measure will become so high that it completely dominates all other theories.
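To see how fast the domination kicks in, here is a small sketch. The 50% daily growth rate comes from the post; the prior of 10^-12 for T is an illustrative assumption, not anything argued for above:

```python
# Sketch: a theory T with a tiny prior but exponentially growing measure
# eventually dominates a static-measure theory. Numbers are illustrative.
prior_T = 1e-12      # assumed prior probability of the measure-increasing theory T
daily_growth = 1.5   # under T, measure multiplies by 1.5 each day

# Weight of T relative to a static-measure theory with prior ~1, after d days:
for d in (0, 30, 69, 100):
    weight = prior_T * daily_growth ** d
    print(d, weight)

# 1.5**d first exceeds 1e12 at d = 69, so within about ten weeks T's
# weighted measure overtakes the static theory despite its minuscule prior.
```

The point being that no finite prior penalty survives: pick any prior, and the crossover day just shifts logarithmically.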
Comments (33)
Is it fair to think of this as related to Pascal's mugging? That problem derived disproportionate EV from "utilities grow faster than complexities" (so we had a utility growing faster than its probability was shrinking), and this one derives them from "if this hypothesis is true, all utilities grow over time" (so we have a utility growing while its probability remains fixed).
Yes, very fair indeed. And even correct! :-)
Because the total amount of measure is a free parameter in quantum mechanics that we normally just fix at 1, an increase in the total amount of measure is not merely observationally indistinguishable from things staying the same, but is actually the same. One can shift measure around, but then you couldn't go above 1.
We can posit that the total amount of measure in the universe(s) is arbitrarily high, given an arbitrary long amount of time for the measure to grow (the usual "unbelievably longer than the total age of the universe raised to the power of the number of particles in the universe, etc...")
And since quantum mechanics cares about relative measure of different observations, we could also posit that total measure is infinite.
We could not have an infinite pool of measure to draw on, because if total measure was infinite, then any finite pieces could not interact without breaking linearity.
And, again, just because you can double the total amount of measure in your representation, doesn't mean that this number is physically meaningful. If the number was arbitrary to begin with, there's no reason to assume that changing it is meaningful.
Can you explain?
But you're doubling the total amount of measure relative to the total measure of the rest of the universe, a change that is non-arbitrary for many decision theories.
Suppose I start with a big blob of measure in a boring universe, that is slowly turning into universes like ours. Linearity says that the rate at which universes like ours appear is proportional to how big the big blob of measure is.
In fact, this is crucial to calling it "measure" rather than just "that number in quantum mechanics."
So if the rate of universes like ours appearing is proportional to the size of the original blob, as we make the size of the original blob infinite, we also make the rate of universes like ours appearing infinite. We cannot have a finite number of universes like ours, but an infinite blob of measure turning into them - we can only have a proportionally smaller infinite amount of universes like ours. This requirement gives us back our old limitations about eventually running into a maximum.
So, pretending that this sort of thing has any significance, you would also expect some worlds to tunnel, by chance, into neighboring states, such as might result from making different decisions. So the argument for always sacrificing in favor of future gains falls down: most of the measure for worlds in which you get the future benefits of the sacrifice comes from quantum fluctuations, not from the sacrifice itself, since both available worlds - the one where you make the sacrifice and the one where you don't - accumulate measure from random tunneling regardless of your choice. You should make your decision based on the amount of measure you actually affect, not the amount that happens to merge into the same state you might cause. (And the ever-growing number of branches this theory says would be accumulating measure just shows further how ridiculous it is.)
Er... this isn't a serious theory of physics I've put forwards!
My critique of the physics was more of an aside. The main point was the critique of the decision theory: that under the assumptions of this non-serious theory of physics, most of the measure of the various outcomes is independent of your decisions, and you should only base your decisions on the small amount of measure you actually affect.
But whether that small amount is increasing in time or not is very relevant to your decision (depending on how your theory treats measure in the first place).
My point was that under your assumptions, the amount you affect does not increase in time at all, only the amount you do not affect increases.
?
Er no, you can still make choices that increase or decrease utility. It's simply that the measure of the consequences of these choices keeps on increasing.
Suppose you are in a world with measure M and are choosing between A and B, where A results in world WA, which includes an immediate effect worth 4 utilons per measure, and B results in world WB, which includes a later effect at time T worth 3 utilons per measure. Suppose further that, under your non-serious theory, random quantum fluctuations have by time T added measure 10M to each of the worlds WA and WB. Your choice between A and B is then a choice to add measure M either to world WA or to world WB.

Choice A: WA immediately has measure M, worth 4M utilons. Later, at time T, WA has measure 11M (worth 0 utilons) while WB has measure 10M (worth 30M utilons), for a total of 34M utilons.

Choice B: WB immediately has measure M (worth 0 utilons). At time T, WA has measure 10M (worth 0 utilons) and WB has measure 11M (worth 33M utilons), for a total of 33M utilons.

So you choose A for 34M instead of B for 33M utilons, for the same reasons that, without the non-serious theory, you would choose A for 4M utilons instead of B for 3M utilons. Your non-serious theory should not impact your decisions, because your decisions do not control which worlds it adds measure to.
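The arithmetic in this example can be checked directly; all the numbers come from the comment itself, with the measure M normalized to 1:

```python
# Check of the A-versus-B calculation, with measure M normalized to 1.
M = 1.0
tunneled = 10 * M   # measure added to each world by fluctuations at time T

# Choice A: add M to WA now. The immediate effect (4 utilons per measure)
# pays on that M; the later effect (3 utilons per measure) pays only in WB,
# which by time T holds only the tunneled measure.
utility_A = 4 * M + 3 * tunneled          # 4 + 30 = 34

# Choice B: add M to WB (no immediate effect). At time T, WB holds the
# tunneled measure plus the M we added, all of which gets the later effect.
utility_B = 0 + 3 * (tunneled + M)        # 33

print(utility_A, utility_B)               # prints 34.0 33.0 -> choose A
```

The 30M utilons from tunneled measure appear in both totals, so the choice reduces to 4M versus 3M, exactly as without the theory.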
I was envisaging utilons being "consumed" at the time they were added (say people eating chocolate bars). So choosing A would add 4M utilons, and choosing B would add 33M utilons.
My example is entirely compatible with this.
So the problem here is that you are not accounting for the fact that choosing A in the measure-M world does not prevent the accumulation of measure 10M in world WB from quantum fluctuations. You get those 30M utilons whether you choose A or B; choosing A gets you an immediate 4M additional utilons, while choosing B gets you a deferred 3M utilons.
A and B could be logically incompatible worlds, not simply different branches of the multiverse.
Due to standard entropy arguments, I would say that the chance that the aforementioned theory is correct and as such the universe is increasing in measure is orders of magnitude less likely than the reverse, that the universe is shrinking in measure. Thus, when summing over all possible worlds, the theory you suggest has a much lower weight than its reverse, and so is vastly outweighed.
This still leaves the opposite problem you mention, but seeing that these problems are opposites, it makes sense that only one can be the real problem.
But exponential growth will make short work of orders of magnitude...
Clarification: The probability is orders of magnitude less. This is a difference more than maintained under exponential growth. Example: if p=0.1, q=0.01, then p^n=1/10^n, while q^n is 1/10^(2n). Thus for all n>0, p is at least 10 times q, and in fact is 10^n times q, a difference that rapidly grows as n grows. As you can see, far from making short work of it, exponential growth only broadens the gap.
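The arithmetic in that clarification is easy to verify exactly, using rationals to avoid floating-point noise (the specific values p = 0.1, q = 0.01 are the ones from the example):

```python
# If q = p**2, then p**n / q**n = (1/p)**n, so the gap between the two
# probabilities widens by a factor of 10 with every step of n here.
from fractions import Fraction

p = Fraction(1, 10)    # p = 0.1
q = Fraction(1, 100)   # q = 0.01

for n in (1, 2, 5):
    ratio = (p ** n) / (q ** n)
    print(n, ratio)    # prints 1 10, then 2 100, then 5 100000
```

Whether repeatedly exponentiating the probabilities is the right model of the situation is a separate question; this only checks that, under that model, the gap grows rather than shrinks.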
What are analogs of p, q and n here?
It feels to me like you're assuming that P(the universe is increasing in measure) is a function of the universe's current measure, which seems odd. But if it's not, then (I believe Stuart's claim is) no matter how small the probability, an increasing universe eventually has enough value to make it a dominant hypothesis in terms of EV.
I am working on the assumption that we have a theory (of low probability) that posits that the universe is continually increasing its measure, rather than having an independent low probability of measure increase at every moment.
This is related to the doomsday argument. Once you accept the idea that you can use your existence as evidence, you can show that the total measure of consciousness in the universe must be finite, otherwise you would not be within a googolplex years of the big bang. This disproves the increasing quantum measure theory.