Luke_A_Somers comments on Can you recognize a random generator? - Less Wrong

Post author: uzalud, 28 December 2011 01:59PM


Comment author: Luke_A_Somers 28 December 2011 06:34:46PM 0 points [-]

There are many forms of Bayesianism, and I've only seen a few that are married to the notion that ALL uncertainty is due to ignorance and none due to nondeterminism.

QM, incidentally, even in MWI, is nondeterministic in the sense that you don't know which of the outcomes you personally will experience.

Comment author: wedrifid 28 December 2011 07:11:15PM 1 point [-]

QM, incidentally, even in MWI, is nondeterministic in the sense that you don't know which of the outcomes you personally will experience.

Yes I do. All of them. What I cannot predict is what my observation will be when it is determined by a quantum event that has already occurred but with which I have not yet had any interaction. That's no less deterministic than a 'for loop' in a computer: self-reflective code before the loop can predict exactly what is going to happen in the future, but code within the loop has to do a lookup of the counter variable (or a side effect) if it is going to behave conditionally.
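The 'for loop' analogy can be sketched in code (a hypothetical illustration, not anything from the thread: the function names are made up). The program as a whole is fully deterministic, so code before the loop can state every outcome in advance; code inside the loop can only branch by looking up the counter:

```python
# Hypothetical sketch of the 'for loop' analogy above.

def predict_before_loop(n):
    # Self-reflective code *before* the loop: the program is deterministic,
    # so the entire trace is knowable in advance.
    return ["even" if i % 2 == 0 else "odd" for i in range(n)]

def run_loop(n):
    trace = []
    for i in range(n):
        # Code *inside* the loop: to act conditionally, it must look up
        # the counter i (its 'local' view of an already-determined state).
        trace.append("even" if i % 2 == 0 else "odd")
    return trace

# Same outcomes either way - unpredictability from inside the loop does not
# make the program nondeterministic.
assert predict_before_loop(4) == run_loop(4)
```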

Comment author: shminux 28 December 2011 08:33:38PM 2 points [-]

Yes I do. All of them.

That's not a testable prediction, or a useful one.

Comment author: pragmatist 28 December 2011 11:24:37PM -2 points [-]

That's not a testable prediction, or a useful one.

It is in fact a testable prediction.

Comment author: shminux 28 December 2011 11:50:04PM 0 points [-]

I cannot find anything in that entry that suggests that experiencing all possible outcomes can be experimentally tested. Feel free to elaborate.

Comment author: pragmatist 29 December 2011 01:53:21AM 1 point [-]

Sorry, I should have elaborated, but I was short on time when I wrote the comment.

Let's say you set up a sequence of quantum experiments, each of which has a 90% chance (according to the Born probabilities) of killing you instantly and a 10% chance of leaving you unharmed. After a number of such experiments you find yourself alive. This is something you would expect if some form of MWI were true and if all surviving future selves had conscious experience continuous with yours. It is not something you would expect if a collapse interpretation were true, or if MWI combined with some sort of indeterminism (governed by Born's rule, presumably) about which future self continues your conscious experience were true. So such a sequence of experiments should lead you to update in favor of MWI + experience all possible outcomes.
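The arithmetic behind this update can be sketched as follows (a hypothetical illustration using the 90%/10% figures above; the function name is made up):

```python
# Hypothetical sketch of the quantum-suicide probabilities described above.
# Under a collapse interpretation, each trial leaves you alive with
# probability 0.1, so surviving n independent trials has probability 0.1**n.
# Under 'MWI + all surviving selves continue your experience', you expect
# to find yourself alive with certainty.

def collapse_survival_probability(n, p_survive=0.1):
    return p_survive ** n

print(collapse_survival_probability(10))  # ~ 1e-10, i.e. 1 in 10 billion
```

So after ten trials, finding yourself alive is a 1-in-10-billion event on the collapse view but a certainty on the MWI-plus-continuity view, which is why survival would drive the update.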

Comment author: shminux 29 December 2011 02:18:51AM *  1 point [-]

Sorry, I am having trouble taking quantum suicide/immortality seriously. How is this different from The Simple Truth:

Inspector Darwin looks at the two arguers, both apparently unwilling to give up their positions. “Listen,” Darwin says, more kindly now, “I have a simple notion for resolving your dispute. You say,” says Darwin, pointing to Mark, “that people’s beliefs alter their personal realities. And you fervently believe,” his finger swivels to point at Autrey, “that Mark’s beliefs can’t alter reality. So let Mark believe really hard that he can fly, and then step off a cliff. Mark shall see himself fly away like a bird, and Autrey shall see him plummet down and go splat, and you shall both be happy.”

If there is even a remote chance that Mark could fly, he probably flew in almost every universe in which he survived.

Now, suppose one really dedicated and overzealous grad student of Tegmark performs this experiment. The odds of the MWI being a good model might go up significantly enough for others to try to replicate it in the tiny subset of the universes where she survives. As a result, in a tiny minority of the universes Max gets a Nobel prize for this major discovery, whereas in most others he gets sued by the family of the deceased.

If EY believed in this kind of MWI, he would not bother with existential risks, since humanity will surely survive in some of the branches.

Comment author: pragmatist 29 December 2011 02:32:17AM *  1 point [-]

Now, suppose one really dedicated and overzealous grad student of Tegmark performs this experiment. The odds of the MWI being a good model might go up significantly enough for others to try to replicate it in the tiny subset of the universes where she survives. As a result, in a tiny minority of the universes Max gets a Nobel prize for this major discovery, whereas in most others he gets sued by the family of the deceased.

I'm not suggesting that this is a scientific experiment that should be conducted. Nor was I suggesting you should believe in this form of MWI. I was merely responding to your claim that wedrifid's position is untestable.

Also, note that a proposition does not have to meet scientific standards of interpersonal testability in order to be testable. If I conducted a sequence of experiments that could kill me with high probability and remained alive, I would become pretty convinced that some form of MWI is right, but I would not expect my survival to convince you of this. After all, most other people in our branch who conducted this experiment would be dead. From your perspective, my survival could be an entirely expected fluke.
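The first-person update described here can be sketched with Bayes' rule (a hypothetical illustration; the prior of 0.5 and the function name are assumptions, not anything from the thread):

```python
# Hypothetical Bayes sketch of the survivor's first-person update.
# Likelihood of observing yourself alive after n trials:
#   1.0 under 'MWI + you experience all surviving branches',
#   0.1**n under a collapse interpretation.

def posterior_mwi(prior=0.5, n=10, p_collapse=0.1):
    likelihood_mwi = 1.0
    likelihood_collapse = p_collapse ** n
    evidence = prior * likelihood_mwi + (1 - prior) * likelihood_collapse
    return prior * likelihood_mwi / evidence

print(posterior_mwi())  # ~ 0.9999999999 for the survivor
```

A third party, by contrast, mostly observes dead experimenters and sees nothing a collapse interpretation can't explain as a fluke, which is exactly the interpersonal-testability gap described above.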

If EY believed in this kind of MWI, he would not bother with existential risks, since humanity will surely survive in some of the branches.

I'm fairly sure EY believes that humanity will survive in some branch with non-zero amplitude. I don't see why it follows that one should not bother with existential risks. Presumably Eliezer wants to maximize the wave-function mass associated with humanity surviving.

Comment author: shminux 29 December 2011 05:11:54AM 0 points [-]

If I conducted a sequence of experiments that could kill me with high probability and remained alive, I would become pretty convinced that some form of MWI is right, but I would not expect my survival to convince you of this.

Probably, but I'm having trouble thinking of this experiment as scientifically useful if you cannot convince anyone else of your findings. Maybe there is a way to gather statistics from so-called "miracle survival stories" and see if there is an excess that can be attributed to MWI, but I doubt such an excess exists to begin with.

Presumably Eliezer wants to maximize the wave-function mass associated with humanity surviving.

Why? The only ones that matter are those where he survives.

Comment author: wedrifid 30 December 2011 03:50:55AM *  1 point [-]

Presumably Eliezer wants to maximize the wave-function mass associated with humanity surviving.

Why? The only ones that matter are those where he survives.

Only if he doesn't care at all about anyone else. That doesn't seem likely.

Comment author: pragmatist 30 December 2011 03:09:48AM *  0 points [-]

Why? The only ones that matter are those where he survives.

This seems like a pretty controversial ethical position. I disagree and I'm pretty sure Eliezer does as well. To analogize, I'm pretty confident that I won't be alive a thousand years from now, but I wouldn't be indifferent about actions that would lead to the extinction of all life at that time.

Comment author: ArisKatsaris 29 December 2011 05:22:21AM 0 points [-]

Why? The only ones that matter are those where he survives.

If they don't matter to you, that still doesn't necessitate that they don't matter to him. Each person's utility function may care about whatever it pleases.

Comment author: dlthomas 29 December 2011 12:17:16AM -2 points [-]

QM, incidentally, even in MWI, is nondeterministic in the sense that you don't know which of the outcomes you personally will experience.

This is broken because

which of the outcomes you personally will experience

is incoherent in the context of MWI. There is a "you" now, on this side of the event. There will be many people labeled "you" on the other side. There is no single person on the other side who corresponds to "you personally" while the event is still something you can say "will" about - at that point, it's "did".

Comment author: Luke_A_Somers 03 January 2012 05:14:00PM *  0 points [-]

Congratulations! You have constructed an interpretation of what I said that doesn't make sense.

Why don't you go back and try doing it the other way?

Comment author: dlthomas 03 January 2012 05:41:24PM 0 points [-]

Which other way?