kmccarty comments on Conditioning on Observers - Less Wrong

6 Post author: Jonathan_Lee 11 May 2010 05:15AM


Comment author: kmccarty 01 June 2010 05:45:52AM (0 points)

Perhaps this is beating a dead horse, but here goes. Regarding your two variants:

1. Same as SSB, except: if heads, she is interviewed on Monday, and then the coin is turned over to tails and she is interviewed on Tuesday. There is amnesia and all of that. So it's either the sequence (heads on Monday, tails on Tuesday) or (tails on Monday, tails on Tuesday). Each sequence has a 50% probability, and she should think of the days within a sequence as being equally likely. When asked about the current state of the coin, she should answer P(H)=1/4.

I agree. When iterated indefinitely, the Markov chain transition matrix is:

[ 0 1 0 0 ]
[ 1/2 0 1/2 0 ]
[ 0 0 0 1 ]
[ 1/2 0 1/2 0 ]

acting on state vector [ H1 H2 T1 T2 ], where H,T are the outcomes of the week's coin toss and 1,2 label Monday,Tuesday. (Note that in state H2 the coin has been turned over, so it currently shows Tails.) This has stationary probability eigenvector [ 1/4 1/4 1/4 1/4 ]; 3 out of 4 states show Tails (as opposed to the coin having been tossed Tails). By the way, we have unbiased sampling of the coin toss outcomes here.
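As a quick sanity check, here is a minimal Python sketch (my own, not from the original discussion) verifying that [ 1/4 1/4 1/4 1/4 ] is stationary for this transition matrix:

```python
# Stationarity check for the iterated variant-1 chain.
# States are ordered [H1, H2, T1, T2]; H2 is Tuesday of a heads week,
# where the coin has been turned over and now shows tails.
P = [
    [0.0, 1.0, 0.0, 0.0],   # from H1: always proceed to H2
    [0.5, 0.0, 0.5, 0.0],   # from H2: new week, fresh coin toss
    [0.0, 0.0, 0.0, 1.0],   # from T1: always proceed to T2
    [0.5, 0.0, 0.5, 0.0],   # from T2: new week, fresh coin toss
]

pi = [0.25, 0.25, 0.25, 0.25]
# One step of the chain: (pi P)_j = sum_i pi_i * P[i][j]
step = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]
assert step == pi            # the uniform distribution is stationary
print(pi[0])                 # P(coin currently shows heads): only H1 -> 0.25
```

Only state H1 shows Heads (H2's coin was turned over), so the stationary probability of seeing Heads is 1/4, matching the answer above.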

If the Markov chain model isn't persuasive, the alternative calculation is to look at the branching probability diagram

[http://entity.users.sonic.net/img/lesswrong/sbv1tree.png (SB variant 1)]

and compute the expected frequencies of letters in the result strings at each leaf on Wednesdays. This is

0.5 * ( H + T ) + 0.5 * ( T + T ) = 0.5 * H + 1.5 * T.
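The same expected frequencies can be checked by direct simulation of variant 1. This is a sketch of my own making, not from the original thread:

```python
import random

# Simulate many weeks of variant 1: heads week -> Monday shows H, then the
# coin is turned over so Tuesday shows T; tails week -> T on both days.
random.seed(0)
trials = 100_000
heads_obs = tails_obs = 0
for _ in range(trials):
    if random.random() < 0.5:   # week's toss lands heads
        heads_obs += 1          # Monday interview: coin shows H
        tails_obs += 1          # Tuesday interview: coin turned over, shows T
    else:                       # week's toss lands tails
        tails_obs += 2          # coin shows T on both days

print(heads_obs / trials, tails_obs / trials)  # ~0.5 H and ~1.5 T per week
```

The per-week averages come out near 0.5 Heads observations and 1.5 Tails observations, as in the expression above.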

2. Same as SSB, except: if heads, she is interviewed on Monday, and then the coin is flipped again and she is interviewed on Tuesday. There is amnesia and all of that. So it's either the sequence (heads on Monday, heads on Tuesday), (heads on Monday, tails on Tuesday), or (tails on Monday, tails on Tuesday). The first 2 sequences have a 25% chance each and the last one has a 50% chance. When asked about the current state of the coin, she should say P(H)=3/8.

I agree. Monday-Tuesday sequences occur with the following probabilities:

HH: 1/4
HT: 1/4
TT: 1/2

Also, the Markov chain model for the iterated process agrees:

[ 0 1/2 0 1/2 ]
[ 1/2 0 1/2 0 ]
[ 0 0 0 1 ]
[ 1/2 0 1/2 0 ]

acting on state vector [ H1 H2 T1 T2 ] gives stationary probability eigenvector [ 1/4 1/8 1/4 3/8 ], so P(H) = 1/4 + 1/8 = 3/8.
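Again a minimal Python check of my own (not from the original discussion) that this vector is stationary and yields P(H)=3/8:

```python
# Stationarity check for the iterated variant-2 chain, states [H1, H2, T1, T2].
# Here the coin is re-flipped on Tuesday after Monday heads, so state H2
# genuinely shows heads.
P = [
    [0.0, 0.5, 0.0, 0.5],   # from H1: Tuesday re-flip, to H2 or T2
    [0.5, 0.0, 0.5, 0.0],   # from H2: new week, fresh coin toss
    [0.0, 0.0, 0.0, 1.0],   # from T1: always proceed to T2
    [0.5, 0.0, 0.5, 0.0],   # from T2: new week, fresh coin toss
]

pi = [0.25, 0.125, 0.25, 0.375]
step = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]
assert step == pi            # [1/4, 1/8, 1/4, 3/8] is stationary
print(pi[0] + pi[1])         # P(coin currently shows heads) -> 0.375
```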

Alternatively, use the branching probability diagram

[http://entity.users.sonic.net/img/lesswrong/sbv2tree.png (SB variant 2)]

to compute expected frequencies of letters in the result strings,

0.25 * ( H + H ) + 0.25 * ( H + T ) + 0.5 * ( T + T ) = 0.75 * H + 1.25 * T
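Simulating variant 2 directly gives the same expected letter frequencies; a sketch of my own making:

```python
import random

# Simulate many weeks of variant 2: after Monday heads, the coin is
# flipped again for Tuesday; after Monday tails, both days show tails.
random.seed(1)
trials = 100_000
heads_obs = tails_obs = 0
for _ in range(trials):
    if random.random() < 0.5:       # Monday toss lands heads
        heads_obs += 1              # Monday interview shows H
        if random.random() < 0.5:   # Tuesday re-flip
            heads_obs += 1
        else:
            tails_obs += 1
    else:                           # Monday toss lands tails
        tails_obs += 2              # T on both Monday and Tuesday

print(heads_obs / trials, tails_obs / trials)  # ~0.75 H and ~1.25 T per week
```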

Because of the extra coin toss on Tuesday after Monday Heads, these are biased observations of coin tosses. (Are these credences?) But neither of these two variants is equivalent to Standard Sleeping Beauty or its iterated variants ISB and ICSB.

The 1/2 solution to SSB results from similar reasoning: 50% chance for the sequence (Monday and heads), 50% chance for the sequence (Monday and tails, Tuesday and tails). P(H)=1/2.

(Sigh.) I don't think your branching probability diagram is correct, and I don't know what other reasoning you are using. This is the diagram I have for Standard Sleeping Beauty:

[http://entity.users.sonic.net/img/lesswrong/ssbtree.png (Standard SB)]

And this is how I use it, applying exactly the same method as in the two examples above. With probability 1/2 the process accumulates 2 Tails observations per week, and with probability 1/2 it accumulates 1 Heads observation. The expected number of observations per week is 1.5: 0.5 Heads observations and 1.0 Tails observations.

0.5 * ( H ) + 0.5 * ( T + T ) = 0.5 * H + 1.0 * T

Likewise, when we record Monday/Tuesday observations per week instead of Heads/Tails, the expected number of Monday observations is 1 and of Tuesday observations 0.5, again totalling 1.5. But in both of your variants above, the expected number of Monday observations = expected number of Tuesday observations = 1.
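These per-week expected counts for Standard Sleeping Beauty can be confirmed by the same kind of simulation; again a sketch of my own, not from the original thread:

```python
import random

# Simulate many weeks of Standard Sleeping Beauty:
# heads week -> one interview (Monday, coin shows H);
# tails week -> two interviews (Monday and Tuesday, coin shows T both days).
random.seed(2)
weeks = 100_000
obs = heads_obs = tails_obs = monday = tuesday = 0
for _ in range(weeks):
    if random.random() < 0.5:    # heads week: Monday interview only
        heads_obs += 1
        monday += 1
        obs += 1
    else:                        # tails week: Monday and Tuesday interviews
        tails_obs += 2
        monday += 1
        tuesday += 1
        obs += 2

print(obs / weeks, heads_obs / weeks, tails_obs / weeks)  # ~1.5, ~0.5, ~1.0
print(monday / weeks, tuesday / weeks)                    # 1.0, ~0.5
```

The per-week averages come out near 1.5 total observations, 0.5 Heads and 1.0 Tails, with exactly 1 Monday observation and about 0.5 Tuesday observations per week, matching the counts above.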