This appears to be a broken adaptation. You never get any evidence about what the coin came up as, which is a key part of the original problem.
You might imagine playing the game with 9 copies of yourself, or with 9 perfect utility maximizers. This gets rid of the coordination problem by symmetry.
This appears to be a broken adaptation. You never get any evidence about what the coin came up as, which is a key part of the original problem.
It's not clear to me that you do get any evidence about what the coin came up as in the original problem. That is, in both this formulation and the previous formulation, there is the exact same amount of information transfer from the poser of the deal to the taker of the deal.
Imagine someone has a bag with 10 balls in it. Just one of them is marked with your name. They flip a coin: if heads, they pull nine balls from the bag; if tails, they pull one ball.
Sounds familiar.
Now, suppose you are told that your ball was pulled; you are called to be a judge. P(heads|called) = P(heads) × P(called|heads) / ( P(called|heads)×P(heads) + P(called|tails)×P(tails) ), applying Bayes' theorem and the law of total probability. Is this updated probability different from the prior of 1/2? Well, we need to know what P(called|heads) and P(called|tails) are. Since the problem is non-perverse, they're the nice round 9/10 and 1/10.
So P(called|heads) / ( P(called|heads)×P(heads) + P(called|tails)×P(tails) ), the "likelihood ratio" that basically measures how much evidence you get (there's a log in there somewhere), evaluates to (9/10) / (9/20 + 1/20) = 9/5. Therefore being called as judge gives you some evidence about whether the coin landed heads or tails.
Ok. You're offered a bet that is only valid if your ball is selected. That is enough to bring updating into the situation- you don't even need to know whether or not your ball will be selected! You say "Ok, there's a half chance my ball is not selected, and the bet is off. The other half of the time, the bet is on, and there's a 9/10ths chance that the coin came up heads, since I know my ball has been selected."
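If you'd rather not trust the algebra, the update is easy to check by simulation. A quick Monte Carlo sketch of the bag game above (my code, not part of the original argument; the exact Bayes answer is 9/10):

```python
import random

random.seed(0)
called_results = []
for _ in range(200_000):
    heads = random.random() < 0.5       # fair coin
    pulls = 9 if heads else 1           # heads: nine balls; tails: one
    # your ball is 1 of 10, so it is pulled with probability pulls/10
    if random.randrange(10) < pulls:
        called_results.append(heads)

p = sum(called_results) / len(called_results)
print(round(p, 2))  # ≈ 0.9, matching P(heads|called) = 9/10
```

Conditioning on "called" here is exactly the "the bet is on" case in the paragraph below: half the time your ball isn't pulled, and in the other half heads is nine times as likely as tails.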
As suggested earlier, this strategy only works if you throw away half of the outcome space under the assumption that you can't impact what happens there, despite the formulation of the problem being such that you do impact what happens there.
Bit-level reasoning suggests you should flip all bits, as each bit impacts the total result in 8/16 cases, and in seven of those cases the coin came up tails. 7/8*$28+1/8*$4=$25>$21.
What you call 'bit level reasoning' just seems like bad reasoning to me. After the flip (but before I am told) I am given the option of switching, a decision which would give me an expected $4/2+$28/2=$16 rather than the $21 I started with.
So no, I'm not swapping. Your 'bitwise' transition involves arbitrarily assigning too much weight to the 'tails' possibility.
Your 'bitwise' transition involves arbitrarily assigning too much weight to the 'tails' possibility.
Do you agree that is also the case in the previous formulation of the problem, after you have been told you are a decider?
Do you agree that is also the case in the previous formulation of the problem, after you have been told you are a decider?
I don't see a problem with either formulation. Just the solution. But it does seem to be the same mistake made with the proposed 'yea' solution in the previous formulation. In this case, however, the mistake appears even more obvious. So I could understand people making a bad decision on the previous formulation but a better decision this time. (If they switch here but stay 'nay' on the previous one then I wash my hands of them and let their flawed thinking remain opaque.)
In this case, however, the mistake appears even more obvious.
I agree. The challenge is articulating why it's a bad idea, rather than just recognizing it as such, and having an articulation that survives the transition back to the other formulation.
Or is the wrinkle in it solely that the individual analysis stumbles when it comes to dealing with coordinated action?
Bit-level reasoning suggests you should flip all bits, as each bit impacts the total result in 8/16 cases, and in seven of those cases the coin came up tails. 7/8*$28+1/8*$4=$25>$21.
Why are we comparing to $21, the pre-flip byte-level value? The bit-level "choose 1" calculation is implicitly adding the contributions of each impactful bit times the probability the bit is impactful, or 7*(7/8*$4) + 1*(1/8*$4) = $25. Similarly, let's write "choose 0" as 7*(7/8*$X) + 1*(1/8*$21).
The only way we're getting that calculation to equal $21, the byte-level calculation, is if $X is $3. As if each of the 7 bits is contributing 1/7 of the $21 reward. But how in the world can we justify setting $X to $3 in bit-level reasoning, taking into account the other bits, when we don't take the other bits into account when calculating the probability?
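A small sketch of that accounting (my arithmetic, where X is the hypothetical per-bit value of a selected 0 under tails):

```python
# "choose 1": seven bits weighted by the 7/8 chance of tails, plus one
# bit weighted by the 1/8 chance of heads, each selected bit paying $4
choose_1 = 7 * (7/8 * 4) + 1 * (1/8 * 4)
print(choose_1)  # 25.0

# "choose 0": solve 7*(7/8*X) + 1*(1/8*21) = 21 for X
X = (21 - 1/8 * 21) / (7 * 7/8)
print(X)  # 3.0, i.e. each of the 7 bits "contributing" 1/7 of $21
```

So the only value of X that reconciles the bit-level sum with the $21 byte-level answer is exactly $3, which is the tension described above.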
I'm not sure I see what you're suggesting instead, though I will point out I believe there is an error in that section- the point of this problem is to figure out what is wrong with the calculation suggesting you should defect.
I calculated the expected values for bytes with the bit-level analysis- that is, the bit-level analysis has done the "okay, you're a decider" updating but is still dealing with numbers on the scale of bytes. So $21 comes from 1/8 * $21 + 7/8 * $21=$21.
The justification for that is, if you sent in a byte of identical bits, all bits know that they are clones and so whatever they decide, all the others will decide as well.
Previous formulation here. (There's a link to the original formulation from there.)
I showed a related problem to someone and got back the objection "well, that's a coordination problem- you need game theory to model the other players, and so you can't simply declare a strategy correct." At first I thought that was an evasion, and so reformulated it so you make all the decisions. It seems to me that this problem is isomorphic to the previous formulation, except switching from nay to yay is more clearly ridiculous (if you disagree, I'd like to hear it!). Here it is:
To simplify calculations, assume you are risk-neutral with regards to dollars at this scale. You provide me a byte of 8 bits, all zeros or ones. I flip a fair coin: if it lands heads, I select one bit from the byte you provided me uniformly at random; if it lands tails, I select seven bits from the byte you provided me uniformly at random.
If all bits selected are 0s, I pay you $21. For every selected bit that is a 1, I pay you $4.
You do some byte-level calculations before I flip the coin, and decide that the byte 00000000 is best, because it has an expected value of $21/2+$21/2=$21. The coin is flipped, and then without telling you the flip I give you the option to flip every bit that I will select (or you imagine this, and submit the opposite byte instead). Bit-level reasoning suggests you should flip all bits, as each bit impacts the total result in 8/16 cases, and in seven of those cases the coin came up tails. 7/8*$28+1/8*$4=$25>$21. Byte-level reasoning suggests you shouldn't flip all the bits, because the byte 11111111 has an expected value of $4/2+$28/2=$16<$21. This is the conflict in the formulation linked above- before you know you're a decider, you think "nay" is superior, but once you know you're a decider, you calculate that "yay" is superior.
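As a sanity check on the byte-level numbers, the game as stated can be simulated directly (a sketch of mine; `payout` implements the rules from the paragraphs above):

```python
import random

random.seed(1)

def payout(byte, selected):
    """$21 if every selected bit is 0; otherwise $4 per selected 1."""
    if all(byte[i] == 0 for i in selected):
        return 21
    return 4 * sum(byte[i] for i in selected)

def expected(byte, trials=200_000):
    total = 0
    for _ in range(trials):
        heads = random.random() < 0.5
        k = 1 if heads else 7            # heads: 1 bit; tails: 7 bits
        total += payout(byte, random.sample(range(8), k))
    return total / trials

e_zeros = expected([0] * 8)
e_ones = expected([1] * 8)
print(e_zeros)  # 21.0 exactly: every selection from 00000000 pays $21
print(e_ones)   # ≈ 16 ($4/2 + $28/2)
```

The simulation is of the unconditional game, so it reproduces the byte-level values; the bit-level $25 figure only appears once you condition on a particular bit being selected.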
But this still has a coordination problem- the $21 payoff requires every bit to be a 0. What happens when we get rid of that?
Now, if I get heads, I pay out $21 for a 0 and $4 for a 1. If I get tails, I pay out $3 for each 0 and $4 for each one. E[00000000]=21, E[11111111]=16 (as before), but now E[0]=21*1/8+3*7/8=5.25 and E[1]=4*1/8+4*7/8=4. Now, by precisely the same amount as the byte-level analysis, E[0]>E[1]! Indeed, this appears to be general.[1]
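Spelling those four expectations out (plain arithmetic, nothing beyond the rules just stated):

```python
# byte-level expected values for the uncoordinated variant
E_zeros = 0.5 * 21 + 0.5 * 7 * 3   # heads: $21; tails: 7 bits at $3 each
E_ones  = 0.5 * 4  + 0.5 * 7 * 4   # heads: $4;  tails: 7 bits at $4 each
print(E_zeros, E_ones)  # 21.0 16.0

# bit-level, conditional on this bit being selected:
# heads with probability 1/8, tails with probability 7/8
E0 = 21 * 1/8 + 3 * 7/8
E1 = 4 * 1/8 + 4 * 7/8
print(E0, E1)  # 5.25 4.0
```

Note E_zeros - E_ones = 5 and 8*(E0 - E1) = 10... the byte-level and bit-level analyses now rank the strategies the same way, which is the point of removing the coordination requirement.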
Is there a way to formulate an Uncoordinated Psy-Kosh's Non-Anthropic Decision Problem? (UPKNADP for short.) Or is the wrinkle in it solely that the individual analysis stumbles when it comes to dealing with coordinated action? Note the issue isn't modeling other players- you're making the moves for all players in the game, and can model yourself perfectly. The issue is how you count the rewards associated with coordinated action.
1. You provide me with n bits. I flip a fair coin: if it lands heads, I select one bit uniformly at random and pay out a for every 0; if it lands tails, I select n-1 bits uniformly at random and pay out b/(n-1) for every 1. I can always add or subtract a constant amount to each prospect without changing the strategy for a risk-neutral player, meaning I can simplify the payout matrix down to just a and b.
The byte-level[2] calculations suggest that E[n 0s]=a*1/2+(n-1)*0*1/2=a/2 and E[n 1s]=0*1/2+(n-1)*(b/(n-1))*1/2=b/2. Your best strategy is to pick 0s if a>b and 1s if b>a (and you're indifferent if they're the same).
The bit-level calculations suggest that E[0s]=a*1/n+0*(n-1)/n=a/n and E[1s]=0*1/n+(b/(n-1))*(n-1)/n=b/n. Again, the payoff ratio is exactly the same- you should pick 0s if a>b and 1s if b>a (and you're indifferent if they're the same).
The byte-level and bit-level calculations agree, for all values of a and b and all n>1.
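A quick numerical check of that claim, computing the payouts straight from the rules for several values of a, b, and n (a sketch of mine; the values are chosen with a ≠ b to avoid the indifference case):

```python
# heads picks 1 bit and pays a per 0; tails picks n-1 bits and
# pays b/(n-1) per 1. Compare byte-level vs bit-level preferences.
for a in (1, 4, 21):
    for b in (2, 3, 28):
        for n in (2, 5, 8):
            byte_zeros = 0.5 * a                       # tails pays 0 to all-0s
            byte_ones = 0.5 * (n - 1) * (b / (n - 1))  # heads pays 0 to all-1s
            bit_zero = a * (1 / n)                     # selected & heads: 1/n
            bit_one = (b / (n - 1)) * ((n - 1) / n)    # selected & tails
            assert (byte_zeros > byte_ones) == (bit_zero > bit_one)
print("byte-level and bit-level orderings agree for all tested a, b, n")
```

This only exercises a grid of values, of course; the algebra above (a/2 vs b/2, a/n vs b/n) is what makes the agreement hold for all a, b, and n>1.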
2. I suppose I shouldn't use "byte" to refer to a set of n bits, but I'd rather have my cake and eat it too by both using byte and having a general set of n bits.