Brilliand comments on Nash Equilibria and Schelling Points - Less Wrong

41 Post author: Yvain 29 June 2012 02:06AM


Comment author: Eliezer_Yudkowsky 29 June 2012 03:52:07PM 14 points

It's amazing, the results people come up with when they don't use TDT (or some other formalism that doesn't defect in the Prisoner's Dilemma - though so far as I know, the concept of the Blackmail Equation is unique to TDT.)

(Because the base case of the pirate scenario is, essentially, the Ultimatum game, where the only reason the other person offers you $1 instead of $5 is that they model you as accepting a $1 offer, which is a very stupid answer to compute if it results in you getting only $1 - only someone who two-boxed on Newcomb's Problem would contemplate such a thing.)

Comment author: Pentashagon 29 June 2012 11:42:55PM -1 points

I'll guess that in your analysis, given that the base case of D and E's game is a tie vote on a (D=100, E=0) split, the three-pirate game results in a (C=0, D=0, E=100) split, since E can blackmail C into giving up all the coins in exchange for staying alive. D may vote arbitrarily on a (C=0, D=100, E=0) split, so C must treat E as holding the deciding vote.

If so, that means four pirates would yield (B=0, C=100, D=0, E=0) or (B=0, C=0, D=100, E=0) in a tie. E expects 100 coins in the three-pirate game and so wouldn't be a safe choice of blackmailer, but C and D expect zero coins in the three-pirate game, so B could choose between them arbitrarily. B can't give fewer than 100 coins to either C or D because they will punish that behavior with a deciding vote for death, and B knows this. It's potentially unintuitive for C because C's expected value in a three-pirate game is 0, but if C commits to voting against B for anything less than 100 coins, and B knows this, then B is forced to give C either 0 or 100 coins. The remaining coins must go to D.

In the case of five pirates, C and D expect more than zero coins on average if A dies, because B may choose arbitrarily between C and D as blackmailer. B and E expect zero coins from the four-pirate game. A must maximize the chance that two or more pirates will vote for A's split. C and D each have an expected value of 50 coins from the four-pirate game if they assume B will choose randomly, so an (A=0, B=0, C=50, D=50, E=0) split is no better for C and D than B's expected offer, and any fewer than 50 coins for C or D will certainly make them vote against A. I think A should offer (A=0, B=n, C=0, D=0, E=100-n), where n is mutually acceptable to B and E.

Because B and E have no relative advantage in a four-pirate game (both expect zero coins) they don't have leverage against each other in the five-pirate game. If B had a non-zero probability of being killed in a four-pirate game then A should offer E more coins than B at a ratio corresponding to that risk. As it is, I think B and E would accept a fair split of n=50, but I may be overlooking some potential for E to blackmail B.

Comment author: Brilliand 30 September 2015 10:08:02PM 0 points

In every case of the pirates game, the decision-maker assigns one coin to every pirate an even number of steps away from himself, and the rest of the coins to himself (with more gold than pirates, anyway; things can get weird with large numbers of pirates). See the Wikipedia article Kawoomba linked to for an explanation of why.
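The pattern Brilliand describes falls out of backward induction on the textbook version of the game (not the blackmail variant discussed above). A minimal Python sketch, assuming the standard rules: a proposal passes with at least half the votes, the proposer votes for his own split, and a pirate votes against whenever indifferent, so bought votes cost one coin more than the continuation payoff:

```python
import math

def pirate_splits(n, coins=100):
    """Backward induction for the standard pirate game.

    Returns a dict mapping k (number of pirates) to the accepted split,
    listed from the proposer (most senior) down to the most junior pirate.
    """
    splits = {1: [coins]}
    for k in range(2, n + 1):
        # Payoffs to the k-1 junior pirates if the proposer is thrown overboard.
        prev = splits[k - 1]
        # Proposer needs ceil(k/2) votes including his own, so he buys
        # ceil(k/2) - 1 others -- the ones cheapest to flip.
        need = math.ceil(k / 2) - 1
        cheapest = sorted(range(k - 1), key=lambda i: prev[i])[:need]
        offer = [0] * (k - 1)
        for i in cheapest:
            offer[i] = prev[i] + 1  # one coin more than the continuation payoff
        splits[k] = [coins - sum(offer)] + offer
    return splits

print(pirate_splits(5)[5])  # [98, 0, 1, 0, 1]
```

For five pirates this reproduces the familiar (A=98, B=0, C=1, D=1... rather, C=1, E=1) result: one coin to every pirate an even number of steps below the proposer, the rest to the proposer, exactly as Brilliand summarizes.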