Manfred comments on The Ellsberg paradox and money pumps - Less Wrong

Post author: fool 28 January 2012 05:34PM

Comment author: Manfred 31 January 2012 08:13:37PM 0 points

I mean that I could offer you $9 on green for $2.50, $9 on blue for $2.50, and $9 on red for $3.01, and you wouldn't take any of those bets, despite the fact that, taken together, they guarantee a profit of 99 cents. This "type 2" Dutch book argument (not really a Dutch book, but it shows a similar thing for the same reasons) is based on the principle that if you're passing up free money, you're doing something wrong :P
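A minimal sketch of the arithmetic behind that bundle, assuming the usual Ellsberg urn where exactly one of red, green, and blue is drawn (the prices are the ones quoted above):

```python
# Hypothetical illustration of the bet bundle: exactly one color wins,
# so exactly one $9 bet pays out, and the profit is the same either way.
bets = {"green": 2.50, "blue": 2.50, "red": 3.01}  # price paid for each bet
payout = 9.00                                      # each bet pays $9 if its color is drawn

total_cost = sum(bets.values())                    # $8.01 spent up front
for winner in bets:
    profit = payout - total_cost                   # only the winning bet pays out
    print(f"{winner} drawn: profit = ${profit:.2f}")
```

Whichever color is drawn, the buyer of all three bets comes out 99 cents ahead, which is the "free money" the argument turns on.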

Comment author: fool 31 January 2012 10:14:28PM 1 point

I wouldn't take any of them individually, but I would take green and blue together. Why would you take the red bet in this case?

Comment author: Manfred 01 February 2012 12:16:54AM 0 points

I intentionally designed the bets so that your agent would take none of them individually, but so that together they would be free money. If your agent's beliefs are correct, then naturally a bet it won't take might look a little odd. But to an agent that honestly thinks P(green | buying) = 2/9, the green and blue bets will look just as odd.

And yes, your agent would take a bet on (green or blue). That is beside the point, since what I offered was merely a bet on green first, and then a bet on blue.

Comment author: fool 01 February 2012 12:55:11AM 3 points

You mean, I will be offered a bet on green, but I may or may not be offered a bet on blue? Then that's not a Dutch book -- what if I'm not offered the bet on blue?

For example: suppose you think a pair of boots is worth $30. Someone offers you a left boot for $14.50. You probably won't find a right boot, so you refuse. The next day someone offers you a right boot for $14.50, but it's too late to go back and buy the left. So you refuse. Did you just leave $1 on the table?
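As a hypothetical sketch of the boot arithmetic above (assuming a lone boot is worth nothing to you):

```python
# Hypothetical arithmetic for the boot analogy (values taken from the example).
pair_value = 30.00    # what a complete pair is worth to you
boot_price = 14.50    # asking price for each single boot
lone_value = 0.00     # assumption: an unmatched boot is worthless

# Each offer, taken alone, is a losing trade...
assert lone_value - boot_price < 0
# ...but if both offers actually materialize, refusing both forgoes $1.
print("dollars left on the table:", pair_value - 2 * boot_price)
```

The point of the analogy is that the $1 is only "on the table" if both offers are guaranteed to arrive, which is exactly what the sequential betting setup fails to guarantee.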

Comment author: Manfred 01 February 2012 01:33:45AM 0 points

Ah, I see what you mean now. So, through no fault of your own, I have conspired to put the wrong boots in front of you. It's not about the probability depending on whether you're buying or selling the bet, it's about assigning an extra value to known proportions.

Of course, then you run into the Allais paradox... although I forget whether there was a Dutch book corresponding to the Allais paradox or not.

Comment author: fool 01 February 2012 03:00:59AM 0 points

I do not run into the Allais paradox -- and in general, when all probabilities are given, I satisfy the expected utility hypothesis.

Comment author: Manfred 01 February 2012 03:23:19AM 0 points

Not running into the Allais paradox means that if you dump an undetermined ball into a pool of known balls, you just add the bets together linearly. But of course, do that enough times and you just get back the normal result.

So yeah, I'm pretty sure Allais paradox.

Comment author: fool 01 February 2012 05:59:59AM 0 points

No, this doesn't sound like the Allais paradox. The Allais paradox has all probabilities given. The Ellsberg paradox is the one with the "undetermined balls". Or maybe you have something else entirely in mind.

Comment author: Manfred 01 February 2012 06:04:24AM 0 points

What I mean is a possible preference reversal when you compare a gamble that only happens with some probability against a known gamble.

Comment author: fool 31 January 2012 10:10:59PM 0 points

I wouldn't take any of them individually (except red), but I'd take all of them together. Why is that not allowed?