Comment author: scmbradley 02 February 2012 01:56:59PM 0 points [-]

P6 entails that there are (uncountably) infinitely many events. It is at least compatible with modern physics that the world is fundamentally discrete, both spatially and temporally. The visible universe is bounded, so it may be that there are only finitely many possible configurations of the universe. It's a big number, sure, but if it's finite, then Savage's theorem is irrelevant: it doesn't tell us anything about what to believe in our world. This is perhaps a silly point, and there's probably a nearby theorem that works for "appropriately large finite worlds", but still. I don't think you can just uncritically say "surely the world is thus and so".

If this is supposed to say something normative about how I should structure my beliefs, then the structural premises should be true of the world I have beliefs about.

Comment author: fool 03 February 2012 01:35:23AM 1 point [-]

I don't think you can just uncritically say "surely the world is thus and so".

But it was a conditional statement. If the universe is discrete and finite, then obviously there are no immortal agents either.

Basically I don't see that aspect of P6 as more problematic than the unbounded resource assumption. And when we question that assumption, we'll be questioning a lot more than P6.

Comment author: Manfred 01 February 2012 03:23:19AM 0 points [-]

Not running into the Allais paradox means that if you dump an undetermined ball into a pool of balls, you just add the bets together linearly. But of course, do that enough times and you just get the normal result.

So yeah, I'm pretty sure Allais paradox.

Comment author: fool 01 February 2012 05:59:59AM 0 points [-]

No, this doesn't sound like the Allais paradox. The Allais paradox has all probabilities given. The Ellsberg paradox is the one with the "undetermined balls". Or maybe you have something else entirely in mind.

Comment author: Manfred 01 February 2012 01:33:45AM 0 points [-]

Ah, I see what you mean now. So, through no fault of your own, I have conspired to put the wrong boots in front of you. It's not about the probability depending on whether you're buying or selling the bet, it's about assigning an extra value to known proportions.

Of course, then you run into the Allais paradox... although I forget whether there was a dutch book corresponding to the Allais paradox or not.

Comment author: fool 01 February 2012 03:00:59AM 0 points [-]

I do not run into the Allais paradox -- and in general, when all probabilities are given, I satisfy the expected utility hypothesis.

Comment author: Will_Sawin 31 January 2012 11:22:13PM 0 points [-]

How do you choose the interval? I have not been able to see any method other than choosing something that sounds good (choosing the minimum and maximum conceivable would lead to silly Pascal's Wager-type things, and probably total paralysis).

The discontinuity: Suppose you are asked to put a fair price f(N) on a bet that returns N if A occurs and 1 if it does not. The function f will have a sharp bend at 1, equivalent to a discontinuity in the derivative.

An alternative ambiguity aversion function, more complicated to define, would give a smooth bend.
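The sharp bend Will_Sawin describes can be sketched numerically. A minimal sketch, assuming a worst-case (maximin) pricing rule over a hypothetical probability interval [2/9, 4/9] for A (the green-ball interval that appears later in this thread); the interval choice is illustrative, not part of Will_Sawin's argument:

```python
# Hypothetical sketch: an ambiguity-averse agent with probability
# interval [lo, hi] for event A prices a bet paying N if A occurs
# and 1 if it does not, using the worst case over the interval.
def fair_buy_price(N, lo=2/9, hi=4/9):
    # The expected payoff p*N + (1-p) is linear in p, so the minimum
    # over [lo, hi] is attained at an endpoint.
    return min(p * N + (1 - p) for p in (lo, hi))

# The one-sided slopes at N = 1 differ: hi below 1, lo above 1 --
# the "sharp bend", i.e. a discontinuity in the derivative of f.
eps = 1e-6
left_slope = (fair_buy_price(1) - fair_buy_price(1 - eps)) / eps
right_slope = (fair_buy_price(1 + eps) - fair_buy_price(1)) / eps
print(round(left_slope, 4), round(right_slope, 4))  # 0.4444 0.2222
```

At N = 1 the bet pays 1 no matter what, so f(1) = 1 regardless of the interval; the kink appears exactly there because the worst-case p switches endpoints.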

Comment author: fool 01 February 2012 01:15:05AM 0 points [-]

How do you choose the interval? I have not been able to see any method other than choosing something that sounds good

Heh. I'm the one being accused of huffing priors? :-)

Okay, granted, there are methods like maximum entropy for Bayesian priors that can be applied in some situations, and the Ellsberg urn is such a situation.

Yes, you are correct about the discontinuity in the derivative.

Comment author: Manfred 01 February 2012 12:16:54AM 0 points [-]

I intentionally designed the bets so that your agent would take none of them individually, but so that together they would be free money. If it has correct beliefs, then naturally a bet it won't take might look a little odd. But to an agent that honestly thinks P(green | buying) = 2/9, the green and blue bets will look just as odd.

And yes, your agent would take a bet about (green or blue). That is beside the point, since I merely first offered a bet about green, and then a bet about blue.

Comment author: fool 01 February 2012 12:55:11AM 3 points [-]

You mean, I will be offered a bet on green, but I may or may not be offered a bet on blue? Then that's not a Dutch book -- what if I'm not offered the bet on blue?

For example: suppose you think a pair of boots is worth $30. Someone offers you a left boot for $14.50. You probably won't find a right boot, so you refuse. The next day someone offers you a right boot for $14.50, but it's too late to go back and buy the left. So you refuse. Did you just leave $1 on the table?

Comment author: Manfred 31 January 2012 08:13:37PM *  0 points [-]

I mean that I could offer you $9 on green for 2.50, $9 on blue for 2.50, and $9 on red for 3.01, and you wouldn't take any of those bets, despite, in total, having a certainty of making 99 cents. This "type 2" dutch book argument (not really a dutch book, but it's showing a similar thing for the same reasons) is based on the principle that if you're passing up free money, you're doing something wrong :P

Comment author: fool 31 January 2012 10:14:28PM 1 point [-]

I wouldn't take any of them individually, but I would take green and blue together. Why would you take the red bet in this case?

Comment author: fool 31 January 2012 10:10:59PM *  0 points [-]

I wouldn't take any of them individually (except red), but I'd take all of them together. Why is that not allowed?

Comment author: Will_Sawin 31 January 2012 03:10:11AM 0 points [-]

Yes. So basically you are biting a certain bullet that most of us are unwilling to bite: not having a procedure to determine your decisions, and just choosing a number in the middle of your range of choices that seems reasonable.

You're also biting a bullet where you have a certain kind of discontinuity in your preferences with very small bets, I think.

Comment author: fool 31 January 2012 07:30:48PM 0 points [-]

I don't understand what you mean in the first paragraph. I've given an exact procedure for my decisions.

What kind of discontinuities do you have in mind?

Comment author: Manfred 31 January 2012 02:11:31AM 0 points [-]

Ah, I see. But now you'll get type 2 dutch booked - you'll pass up on certain money if someone offers you a winning bet that requires you to buy.

Comment author: fool 31 January 2012 07:27:59PM *  0 points [-]

I guess you mean: you offer me a bet on green for $2.50 and a bet on blue for $2.50, and I'd refuse either. But I'd take both, which would be a bet on green-or-blue for $5. So no, no dutch book here either.

Or do you have something else in mind?

Comment author: Manfred 30 January 2012 10:56:45PM *  0 points [-]

I'm confused what your notation means. Let's drop the asymmetry for now and just focus on the fact that you appear to be violating the laws of probability. Does your (1/2 +- 1/6) notation mean that if I would give you a dollar if you drew a green ball, you would be willing to pay 1/3 of a dollar for that bet (bet 1)? Ditto for red (bet 2)? But then if you paid me a dollar if the ball came up (green-or-red), you would be willing to accept 1/2 of a dollar for that bet (bet 3)?

In that case, the dutch book consists of bets like (bet 1) + (bet 2) + (bet 3): you pay me 1/3, you pay me 1/3, I pay you 1/2 (so you paid me 1/6th of a dollar total). Then if the ball's green I pay you a dollar, if it's red I pay you a dollar, and if it's (green-or-red) you pay me a dollar.
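For concreteness, the combined book Manfred proposes can be tallied state by state. A sketch with exact fractions, at the prices he quotes (it only binds an agent who actually accepts all three of those prices):

```python
# Manfred's combined book: buy a $1 bet on green for 1/3, buy a $1
# bet on red for 1/3, and sell a $1 bet on green-or-red for 1/2.
from fractions import Fraction as F

def net(colour):
    premiums = -F(1, 3) - F(1, 3) + F(1, 2)  # paid 2/3, received 1/2
    win_green = F(1) if colour == "green" else F(0)
    win_red = F(1) if colour == "red" else F(0)
    pay_green_or_red = -F(1) if colour in ("green", "red") else F(0)
    return premiums + win_green + win_red + pay_green_or_red

# Net result is -1/6 in every state: a sure loss for that agent.
print([net(c) for c in ("green", "red", "blue")])
```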

Comment author: fool 30 January 2012 11:04:50PM *  1 point [-]

If the bet pays $273 if I drew a red ball, I'd buy or sell that bet for $93. For green, I'd buy that bet for $60 and sell it for $120. For red-or-green, I would buy that for $153 and sell it for $213. Same for blue and red-or-blue. For green-or-blue, I'd buy or sell that for $180.

(Appendix A has an exact specification, and you may wish to (re-)read the boot dialogue.)

[ADDED: sorry, I missed "let's drop the asymmetry" .. then, if the bet pays $9 on red, buy or sell for $3; green, buy $2 sell $4; red-or-green, buy $5 sell $7; blue, red-or-blue same, green-or-blue, buy or sell $6. Assuming risk neutrality for $, etc etc no purchase necessary must be over 18 void in Quebec.]
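The symmetric prices in the ADDED paragraph follow a simple pattern: buy at the stake times a lower probability, sell at the stake times an upper probability, with the red probability known to be 1/3 and the green and blue intervals read off from the quoted prices (an inference from the numbers given, not an independent specification):

```python
# Interval-probability pricing consistent with fool's $9 quotes:
# buy at stake * lower probability, sell at stake * upper probability.
from fractions import Fraction as F

LO = {"red": F(1, 3), "green": F(2, 9), "blue": F(2, 9),
      "red-or-green": F(5, 9), "green-or-blue": F(2, 3)}
HI = {"red": F(1, 3), "green": F(4, 9), "blue": F(4, 9),
      "red-or-green": F(7, 9), "green-or-blue": F(2, 3)}

def buy_price(event, stake=9):
    return stake * LO[event]

def sell_price(event, stake=9):
    return stake * HI[event]

# red 3/3, green 2/4, red-or-green 5/7, green-or-blue 6/6,
# matching the quoted prices.
for e in ("red", "green", "red-or-green", "green-or-blue"):
    print(e, buy_price(e), sell_price(e))
```

Note that green-or-blue has no interval at all (its probability is a known 2/3), which is why fool buys the green and blue bets as a package even though he refuses each one alone.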
