Um, so the " 'd " suggests that something has been affected by the noun it's attached to.
In this case, the statement "every disputant is partly right and partly wrong" has been affected by generalization; that is, it is a false generalization.
What do you mean "the statement is affected by a generalisation"? What does it mean for something to be "affected by a generalisation"? What does it mean for a statement to be "affected"?
The claim is a general one. Are general claims always false? I highly doubt that. That said, this generalisation might be false, but it seems like establishing that would require more than just pointing out that the claim is general.
The Will-to-wager assumption feels too strong for me. I would like, for instance, to be able to say "I will wager up to $0.30 on H, or up to $0.60 on ~H. Likewise, I will sell you a wager on H for $0.70 or more, and on ~H for $0.40 or more."
Well, this is sound betting strategy. As I say, you shouldn't take bets with 0 expected return unless you just enjoy gambling; it's a waste of your time. The question we need to answer is whether this principle can be given a more abstract or idealized interpretation that says something important about why Bayesianism is rational; the argument certainly doesn't prove that non-Bayesians are going to get bilked all the time.
I think this misses the point, somewhat. There are important norms on rational action that don't apply only in the abstract case of the perfect Bayesian reasoner. For example, some kinds of nonprobabilistic "bid/ask" betting strategies can be Dutch-booked and some can't. So even if we don't have point-valued will-to-wager values, there are still sensible and not sensible ways to decide what bets to take.
If you weaken your will-to-wager assumption and effectively allow your agents to offer bid-ask spreads on bets (I'll buy bets on H for x, but sell them for y) then you get "Dutch book like" arguments that show that your beliefs must conform to Dempster-Shafer belief functions, or Choquet capacities, depending on what other constraints you allow.
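As a toy illustration of why a spread blocks the classic book (a sketch in Python, using the made-up numbers from the bid-ask example earlier in the thread, not any result from the Dempster-Shafer literature):

```python
# Hypothetical bid-ask agent: maximum buying prices and minimum selling prices
# for $1 bets on H and ~H.
buy  = {"H": 0.30, "~H": 0.60}   # will pay at most this to buy a $1 bet
sell = {"H": 0.70, "~H": 0.40}   # will sell a $1 bet for at least this

# The classic Dutch book buys both bets from the agent at its selling prices:
# the bookie pays 0.70 + 0.40 = 1.10 and collects exactly $1 whichever way the
# world turns out, so the bookie loses money -- no sure profit on that side.
bookie_cost   = sell["H"] + sell["~H"]
bookie_payout = 1.0                      # exactly one of H, ~H is true
print(round(bookie_payout - bookie_cost, 2))  # bookie's profit: -0.1

# Selling both bets to the agent at its buying prices fares no better:
# the bookie collects 0.30 + 0.60 = 0.90 and pays out 1.00.
bookie_take = buy["H"] + buy["~H"]
print(round(bookie_take - bookie_payout, 2))  # bookie's profit: -0.1
```

The spread (selling prices exceeding buying prices on each proposition) is exactly what leaves no package of bets that guarantees the bookie a profit.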
Or, if you allow that the world is non-classical – that the function that decides which propositions are true is not a classical logic valuation function – then you get similar results.
Other arguments for having probability theory be the right representation of belief include representation theorems of various kinds, Cox's theorem, going straight from qualitative probability orderings, gradational accuracy style arguments…
I think we should at least mention that there are other good arguments for why adopting probability theory is a good idea, for example Cox's theorem.
This seems to be orthogonal to the current argument. The Dutch book argument says that your will-to-wager fair betting prices for dollar stakes had better conform to the axioms of probability. Cox's theorem says that your real-valued logic of plausible inference had better conform to the axioms of probability. So you need the extra step of saying that your betting behaviour should match up with your logic of plausible inference before the arguments support each other.
Hm. I'd been meaning to ask about this apparent circularity in the foundations for a bit, and now this tells me the answer is "we don't know yet".
(Specifically: VNM proves the analogue of the "will-to-wager" assumption, but of course it assumes our usual notion of probability. Meanwhile the Dutch book argument proves that you need our usual notion of probability – assuming the notion of utility! I guess we can say these at least demonstrate the two are equivalent in some sense. :P )
Savage's representation theorem in Foundations of Statistics starts assuming neither. He just needs some axioms about preference over acts, some independence concepts and some pretty darn strong assumptions about the nature of events.
So it's possible to do it without assuming a utility scale or a probability function.
Do roses make for good soup? They make for good chocolate.
I've had rosewater flavoured ice cream.
I bet cabbage ice cream does not taste as nice.
Generalization'd.
Sorry I'm new. I don't understand. What do you mean?
That's fairly specific. Do you have a particular viewpoint on decision theory?
I have lots of particular views and some general views on decision theory. I picked on decision theory posts because it's something I know something about. I know less about some of the other things that crop up on this site…
it is clear that each party to this dispute – as to all that persist through long periods of time – is partly right and partly wrong
— Bertrand Russell, History of Western Philosophy (from the introduction, again.)
The question that needs answering isn't "What bets do I take?" but "What is the justification for Bayesian epistemology?".
What the Dutch book theorem gives you are restrictions on the kinds of will-to-wager numbers you can exhibit and still avoid sure loss. It's a big leap to claim that these numbers perfectly reflect what your degrees of belief ought to be.
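For contrast, here's the kind of restriction the theorem imposes, as a minimal sketch: assume an agent with point-valued fair prices who will take either side of a $1 bet at those prices.

```python
def sure_loss(price_h, price_not_h, stake=1.0):
    """Net result for an agent who buys a $stake bet on H and a $stake bet
    on ~H at its stated fair prices; exactly one of the two bets pays out."""
    return stake - (price_h + price_not_h) * stake

# Incoherent fair prices (summing to more than 1): the agent loses $0.20
# however the world turns out.
print(round(sure_loss(0.6, 0.6), 2))   # -0.2
# Coherent prices (summing to exactly 1): no guaranteed loss.
print(round(sure_loss(0.6, 0.4), 2))   # 0.0
# (If the prices summed to less than 1, the bookie would simply take the
# other side of both bets instead.)
```

The theorem says only that avoiding this sure loss forces the prices to satisfy the probability axioms; whether those numbers are your degrees of belief is the further step at issue.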
But that's not really what's at issue. The point I was making is that even among imperfect reasoners, there are better and worse ways to reason. We've sorted out the perfect case now. It's been done to death. Let's look at what kind of imperfect reasoning is best.