If Alice thinks X happens with a probability of 20% while Bob thinks it's 40%, what would be a fair bet between them?
I created a Claude Artifact that calculates a bet such that the subjective expected value is the same for both parties.
In this case, Bob wins if X happens (he thinks it's more likely). If Alice bets $100, he should bet $42.86, and the EV of such a bet for both players (according to their own beliefs) is $14.29.
EDIT: I updated the calculator to correctly handle the case where A's probability is higher than B's.
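For reference, a minimal sketch of the calculation (my own Python rendering, not the artifact's actual code): setting the two subjective EVs equal and solving for Bob's stake gives stake_b = stake_a · (p_a + p_b) / (2 − p_a − p_b).

```python
def even_odds_stake(p_a: float, p_b: float, stake_a: float = 100.0):
    """Return (Bob's stake, shared subjective EV) for a bet on X.

    Convention: whoever assigns X the higher probability wins if X
    happens. If p_a > p_b, reframe as a bet on not-X so the same
    formula applies (the case the EDIT above mentions).
    """
    if p_a > p_b:
        p_a, p_b = 1 - p_a, 1 - p_b
    # Equal subjective EVs:
    #   Alice: (1 - p_a) * stake_b - p_a * stake_a
    #   Bob:   p_b * stake_a - (1 - p_b) * stake_b
    # Solving for stake_b:
    stake_b = stake_a * (p_a + p_b) / (2 - p_a - p_b)
    ev = (1 - p_a) * stake_b - p_a * stake_a
    return stake_b, ev

print(even_odds_stake(0.20, 0.40))  # (42.857..., 14.285...)
```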
The assumption that “equal monetary EV” is the definition of “fair” is questionable. In fact, any wager at implied odds between 21% and 39% (a narrower range once transaction costs and risk of ruin are included) is fair by the standard of “both participants prefer making the bet to declining”.
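To make that concrete (a toy sketch, my framing): model the wager as a contract paying $1 if X, traded at price q. Bob buys, Alice sells, and both see positive subjective EV for any q strictly between their probabilities.

```python
def subjective_evs(p_a: float, p_b: float, q: float):
    """Per-dollar EVs when Alice sells and Bob buys a $1-if-X contract at price q."""
    ev_alice = q - p_a  # collects q now, expects to pay $1 with probability p_a
    ev_bob = p_b - q    # pays q now, expects to receive $1 with probability p_b
    return ev_alice, ev_bob

for q in (0.21, 0.30, 0.39):
    print(q, subjective_evs(0.20, 0.40, q))  # both EVs positive across the range
```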
If you do want to make it “fair” in terms of equal benefit to both, you probably need each party's utility of marginal money. If Alice really needs the money, it's not “fair” for Bob to demand half of the monetary expectation.
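One way to operationalize that (a toy sketch only, assuming log utility and made-up bankrolls; `equal_utility_stake` and its parameters are mine, not a standard construction): solve for Bob's stake so that the two expected utility gains match, rather than the two monetary EVs.

```python
import math

def equal_utility_stake(p_a, p_b, w_a, w_b, stake_a):
    """Bob's stake such that both expected log-utility gains are equal.
    Assumes p_a < p_b (Bob wins if X) and stake_a < w_a."""
    def gain_a(b):  # Alice's expected change in log utility
        return (1 - p_a) * math.log(w_a + b) + p_a * math.log(w_a - stake_a) - math.log(w_a)
    def gain_b(b):  # Bob's expected change in log utility
        return p_b * math.log(w_b + stake_a) + (1 - p_b) * math.log(w_b - b) - math.log(w_b)
    lo, hi = 0.0, w_b  # gain_a - gain_b is increasing in b, so bisect
    for _ in range(100):
        mid = (lo + hi) / 2
        if gain_a(mid) < gain_b(mid):
            lo = mid
        else:
            hi = mid
    return lo

# Alice's bankroll is small relative to the stakes, Bob's is large;
# compare the result with the equal-monetary-EV stake of $42.86.
print(equal_utility_stake(0.20, 0.40, w_a=500.0, w_b=50_000.0, stake_a=100.0))
```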
There’s also the fairness question of whether they are equally rational, well-calibrated, and in possession of the same relevant information (hint: by Aumann’s agreement theorem, the fact that they still disagree means at least one of these fails).
Yes, fair here means that their subjective EVs are equal. The post referenced in the sibling comment calls it "Even Odds", which is probably a better name.
I wonder if you could take the R1-Zero training regime, penalize or restrict the use of existing words from any language (maybe only in the scratchpad, not the final response), and obtain a model that can solve math problems by reasoning in a non-existent language.
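A rough sketch of what the reward shaping might look like (entirely hypothetical; `shaped_reward` and `KNOWN_WORDS` are mine, and `base` stands in for R1-Zero's correctness reward):

```python
import re

# Crude stand-in for "existing words from all languages": in practice this
# would be a large multilingual dictionary; everything here is hypothetical.
KNOWN_WORDS = {"the", "is", "so", "therefore", "hence", "let", "donc", "также"}

def shaped_reward(scratchpad: str, final_answer: str, correct_answer: str,
                  penalty_weight: float = 1.0) -> float:
    """Correctness reward minus a penalty proportional to the fraction of
    recognizable natural-language tokens in the scratchpad. The final
    response is left unpenalized so it can stay legible, per the idea above."""
    base = 1.0 if final_answer.strip() == correct_answer.strip() else 0.0
    tokens = re.findall(r"\w+", scratchpad.lower())
    if not tokens:
        return base
    known_fraction = sum(t in KNOWN_WORDS for t in tokens) / len(tokens)
    return base - penalty_weight * known_fraction
```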