A bit more explanation on what the Kelly Criterion is, for those who haven't seen it before: suppose you're making a long series of independent bets, one after another. They don't have to be IID, just independent. The key insight is that the long-run payoff will be the product of the payoffs of the individual bets. So, from the central limit theorem, the logarithm of the long-run payoff will converge to the average logarithm of the individual payoffs times the number of bets.
This leads to a simple statement of the Kelly Criterion: to maximize long-run growth, maximize the expected logarithm of the return of each bet. It's quite general - all we need is multiplicative returns and some version of the central limit theorem.
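To spell the same argument out in symbols (notation mine): if bet i multiplies your bankroll by a random factor R_i, then after N independent bets

$$\log W_N = \log W_0 + \sum_{i=1}^{N} \log R_i \;\approx\; \log W_0 + N\,\mathbb{E}[\log R],$$

so long-run growth is governed by the expected logarithm of a single bet's return, and maximizing that quantity bet by bet maximizes the long-run growth rate.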
(Almost) never go full Kelly.
Kelly betting, or betting full Kelly, is correct if all of the following are true:
Just to clarify, the first two points that followed are actually reasons you might want to be *more* risk-seeking than Kelly, no? At least as they're described in the "most situations" list:
Marginal utility is decreasing, but in practice falls off far less than geometrically.
Losing your entire bankroll would end the game, but that’s life. You’d live.
If your utility is linear in money, you should just bet it all every time. If it's somewhere between linear and logarithmic, you should do something in between Kelly and betting it all.
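As a rough numerical illustration of this (my own sketch, using CRRA utility u(x) = x^(1−γ)/(1−γ) as one way to interpolate between logarithmic, γ=1, and linear, γ→0):

```python
# Numerically find the optimal bet fraction for an even-money bet won with
# probability p, under CRRA utility. gamma=1 is log utility (Kelly); smaller
# gamma is closer to linear utility, and the optimal fraction moves toward 1.
from math import log

def optimal_fraction(p=0.6, gamma=0.5, steps=10_000):
    def u(x):
        return log(x) if gamma == 1 else x ** (1 - gamma) / (1 - gamma)
    best_f, best_eu = 0.0, float("-inf")
    for i in range(steps):
        f = i / steps                      # stay strictly below 1 so log utility is finite
        eu = p * u(1 + f) + (1 - p) * u(1 - f)
        if eu > best_eu:
            best_f, best_eu = f, eu
    return best_f

print(optimal_fraction(gamma=1))     # ~0.20: the Kelly bet
print(optimal_fraction(gamma=0.5))   # ~0.38: between Kelly and all-in
print(optimal_fraction(gamma=0.25))  # ~0.67: approaching all-in as utility approaches linear
```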
Having utility linear in money is insufficient to make betting it all correct; you also need to be able to place bets of unlimited size (or have no future opportunities for advantage bets). Otherwise, even if your utility outside of the game is linear, inside of the game it is not.
And yes, some of these points are towards being *more* risk-loving than Kelly, at which point you consider throwing the rules out the window.
even if your utility outside of the game is linear, inside of the game it is not.
Are there any games where it's a wise idea to use the Kelly criterion even though your utility outside the game is linear?
Yes: if the game has many opportunities for betting, you should focus on the instrumental use of the money, which is compounding, so its instrumental value is geometric, and you should use the Kelly criterion. In particular, if your edge is small (but can be repeated), the only way you can make a lot of money is by compounding, so you should use the Kelly criterion.
I think this comment is incorrect (in the stated generality). Here is a simple counterexample. Suppose you have a starting endowment of $1, and that you can bet any amount at 0.50001 probability of doubling your bet and 0.49999 probability of losing everything you bet. You can bet whatever amount of your money you want a total of n times. (If you lost everything in some round, we can think of this as you still being allowed to bet 0 in remaining future rounds.) The strategy that maximizes expected linear utility is the one where you bet everything every time.
It depends on n. If n is small, such as n=1, then you should bet a lot. In the limit of n large, you should use the Kelly criterion. The crossover is about n=10^5. Which is why I said that it depends on having many opportunities.
You can prove, e.g. by (backwards) induction, that you should bet everything every time. With the odds being p>0.5 and 1-p, if the expectation of whatever strategy you are using after n-1 steps is E, then the maximal expectation over all things you could do on the n'th step is 2pE (you can see this by writing the expectation as a conditional sum over the outcomes after n-1 steps), which corresponds uniquely to the strategy where you bet everything in any situation on the n'th step. It then follows that the best you can do on the (n-1)'th step is also to maximize the expectation after it, and the same argument gives that you should bet everything, and so on.
(Where did you get n=10^5 from? If it came from some computer computation, then I would wager that there were some overflow/numerical issues.)
If you have money x after n-1 steps, then betting a fraction f on the n'th step gives you expected money (1-f)x + 2pfx. Given p>0.5, this is maximized at f=1, i.e. betting everything, which gives the expectation 2px. So conditional on having money x after n-1 steps, to maximize expectation after n steps, you should bet everything. Let X_i be the random variable that is the amount of money you have after i steps given your betting strategy. We have E[X_n] = Σ_x P(X_{n-1}=x) E[X_n | X_{n-1}=x] (one could also write down a continuous version of the same conditioning, but it is a bit easier to read if we assume that the set of possible amounts of money after n-1 steps is discrete, which is what I did here). From this formula, it follows that for any given strategy up to step n-1, hence given values for P(X_{n-1}=x), the thing to do on step n that maximizes E[X_n] is the same as the thing to do that maximizes E[X_n | X_{n-1}=x] for each x. So to maximize E[X_n], you should bet everything on the n'th step. If you bet everything, then the above formula gives E[X_n] = 2p E[X_{n-1}].
To recap what we showed so far: we know that given any strategy for the first n-1 steps, the best thing to do on the last step gives E[X_n]=2pE[X_{n-1}]. It follows that the strategy with maximal E[X_n] is the one with maximal 2pE[X_{n-1}], or equivalently the one with maximal E[X_{n-1}].
Now repeat the same argument for step n-1 to conclude that one should bet everything on step n-1 to maximize the expectation after it, and so on.
Or maybe to state a few things a bit more clearly: we first showed that E[X_n | X_{n-1}=x] <= 2px, with equality iff we bet everything on step n. Using this, note that E[X_n] = Σ_x P(X_{n-1}=x) E[X_n | X_{n-1}=x] <= Σ_x P(X_{n-1}=x) · 2px = 2p E[X_{n-1}], with equality iff we bet everything on step n conditional on any value of X_{n-1}. So regardless of what you do for the first n-1 steps, what you should do on step n is to bet everything, and this gives you the expectation E[X_n] = 2p E[X_{n-1}]. Then finish as before.
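For what it's worth, both expectations have closed forms, which makes the conclusion easy to check numerically (a sketch, with "Kelly" here meaning betting the fixed fraction 2p−1 each round):

```python
# Expected final money in the counterexample above (start with $1, even-money
# bets won with probability p). Betting everything each round gives
# E[X_n] = (2p)^n; betting a fixed fraction f each round gives
# E[X_n] = (1 + f*(2p-1))^n. All-in dominates in expectation for every n.
p = 0.50001
kelly = 2 * p - 1                              # Kelly fraction for an even-money bet

for n in (1, 10**5, 10**6):
    e_all_in = (2 * p) ** n
    e_kelly = (1 + kelly * (2 * p - 1)) ** n
    print(f"n={n}: all-in E={e_all_in:.6g}, Kelly E={e_kelly:.6g}")
```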
You start with 10 bucks, I start with 10 bucks. You wager any amount up to a hundred times, each time doubling it 60% of the time and losing it 40% of the time, until one of us is bankrupt or you stop. If you wager it all, I have a 40% chance to win. If you wager one buck at a time, you win almost certainly.
If you wager one buck at a time, you win almost certainly.
But that isn't the Kelly criterion! Kelly would say I should open by betting two bucks.
In games of that form, it seems like you should be more-and-more careful as the amount of bets gets larger. The optimal strategy doesn't tend to Kelly in the limit.
EDIT: In fact my best opening bet is $0.64, leading to expected winnings of $19.561.
EDIT2: I reran my program with higher precision, and got the answer $0.58 instead. This concerned me so I reran again with infinite precision (rational numbers) and got that the best bet is $0.21. The expected utilities were very similar in each case, which explains the precision problems.
EDIT3: If you always use Kelly, the expected utility is only $18.866.
Does your program assume that the Kelly bet stays a fixed size, rather than changing?
Here's a program you can paste in your browser that finds the expected value from following Kelly in Gurkenglas' game (it finds EV to be 20)
https://pastebin.com/iTDK7jX6
(You can also fiddle with the first argument to experiment, to see some of the effects when 4 doesn't hold.)
I believe you missed one of the rules of Gurkenglas' game, which was that there are at most 100 rounds. (Although it's possible I misunderstood what they were trying to say.)
If you assume that play continues until one of the players is bankrupt, then in fact there are lots of winning strategies: in particular, betting any constant proportion less than 38.9%. The Kelly criterion isn't unique among them.
My program doesn't assume anything about the strategy. It just works backwards from the last round and calculates the optimal bet and expected value for each possible amount of money you could have, on the basis of the expected values in the next round which it has already calculated. (Assuming each bet is a whole number of cents.)
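For reference, here's a minimal sketch of that kind of backwards computation (my own reconstruction, not the program above). Bets are taken in multiples of a configurable step; step=1 cent matches the setup discussed but is slow in pure Python, and exact arithmetic (fractions.Fraction) would avoid the floating-point issues mentioned earlier at a further speed cost.

```python
# Backward induction for Gurkenglas' game: both players start with $10 (so the
# pool is fixed at $20 and the state is just your stack), each wager doubles
# with probability p=0.6 and is lost otherwise, at most 100 rounds, utility =
# final money. Betting 0 covers the option of stopping.
def optimal_play(total_cents=2000, start_cents=1000, rounds=100, p=0.6, step=10):
    states = range(0, total_cents + 1, step)
    value = {x: float(x) for x in states}          # value with 0 rounds remaining
    best_first_bet = None
    for r in range(rounds):                        # add one more round each pass
        new_value = {}
        for x in states:
            max_bet = min(x, total_cents - x)      # limited by both bankrolls
            best_ev, best_b = float("-inf"), 0
            for b in range(0, max_bet + 1, step):
                ev = p * value[x + b] + (1 - p) * value[x - b]
                if ev > best_ev:
                    best_ev, best_b = ev, b
            new_value[x] = best_ev
            if r == rounds - 1 and x == start_cents:
                best_first_bet = best_b            # opening bet with all rounds left
        value = new_value
    return best_first_bet, value[start_cents]

bet, ev = optimal_play()
print(f"best opening bet: ${bet/100:.2f}, expected final money: ${ev/100:.3f}")
```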
I just discovered this now, Zvi. It's such a great heuristic!
I whipped up an interactive calculator version in Desmos for my own future reference, but others might find it useful too: https://www.desmos.com/calculator/pf74qjhzuk
I'm trying to think of real life situations where this is useful, but I don't think I am doing too good a job.
For people who are trying to use their bankroll to make money:
For people who are looking to gamble for the sake of intellectual growth, like OP says, tiny quantities should work. And even if you need to bet in larger quantities so that you "feel the pain", 1) these quantities are still probably going to be a small fraction of your overall bankroll, and 2) if you are gambling for the sake of intellectual growth, you are probably smart enough to get a job that pays a lot of money, and can thus replenish your bankroll easily.
Marginal utility is decreasing, but in practice falls off far less than geometrically.
I think this is only true if you're planning to give the money to charity or something. If you're just spending the money on yourself then I think marginal utility is literally zero after a certain point.
I think this doesn’t take into account two things:
For example, suppose I am a billionaire. I’ve got my mansions in every city, my helicopter, my private island, and a vault full of gold that I can dive into, Scrooge McDuck style. What more can I want?
In a word: intangibles. I’ve purchased joy, security, and freedom; now I want respect. I want people everywhere to love me, and my name to be synonymous with goodness and benevolence.
So I give to charity. But so does everyone, right? My friend/rival Bob—also a billionaire—has just given five hundred million dollars to save cute puppies in war-torn countries. TIME magazine man of the year! Me? I’m a footnote on page 40.
The more money I make, the more I can give. The more I can give (and/or spend on other things! there’s more than one way to reap the intangible benefits of fame, after all), the more I gain—in real benefit, to myself, personally.
My marginal utility of money, then, is far greater than zero.
Consider the parallel to the AI whose goal is to bring you coffee, so it takes over the world to make sure no one can stop it from bringing you coffee: the fact that one might need or want more money for something makes the marginal utility of money nonzero.
The more serious issue here is something I call the Uncanny Valley of Money, which I hope to write about at some point soon, where you have to move from spending on yourself (at as little as 1:1, in some sense) to spending on everyone (at up to 7000000000:1, or even more if you count the future, in some sense), in order to actually make any progress even for yourself.
Epistemic Status: Reference Post / Introduction
The Kelly Criterion is a formula to determine how much one should wager on a given proposition when given the opportunity.
It is elegant, important and highly useful. When considering sizing wagers or investments, if you don’t understand Kelly, you don’t know how to think about the problem.
In almost every situation, reasonable attempts to use it will be somewhat wrong, but superior to ignoring the criterion.
What Is The Kelly Criterion?
The Kelly Criterion is defined as (from Wikipedia): for a simple bet that either loses the entire stake or wins b times the stake, wager the fraction f* = (bp − q) / b of your current bankroll, where b is the net odds received on the wager, p is the probability of winning, and q = 1 − p is the probability of losing.
(A bankroll is the amount of money available for a gambling operation or series of wagers, and represents what you are trying to grow and preserve in such examples.)
For quick calculation, you can use this rule: bet such that you are trying to win a percentage of your bankroll equal to your percent edge. In the above case, you win 60% of the time and lose 40% on a 1:1 bet, so you on average make 20%, so try to win 20% of your bankroll by betting 20% of your bankroll.
Also worth remembering is if you bet twice the Kelly amount, on average the geometric size of your bankroll will not grow at all, and anything larger than that will on average cause it to shrink.
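A quick way to check both claims for the 60/40 even-money example (a sketch I added, not from the post; "twice Kelly gives zero growth" is exact in the small-edge limit and approximate here):

```python
# Expected log growth per bet when wagering fraction f of bankroll at even odds.
from math import log

def growth_rate(f, p=0.6):
    return p * log(1 + f) + (1 - p) * log(1 - f)

kelly = 2 * 0.6 - 1                       # 0.20 for the 60/40 even-money bet
for f in (kelly, 2 * kelly, 3 * kelly):
    print(f"f={f:.2f}: growth per bet = {growth_rate(f):+.4f}")
# f=0.20: +0.0201  (maximal)
# f=0.40: -0.0024  (roughly zero: no long-run growth)
# f=0.60: -0.0845  (bankroll shrinks)
```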
If you are trying to grow a bankroll that cannot be replenished, Kelly wagers are an upper bound on what you can ever reasonably wager, and 25%-50% of that amount is the sane range. You should be highly suspicious if you are considering wagering anything above half that amount.
(Almost) never go full Kelly.
Kelly betting, or betting full Kelly, is correct if all of the following are true:
At least seven of these eight things are almost never true.
In most situations:
There are two reasons to preserve one’s bankroll. A bankroll provides opportunity to get data and experience. One can use the bankroll to make money.
Executing real trades is necessary to get worthwhile data and experience. Tiny quantities work. A small bankroll with this goal must be preserved and variance minimized. Kelly is far too aggressive.
If your goal is profit, $0.01 isn’t much better than $0.00. You’ll need to double your stake seven times to even have a dollar. That will take a long time with ‘responsible’ wagering. The best thing you can do is bet it all long before things get that bad. If you lose, you can walk away. Stop wasting time.
Often you should do both simultaneously. Take a small amount and grow it. Success justifies putting the new larger amount at risk, failure justifies moving on. One can say that this can’t possibly be optimal, but it is simple, and psychologically beneficial, and a limit that is easy to justify to oneself and others. This is often more important.
The last reason, #8, is the most important reason to limit your size. If you often have less edge than you think, but still have some edge, reliably betting too much will often turn you from a winner into a loser. Whereas if you have more edge than you think, and you end up betting too little, that’s all right. You’re gonna be rich anyway.
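To make that concrete with made-up numbers (not from the post): say you believe you win 60% of even-money bets and bet the full Kelly 20%, but your true win rate is only 52%, so your true Kelly bet is 4%.

```python
# Growth per bet at even odds, under the true win probability, for the bet
# sized to the believed edge versus the bet sized to the true edge.
from math import log

def growth_rate(f, p):
    return p * log(1 + f) + (1 - p) * log(1 - f)

believed_p, true_p = 0.60, 0.52
bet = 2 * believed_p - 1                   # full Kelly for the believed edge: 0.20
true_kelly = 2 * true_p - 1                # Kelly for the true edge: 0.04

print(f"{growth_rate(true_kelly, true_p):+.4f}")  # +0.0008: small but positive growth
print(f"{growth_rate(bet, true_p):+.4f}")         # -0.0123: overbetting turns a winner into a loser
```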
For compactness, I’ll stop here for now.