This post is stolen valor. We just ended a 20-year war against some of the most horrible people in the world in 2021. Anyone over age 20 who claims to long for righteous combat who didn't enlist during the Global War on Terrorism (or make a serious attempt and have it denied for reasons fully beyond their control) is a liar. As is almost anyone who makes this claim now and hasn't signed up to be a mercenary in Ukraine (special exception for someone who needs a valid US security clearance and would put it at risk by joining a foreign army). Policy should not cater to their fake preferences.
5% of world pop or GDP means China, India, and the US are the only countries used for the calculation in 12. Which seems questionable.
Also, 11-13 and maybe even 6 and 14 are lagging indicators that seem quite unhelpful for making before-the-fact predictions.
The interesting measure would be absolute returns, not risk-adjusted.
Szilard got his key insight precisely by being pissed at Rutherford's hubris. https://blogs.scientificamerican.com/the-curious-wavefunction/leo-szilard-a-traffic-light-and-a-slice-of-nuclear-history/
Choosing the wrong reference class?
A version of this type of situation seems to cover a lot of career decisions made by sufficiently talented people. If you're a young Mark Zuckerberg, should you drop out of Harvard? Dropping out of college is a bad idea, on average. It's not quite as clear cut as OP suggested because you can't reliably replicate what Bill Gates did, but there may be strong indicators that the right reference class to look to is not the average college dropout but startup founders, or some more specific subclass like founders seeing x% monthly growth in recurring users. And maybe an experienced startup or VC person could point one to an even better reference class that wouldn't occur to me.
Haha. I didn't really know what was a reasonable amount to save when I started because I had just gotten my first real job and really had no idea how expensive a lifestyle I might want in the future. But I knew I didn't want to be poor ever again. So I set a fairly arbitrary goal, spent a few more years living on the poverty-level income I had had before getting a good job so that I could save while it still had lots and lots of time to grow, and now it's done.
And from a stress/flexibility standpoint I think it was the right decision. I probably don't have to think about saving ever again, except for fun, so if I want to take a job that is funner but pays less, I have absolute freedom to do that.
And it turns out even having money I can live pretty cheap. There were only a few material things I hated about being poor. The constant stress over money was the real problem most of the time. I don't enjoy cooking and the food was boring when I couldn't afford restaurants, so I eat more takeout. And walking 5 miles because the bus doesn't go where you want kinda sucks, so I take more cabs/ubers/lyfts.
I am far away from retirement so not at 30x yet. But assuming 7% real returns, my projected nest egg is about 108x my current living expenses around the age I want to retire. If something goes wrong before then I can always put more in. Compounding is fucking magic if you start it in your early 20s.
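For concreteness, here's a minimal sketch of that compounding arithmetic in Python. The $50k lump sum and 40-year horizon are made-up illustrative numbers, not my actual figures; the 7% real return is the same assumption as above.

```python
# Illustrative compounding sketch: a lump sum saved in your early 20s,
# left alone and growing at an assumed 7% real annual return.

def future_value(principal: float, real_return: float, years: int) -> float:
    """Value of a single lump sum after compounding annually."""
    return principal * (1 + real_return) ** years

# Hypothetical example: money saved at 25, untouched until 65.
nest_egg = future_value(principal=50_000, real_return=0.07, years=40)
print(f"$50k at 7% real for 40 years -> ${nest_egg:,.0f}")  # roughly $749k
```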
The concept of leverage is not complicated. How it interacts with volatility drag is, or at least it seems that way to me when I hear people explain it. There is a disconnect between how my brain conceptualizes the abstract percentages vs actually holding an asset.
So, the basic idea for an unleveraged investment is that your geometric returns are lower than your arithmetic returns because of volatility. E.g. if you have $100, gain 10% one period and lose 5% the next, the arithmetic average return is 2.5% per period, calculated as (10 + (-5))/2, but you actually only have $104.50, a compound return of about 2.23% per period, because you are losing 5% of a bigger number than you are gaining 10% on. Easy enough.
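A minimal sketch of that arithmetic in Python, using the same +10%/-5% example:

```python
# Volatility drag on an unleveraged $100 position: +10% then -5%.
returns = [0.10, -0.05]
wealth = 100.0
for r in returns:
    wealth *= 1 + r
print(wealth)  # 104.5

arithmetic_mean = sum(returns) / len(returns)                 # 0.025 -> 2.5% per period
geometric_mean = (wealth / 100.0) ** (1 / len(returns)) - 1   # ~0.0223 -> ~2.23% per period
print(arithmetic_mean, geometric_mean)
```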
But let's say you leverage 2x. Assume no interest to keep it simple. Then this is a 20% gain and a 10% loss, and you have $108. A bigger gain than in the above example, but not 2x as big. Or at least that's what articles online say. But this doesn't make sense to me when I try to conceptualize it as actually holding an asset. Let's say I buy one share of the stock using my own money and one share using a loan. I hold exactly those two shares for the two periods regardless of what the price does, then sell them at the end and pay off the loan. My portfolio is $200, goes to $220 (10% gain), then goes to $209 (5% loss). Then I sell, pay off the loan, and I have $109, not $108.

The problem comes if I am not allowed to have a loan too large compared to my assets and have to sell at a bad time. So if the 5% drop happens first, I have $190, of which $100 is borrowed. I have to sell $10 of stock to bring my loan back to parity with my own investment. Then I have $180, of which $90 is borrowed, and I can only make $18 when the market moves 10% up, instead of the $19 I'd have made if I had held on to everything. So then my return really is only 8% instead of 9%, because I was forced to maintain a constant leverage ratio.
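To make the contrast concrete, here's a minimal Python sketch of the two portfolios described above: a position rebalanced to a constant 2x leverage each period (what the articles seem to describe) versus buying two shares with a fixed $100 loan and just holding. Interest on the loan is ignored, as above.

```python
# Underlying asset returns: +10% then -5%, no interest on the loan.
returns = [0.10, -0.05]

# Constant 2x leverage: exposure is reset to 2x equity every period,
# so equity compounds at twice each period's return.
equity = 100.0
for r in returns:
    equity *= 1 + 2 * r
print(equity)  # 108.0

# Buy and hold: two shares bought at $100 each, one financed with a $100 loan.
shares, price, loan = 2, 100.0, 100.0
for r in returns:
    price *= 1 + r
print(shares * price - loan)  # 109.0 after selling and repaying the loan
```

Note that the rebalanced path ends at $108 whichever order the two moves come in (matching the forced-sale scenario), while buy-and-hold with a fixed loan ends at $109 either way.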
So among ETFs, investing on margin, and futures, which allows me to remain closest to the buy and hold strategy? Or do I face roughly the same constraint no matter what?
Increasing labor income vs investing: the two aren't 100% fungible, but there are some tradeoffs, especially when you're self-employed. Any time I spend learning or managing finance stuff is time I could have spent working. And at least in principle there should be opportunities to spend money to increase my income, though it's a lot more unpredictable: I could advertise; in a non-pandemic environment I could join associations or go to events where I might meet lucrative clients; I could hire lower-paid staff and take on clients it isn't worthwhile for me personally to serve, given the opportunity cost; I could perhaps trade current income for prestige in some aspects of the work, hoping it raises my stature and brings more money later; etc.