wedrifid comments on Open Thread: July 2010 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (653)
I would not deviate too much from the prior (most would fail).
Are you saying that LW readers suck at applied rationality, or are you disagreeing with the idea that applied rationality can help prevent startup failure?
I would say that preventing startup failure requires a whole group of factors, not least of which is good fortune. It is hard for me to judge whether LW readers are more likely than other people who self-select to start startups to get it all right. I note, for example, that people starting a second startup do not tend to be much more likely to succeed than they were on their first attempt!
Suppose we were to test it empirically and 9/10 startups fail on the first attempt. Then test again and 9/10 still fail on the second attempt. That is not enough information to conclude that a given person would fail 10 times in a row, because there could be some number of failures below 10 at which you finally acquire enough skill to avoid failure on a more routine basis.
Given that there is a whole world of information, strategies, and skills specific to founding startups, I would be surprised if an average member of a given group of startup founders fails x times out of y attempts when the first-attempt failure rate is also x/y.
So it would be relevant (especially if you are, say, an angel investor) how low the failure rate can be brought with multiple attempts by a given individual, and whether a given kind of education (such as reading the Less Wrong sequences, or a quality such as self-selecting to read them) would predispose you to reducing that failure rate more rapidly and/or further in the long run.
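The point about repeated attempts can be made concrete with a toy simulation. Both models below show a 90% failure rate on the first two attempts, yet they imply wildly different odds of failing 10 times in a row. The specific numbers (the 0.3 post-learning failure rate, the two-attempt learning threshold) are invented for illustration, not drawn from any data:

```python
import random

random.seed(0)

def attempts_until_success(fail_prob):
    """Run attempts until one succeeds; return how many failures came first."""
    n = 0
    while random.random() < fail_prob(n):
        n += 1
    return n

# Model A: failure probability is a constant 0.9 on every attempt.
constant = lambda k: 0.9

# Model B (hypothetical learning curve): 0.9 on the first two attempts,
# then accumulated skill drops the failure probability to 0.3.
learning = lambda k: 0.9 if k < 2 else 0.3

trials = 100_000
results = {}
for name, model in [("constant", constant), ("learning", learning)]:
    runs = [attempts_until_success(model) for _ in range(trials)]
    results[name] = sum(1 for r in runs if r >= 10) / trials
    print(name, "P(fail 10 straight) ~", round(results[name], 4))
```

Under the constant model, failing 10 straight happens about 0.9^10 ≈ 35% of the time; under the learning model it is vanishingly rare, even though the observable first- and second-attempt failure rates are identical. That is exactly why the two data points in the comment above cannot settle the question.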
Here is the relevant quote from Paul Graham's Why Hiring is Obsolete:
He also goes on to say that managers he talked to at forward-thinking companies such as Yahoo, Amazon, and Google would prefer to hire a failed startup genius over someone who worked a steady job for the same period of time. Essentially, if you don't need financial stability in the near future, time spent working diligently and passionately on your own ideas, trying to make them fit the marketplace, is more valuable than time spent on a steady payroll.
In a LW vein, it's worth noting that selection and survivorship biases (as well as more general anthropic biases) mean that the very existence of the equity risk premium is unclear, even assuming that it ever existed.
(I note this because most people seem to take the premium for granted, but for long-term LW purposes, assuming the premium is dangerous. Cryonics' financial support is easier given the premium, for example, but if there is no premium and cryonics organizations invest as if there was and try to exploit it, that in itself becomes a not insignificant threat.)
The survivorship bias described by Wikipedia is complete nonsense. Events that wipe out stock markets also wipe out bond markets, and often wipe out banks. Usually when people talk about survivorship bias in this context, they mean that the people compiling the data are incompetent and look only at currently listed stocks.
If your interest is in the absolute return and not in the premium, then survivorship is a bias.
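The data-compilation error described above — measuring returns only over stocks that still exist at the end of the period — can be sketched with a toy market. All the numbers here (the 80% survival rate, the return ranges, total loss on delisting) are made up purely to show the direction of the bias:

```python
import random

random.seed(1)

# Hypothetical toy market: each stock either survives the period with a
# modest return, or is delisted and wiped out entirely.
def simulate_stock():
    if random.random() < 0.8:                      # 80% survive
        return random.uniform(0.0, 0.15), True     # (return, survived)
    return -1.0, False                             # total loss

stocks = [simulate_stock() for _ in range(10_000)]

# Honest accounting: average over every stock that ever existed.
full_universe = sum(r for r, _ in stocks) / len(stocks)

# Biased accounting: average only over currently listed stocks.
survivor_returns = [r for r, alive in stocks if alive]
survivors_only = sum(survivor_returns) / len(survivor_returns)

print("mean return, all stocks:     ", round(full_universe, 3))
print("mean return, survivors only: ", round(survivors_only, 3))
```

The survivors-only average comes out positive while the full-universe average is sharply negative, so a compiler who looks only at surviving stocks overstates the absolute return — which is the sense in which survivorship is a bias here.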
ETA: I think I was too harsh on the people who look at the wrong stocks, but too soft on Wikipedia.