This comment got 6+ responses, but none that actually attempted to answer the question. My goal of Socratically prompting contrarian thinking, without being explicitly contrarian myself, apparently failed. So here is my version:
- Most startups are gimmicky and derivative, even or especially the ones that get funded.
- Working for a startup is like buying a lottery ticket: a small chance of a big payoff. But since humans are by nature risk-averse, this is a bad strategy from a utility standpoint.
- Startups typically do not create new technology; instead they create new technology-dependent business models.
- Even if startups are a good idea in theory, currently they are massively overhyped, so on the margin people should be encouraged to avoid them.
- Early startup employees (not founders) don't make more than large company employees.
- The vast majority of value from startups comes from the top 1% of firms, like Facebook, Amazon, Google, Microsoft, and Apple. All of those firms were founded by young white men. VCs are driven by the goal of funding the next Facebook, and they know about the demographic skew, even if they don't talk about it. So if you don't fit the profile of a megahit founder, you probably won't get much attention from the VC world.
- There is a group of people (called VCs) whose livelihood depends on having a supply of bright young people who want to jump into the startup world. These people act as professional activists in favor of startup culture. This would be fine, except there is no countervailing force of professional critics. This creates a bias in our collective evaluation of the culture.
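The lottery-ticket point above can be made concrete with a small worked example. The numbers below are made up purely for illustration: a guaranteed big-company salary is compared against a startup gamble with exactly the same expected value, using a logarithmic utility function as a stand-in for risk aversion. The gamble comes out worse in expected utility despite the equal expected payoff.

```python
import math

# Illustrative (made-up) numbers: a sure big-company payoff versus a
# startup gamble constructed to have the *same* expected value.
sure = 150_000.0                    # guaranteed payoff
p_hit = 0.01                        # chance the startup is a big win
big = 5_000_000.0                   # payoff in the rare win case
small = (sure - p_hit * big) / (1 - p_hit)  # chosen so the EVs match

ev_gamble = p_hit * big + (1 - p_hit) * small
assert abs(ev_gamble - sure) < 1e-6  # same expected value by construction

# A risk-averse agent has a concave utility function; u(x) = ln(x) is
# the standard textbook choice.
u = math.log
eu_sure = u(sure)
eu_gamble = p_hit * u(big) + (1 - p_hit) * u(small)

print(f"E[utility], sure salary:    {eu_sure:.3f}")
print(f"E[utility], startup gamble: {eu_gamble:.3f}")  # strictly lower
```

Because ln is concave, the spread-out gamble always scores below the sure thing with the same mean (Jensen's inequality), which is the formal version of "lottery tickets are a bad deal in utility terms."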
The Tesla Autopilot accident appears to have been a genuine accident. I hadn't realized that a semi crossed the divider and two lanes of traffic to hit him.
https://www.teslamotors.com/blog/misfortune
Here is a diagram.
Tesla's Autopilot is designed for limited-access freeways, not for highways with intersections like this one, so even an algorithm doing exactly what it was designed to do would not have prevented a crash. Still, the system was supposed to eventually apply the brakes, and its failure to do so was a real failure of the algorithm. The driver also erred in not braking, probably because he was relying inappropriately on the system. Perhaps the situation was difficult enough that he could not have prevented a crash, but his failure to brake at all is a bad sign.
It was obvious when Tesla first released Autopilot that people were using it inappropriately. I believe they have since released updates to encourage better use, but I don't know how successful those have been.