This is a tangent, but since you mention the "good founders started [programming] at 13" meme, it's a little bit relevant ...
I find it deeply bizarre that there is an idea today, among some programmers, that if you didn't start programming in your early teens you will never be good at it. Why is this so bizarre? Because until very recently there was no such thing as a programmer who started at a young age, and yet there were people who became good at programming.
Prior to the 1980s, most people who ended up as programmers didn't have access to a computer until university, often not until graduate school. Even for university students, relatively unfettered access to a computer was rare, found only in exceptionally hacker-friendly cultures such as MIT's.
Put another way: Donald Knuth probably didn't use a computer until he was around 20. John McCarthy was born in 1927 and probably couldn't have come near a computer until he was a professor, in his mid-20s. (And of course Alan Turing, Jack Good, and John von Neumann couldn't have grown up with computers!)
(But all of them were mathematicians, and several of them physicists. Knuth, for one, was also a puzzle aficionado and a musician from his early years — two intellectual pursuits often believed to correlate with programming ability.)
In any event, it should be evident from the historical record that people who didn't see a computer until adulthood could still become extremely proficient programmers and computer scientists.
I've heard some people defend the "you can't be good unless you started early" meme by comparison with language acquisition. Humans generally can't gain native-level fluency in a language unless they are exposed to it as young children. But language acquisition is a very specific developmental process that has evolved over thousands of generations, and occurs in a developmentally-critical period of very early childhood. Programming hasn't been around that long, and there's no reason to believe that a critical developmental period in early adolescence could have come into existence in the last few human generations.
So as far as I can tell, we should really treat the idea that you have to start early to become a good programmer as a defensive and prejudicial myth, a bit of tribal lore arising in a recent (and powerful) subculture — which has the effect of excluding and driving off people who would be perfectly capable of learning to code, but who are not members of that subculture.
As for the claim that humans generally can't gain native-level fluency in a language unless they are exposed to it as young children: the only aspect of language with a critical period is accent. Adults commonly achieve fluency; in fact, adults learn a second language faster than children do.