
bogus comments on Open Thread for February 11 - 17 - Less Wrong Discussion

Post author: Coscott 11 February 2014 06:08PM




Comment author: fubarobfusco 12 February 2014 02:57:11AM 3 points

This is a tangent, but since you mention the "good founders started [programming] at 13" meme, it's a little bit relevant ...

I find it deeply bizarre that some programmers today hold the idea that if you didn't start programming in your early teens, you will never be good at it. Why is this so bizarre? Because until very recently, there was no such thing as a programmer who started at a young age; and yet there were people who became good at programming.

Prior to the 1980s, most people who ended up as programmers didn't have access to a computer until university, often not until graduate school. Even for university students, relatively unfettered access to a computer was an unusual exception, found only in extremely hacker-friendly cultures such as MIT.

Put another way: Donald Knuth probably didn't use a computer until he was around 20. John McCarthy was born in 1927 and probably couldn't have come near a computer until he was a professor, in his mid-20s. (And of course Alan Turing, Jack Good, or John von Neumann couldn't have grown up with computers!)

(But all of them were mathematicians, and several of them physicists. Knuth, for one, was also a puzzle aficionado and a musician from his early years — two intellectual pursuits often believed to correlate with programming ability.)

In any event, it should be evident from the historical record that people who didn't see a computer until adulthood could still become extremely proficient programmers and computer scientists.

I've heard some people defend the "you can't be good unless you started early" meme by comparison with language acquisition. Humans generally can't gain native-level fluency in a language unless they are exposed to it as young children. But language acquisition is a very specific developmental process that has evolved over thousands of generations, and occurs in a developmentally-critical period of very early childhood. Programming hasn't been around that long, and there's no reason to believe that a critical developmental period in early adolescence could have come into existence in the last few human generations.

So as far as I can tell, we should really treat the idea that you have to start early to become a good programmer as a defensive and prejudicial myth, a bit of tribal lore arising in a recent (and powerful) subculture — which has the effect of excluding and driving off people who would be perfectly capable of learning to code, but who are not members of that subculture.

Comment author: bogus 12 February 2014 12:14:07PM 2 points


There is a rule of thumb that achieving exceptional mastery in any specific field requires about 10,000 hours of practice. This seems to hold across fields: classical musicians, chess players, athletes, scholars and academics, and so on. It's a lot easier to accumulate those hours if you start in childhood. Note that people who make this claim in the computing field are talking about hackers, not professional programmers in the general sense. It's entirely possible to become a productive programmer at any age.