I've taught C, Java, and Python at a university and (a little) at the high school level. I have noticed two simple things that people either surmount or get stuck on. The first is the basic ability to keep a formal system in mind at all; see the famous Dehnadi and Bornat paper. The second I have heard less about: in programming, it's the idea of scope.
The idea of scope in almost all modern programming languages goes like this:

* A scope starts at some point (some place in the code) and ends somewhere later.
* A scope can start before another ends; if so, it has to end before the "outer" scope does.
* Inside a scope, objects can be created and manipulated, generally even if another scope has started.
* Unless something special is done, objects no longer exist after their scope ends.
* Pivotally (this seems to be the hardest part), an object can be created under one name in an outer scope and be referred to by a different name in an inner scope. Inner scopes can likewise create and manipulate objects with the same names as objects in an outer scope without affecting those outer objects.
It's really hard for me to think of an analogous real-world skill to keeping track of N levels of renaming (which may be why it gives students such difficulty?). The closest I can think of is function composition: if you can symbolically integrate a composed function whose variable names don't match without having to pick your way through it, I have pretty high confidence that you can manage nested scopes.
EDIT: There are two other, well-known problems: recursion and pointers. I've heard stories about students who were okay for a year or two of programming courses, but never "got" recursion, or never understood pointers, and had to change majors. I've seen students have enormous difficulty with both; in fact, I've passed students who never figured one or the other out, but managed to grind through my course anyway. I don't know whether they dropped out or figured it out as their classes got harder---or just kept faking it (I had team members through grad school who couldn't handle more than basic recursion). I'm not inclined to classify either as "programming gear" that they didn't have, but I don't have data to back that up.
My assumption was that people who can't seem to learn to program can't get to the gut-level belief that computers don't use natural language-- computers require types of precision that people don't need.
However, this is only a guess. Would anyone with teaching experience care to post about where the roadblocks seem to be?
Also, does the proportion of people who can't learn to program seem to be dropping?
On the other hand, I did the JavaScript tutorial at Codecademy, and it was fun of a very annoying sort-- enough fun that I was disappointed that there only seemed to be a small amount of it.
However, I didn't seem to be able to focus on the examples until I took out the extra blank lines and curly braces-- I was literally losing track of what I was doing as I went from one distant line to another. If I pursue this, I might need to get used to the white space-- I'm sure it's valuable for keeping track of the sections of a program.
My working memory isn't horrendously bad-- I can reliably play dual 3-back, and am occasionally getting to 4-back.
If there are sensory issues making programming difficult for a particular person, this might be hard to distinguish from a general inability.
I've taught courses at various levels, and in introductory courses (where there's no guarantee anyone has seen source code of any form before), I've been again and again horrified by students months into the course who "tell" the computer to do something. For instance, in a C program, they might write a comment to the computer instructing it to remember the value of a variable and print it if it changed. "Wishful" programming, as it were.
In fact, I might describe that as the key difference between the people who clearly would never take another programming course and those who might---wishful thinking. Some never understood their own code and seemed to write it like monkeys armed with a binary classifier (the compiler & runtime, either running their program or crashing) banging out Shakespeare. These students typically never had a clear idea of what "program state" was; instead of seeing their program as data evolving over time, they saw it as a bunch of characters on the screen, and maybe if the right incantations were put on the screen, the right things would happen when they said Go.
Common errors in this category include:

* Infinite loops, because "the loop will obviously be done when it has the value I want".
* Uninitialized variables, because "it's obvious what I'm computing, and that you start at X".
* Calling functions that don't exist, because, "well, it ought to".
* NOT calling functions, because "the function is named PrintWhenDone, it should automatically print when the program is done".
These errors would crop up among a minority of students right up until the class was over. They could be well described by a gut-level belief that computers use natural language; but this only covers 2-6% of students in these courses*, whereas my experience is that less than 50% of students who go into a Computer Science major actually graduate with a Computer Science degree; so I think this is only a small part of what keeps people from programming.
*In three courses, with a roughly 50-person class, there were always 1-3 of these students; I suspect the median is therefore somewhere between 2 and 6%, but perhaps wildly different at another institution and far higher in the general population.