
Presumably you read Less Wrong because you're interested in thinking better.

If so, you might be interested in another opportunity to improve the quality of your thinking: learn to code.

Like nothing else, coding forces you to identify flaws in your thinking. If your thinking is flawed, your program won't work, except by accident. There's no other discipline quite like this. If you're a mathematician or physicist and you solve a problem wrong, your paper won't tell you. Computer programmers have to measure their thinking against the gold standard of correctness constantly. The process of uncovering and fixing flaws in a program, usually called "debugging", typically takes up the majority of the time spent on software projects.

But this is only the beginning. You've probably heard something like "there are some problems that humans are good at and some problems that computers are good at". This is true. And once you learn to code, you'll be able to exploit computers to solve the problems they are good at. Having a computer to write software with is like having a hi-tech mental exoskeleton that lets your mind run harder and jump higher. Want to know the second most common letter for an English word to end in? That's a 15-line script. Tired of balancing chemical equations for your homework? Automate it.
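
For a sense of scale, here is one possible version of that letter-counting script, as a sketch in Python; the word-list path is an assumption (a standard location on many Unix systems), and any file with one word per line would do:

    # Count the last letters of English words and print the most common ones.
    from collections import Counter

    counts = Counter()
    with open("/usr/share/dict/words") as words:   # assumed word-list location
        for word in words:
            word = word.strip().lower()
            if word:
                counts[word[-1]] += 1

    # most_common() sorts by frequency; the second line of output answers the question.
    for letter, count in counts.most_common(5):
        print(letter, count)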

Two more benefits that have less to do with thinking better:

  • Employment. You probably don't need a computer science degree. I know of two Less Wrong users who learned to program after college and got jobs at Silicon Valley startups with just a project or two on their resume. (MBlume and FrankAdamek.) See Advice on Getting a Software Job by Tom McCabe for more on this possibility.
  • Productivity software. Writing your own is, in my experience, much nicer than using stuff made by other people. The reason there are so many to-do list applications is that everyone's needs are different. If you use the terminal as your interface, it doesn't take much effort to write this stuff; you'll spend most of your time figuring out what you want it to do. (Terminal + cron on Linux with JSON log files has worked great for my needs; a minimal sketch follows below.)
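
Here is the kind of thing I mean, as a minimal sketch only (not my actual setup): a command-line to-do logger that appends JSON records to a log file. The file name todo.jsonl and the command-line interface are assumptions for illustration.

    # Minimal to-do logger: appends one JSON object per line to a log file.
    # Assumed usage:  python todo.py add "write trip report"
    #                 python todo.py        (lists items)
    import json
    import sys
    import time

    LOG = "todo.jsonl"   # assumed file name

    def add(text):
        record = {"time": time.time(), "text": text, "done": False}
        with open(LOG, "a") as f:
            f.write(json.dumps(record) + "\n")

    def list_items():
        try:
            with open(LOG) as f:
                for line in f:
                    item = json.loads(line)
                    print(("[x] " if item["done"] else "[ ] ") + item["text"])
        except FileNotFoundError:
            print("no items yet")

    if __name__ == "__main__":
        if len(sys.argv) >= 3 and sys.argv[1] == "add":
            add(" ".join(sys.argv[2:]))
        else:
            list_items()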

Having enough coding knowledge to be dangerous may take persistence. If you tried and failed in the past, you probably either got stuck and gave up because there was no one to help you, or you just didn't keep at it.

I've taken two different introductory programming classes now to meet college requirements. The students in both seemed substantially less intelligent to me than Less Wrong users, and most were successful in learning to program. So based on the fact that you are reading this, I am pretty darn sure you have the necessary level of mental ability.

Starting Out

I recommend trying one of these interactive tutorials right now to get a quick feel for what programming is like.

After you do that, here are some freely available materials for studying programming:

  • Learn Python the Hard Way. I like Zed's philosophy of having you type a lot of code, and apparently I'm not the only one. (Other books in the Hard Way series.)
  • Eloquent JavaScript. No installation needed for this one, and the exercises are nicely interspersed with the text.
  • Think Python. More of a computer science focus. ("Computer science" refers to more abstract, less applied aspects of programming.)
  • Codecademy (uses JavaScript). Makes use of gamification-type incentives. Just don't lose sight of the fact that programming can be fun without them.
  • Hackety Hack (uses Ruby). Might be especially good for younger folks.
  • How to Design Programs. This book uses an elegant, quirky, somewhat impractical language called Scheme, and emphasizes a disciplined approach to programming. Maybe that will appeal to you. Structure and Interpretation of Computer Programs is a tougher, more computer science heavy book that also uses Scheme. You should probably have a good understanding of programming with recursive functions before tackling it.

Here's a discussion on Less Wrong about what the best programming language to start with is.

If you're having a hard time getting something up and running, that's a system administration challenge, not a programming one. Everyone hates system administration, I think, except maybe system administrators. Keep calm, put your error message into Google, get help on a relevant IRC channel, etc.

Once you've got the basics, a good way to proceed is to decide on something you want to write and try to write it. If you don't know how to get started, start making Google searches. Soon you'll figure out the sort of libraries/frameworks people use to write your kind of program.

At first you may just be aping what others do. For example, if you want to learn something called "bleh", searching on Google for "bleh tutorial" is a great way to start. Finding a working program and modifying it to see how it changes is another good option. Soon you'll graduate to appropriating sample code from documentation. As you write more code and see more of the software landscape, you'll be better prepared to craft original approaches to writing software.

See also: On the Fence? Major in CS, Teach Yourself Programming in 10 Years, Computer Science and Programming: Links and Resources.


77 comments. Some comments are truncated due to high volume.

PSA: if you start learning to code and get stuck on some technical issue, feel free to PM me and I'll try to help :-)

Great post!

I strongly recommend Project Euler as a source of progressively trickier hurdles with instant feedback. It's especially helpful if (like me) you know a lot of math but little programming at the start.
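
For anyone wondering what the early problems are like, Project Euler's first problem asks for the sum of all the multiples of 3 or 5 below 1000; a short sketch in Python:

    # Project Euler, Problem 1: sum of all the multiples of 3 or 5 below 1000.
    total = sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0)
    print(total)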

I should've made it more clear that once you've got programming basics, there are (very roughly speaking) two ways to go from there. The first, which I emphasized more in the post, is to write some kind of Modern Software™ like a web application, phone app, etc. This typically entails reading tutorials and other documentation, dealing with system administration issues, etc. Basically, scale humankind's ever-growing mountain of software and add to the top of the pile.

The second way is to improve your conceptual thinking by learning new features of your language, solving challenges from Project Euler, tackling fields that require programming skill as a prerequisite (like algorithms and machine learning), learning new and different programming languages to see what they teach you, etc.

The first is more frustrating and more economically valuable, while the second requires thinking harder, is more intellectually rewarding, and builds your skills more in the long run. I guess I emphasized the first because writing Modern Software™ isn't actually that hard if you can deal with the frustration, but it took me years to realize that, so I wanted to tell others early.

I guess maybe another fie... (read more)


Agreed about debugging. Dijkstra said something like, "Debugging is the process of taking bugs out; therefore programming is the process of putting them in." Or consider the revelation of Maurice Wilkes:

As soon as we started programming, we found to our surprise that it wasn't as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs.

One of the biggest misconceptions I notice in non-programmers is that they think it's mostly typing: "telling the computer what to do." Depending on the task, it's a lot of reading, or researching with Google, or staring intently at text on the screen for long minutes until you figure out what you're doing wrong.

Knowing how to program can often be secret sauce for semi-skilled labor. This resourceful individual claims to have tripled their salary by writing a program to mostly-automate their work, thereby collecting the lion's share of bonuses. At my company, I've saved whole person-weeks of others' time with really simple stuff.

My first software job was wo... (read more)

John_Maxwell:
Just for the benefit of bystanders, most computer programs to do what I described are far easier to understand than the one wmorgan wrote.
fubarobfusco:
It's actually quite straightforward. It's just written in a language that most coders don't use, and moreover it uses data types that most "coderly" languages don't have. It would be pretty obvious to many experienced Unix sysadmins, though; there's nothing here that a sysadmin wouldn't use in doing log analysis or the like.

The most accessible data types in Unix shellscript are strings, semi-lazy streams of strings, and processes. A shell pipeline, such as the above, is a sequence of processes connected by streams; each process's output is the next one's input.

  1. cat /usr/share/dict/words | \
     Create a stream of strings from a single file, namely a standard list of English words.
  2. sed -e 's/.*\(.\)/\1/' | \
     For each word, extract the last letter.
  3. tr A-Z a-z | \
     Change any uppercase letters to lowercase.
  4. sort | \
     Sort the stream, so that all identical letters are adjacent to one another.
  5. uniq -c | \
     Count identical adjacent letters.
  6. sort -rn
     Sort numerically so that the letters with the highest counts come first.

It is not really clear to me that this is particularly less expressive than the straightforward way to do an equivalent operation in modern Python, as follows:

    import collections
    c = collections.Counter()
    for line in file("/usr/share/dict/words"):
        line = line.strip().lower()
        if not line:
            continue
        c[line[-1]] += 1
    for (ltr, count) in c.most_common():
        print ltr, count
shokwave:
Ways in which it might be less expressive: it takes a while before using the small, efficient pieces of Unix feels conceptually similar to using the different functions of a programming language. Using a regex. (Inferential distance is hard to estimate; I know I like to hear where I'm incorrectly assuming short distances; I hope you do too.)
arundelo:
Unnecessary use of cat! (And backslashes!) :-)
wmorgan:
The pleonastic cat was intentional (I like the expressiveness), but I didn't know that about pipes. Very cool!
arundelo:
It works with && and || too: touch /var/log/php.log && chown nobody:medium /var/log/php.log && chmod 640 /var/log/php.log

To paraphrase Jaynes, philosophers can say anything they want, because they don't have to do anything right.

The magic of coding, or in general creating something that does something, is that reality tests your ideas. Oh yeah, and you actually accomplish something outside of your head, or the heads of others at a cocktail party.

Coding is great in that the turnaround time for testing your ideas is so fast. The faster the feedback loop, the faster the learning.

Although the OP mentioned debugging, I'd stress and elaborate the point for people just learning to program. He says programming forces you to think. Particularly when you're learning, or debugging, I'd say sometimes you need to stop trying to figure it out, and start just trying things out. Fiddle with it and see what happens. That's the way to improve your model of how it works, so that next time you will have a better sense of what to do.

lukeprog:
Do you know which page of Jaynes you are paraphrasing, by chance?
buybuydandavis:
pg. 144, middle of the page, last paragraph before 5.8 Bayesian Jurisprudence http://books.google.com/books?id=tTN4HuUNXjgC&pg=PA144 Normally I'd cut and paste the quote, but google books won't let me copy, and I'm too lazy.
Pablo:
For some folks, having to click on a link is a trivial inconvenience, so here's the relevant part:
buybuydandavis:
Thanks. Better to light a candle than curse the darkness. Such is my laziness, that I didn't pay attention to Jaynes' elaboration to the quote, which is pretty good too.

As someone who can program well for a beginner (Linux user, scripts very well; otherwise Python, C, C++, and MATLAB are what I've used), what advantage is there to be gained in learning more? I'd really like to; I'm trying to all the time, but I have no real problems I need to code to solve, or they are simply much too big. Can you suggest some benefits that I'd gain from a moderate skill increase?

shokwave:
Can you give me an example or two of a problem that is much too big?
TerminalAwareness:
Absolutely; I certainly do have things I'd love to code.

  • I rely heavily on a python notes taking program, Zim, which could use some help implementing more features like tables, or an android port.
  • Linux could use an extended nutrition, food, and exercise tracking program.
  • I've toyed with the idea of trying to pull components together under KDE and link food purchases to a pantry tracking program to a nutrition tracking program to a health logging program.
  • The BIOS on my laptop is broken under Linux in many ways; I've seen and attempted to decompile and repair it, but working with a >10000 line decompiled program exhausted me quickly.
  • Everyone needs their own window manager, right? I'd even started extending xfwm4 with wmctrl in bash scripts, but it was a bit silly and in the latest release the key feature was replicated officially, and far more elegantly.
  • qubes-os is (I hope) the future of secure but practical computing, but my current hardware makes running it less than useful.

I could probably work a good bit on any of the above projects, but I don't think I could succeed, since there is so little benefit to me and I'm at a #1-20 Project Euler level.
shokwave:
All the projects you list are probably too challenging except the nutrition/exercise/food tracking program, I'd wager. A suggestion on how to go from Project Euler to stronger things quickly: try some of the Google AI challenges. Planet Wars is a good spot to start. I found working on it outside of the competition to be very interesting; coding actual bots is not much more challenging than Project Euler, you can increase the difficulty level with "ooh, I'd really like to see my bot do x", and when you start thinking about how to exploit the game you end up digging through their code and learning a lot about bigger projects. More generally, these kinds of competitions where you submit a simple piece of code to a more complex piece are a great way to step up in skill (as long as you don't try to actually compete just yet - I found that stress and time constraint to be counterproductive).

Like nothing else, coding forces you to identify flaws in your thinking. If your thinking is flawed, your program won't work, except by accident. There's no other discipline quite like this. If you're a mathematician or physicist and you solve a problem wrong, your paper won't tell you. Computer programmers have to measure their thinking against the gold standard of correctness constantly. The process of uncovering and fixing flaws in a program, usually called "debugging", typically takes up the majority of the time spent on software projects.

... (read more)
private_messaging:
The rationalists don't seem to be more rational than average, either. Also, what's wrong with 'there's always a reason' when debugging? With the truth being relative, who knows what he meant, the issue is that symbols aren't grounded in math (which is where people tend to consider the truest truths to be) so it isn't really absolute in any meaningful sense, and is an argument over semantics anyway.
Nornagest:
I read that as a contrast: "there's always a reason" as reductionist thinking, "truth is relative" as anti-reductionist. Truthfully the two aren't incompatible at all; anti-reductionists tend to think of formal systems (like programming languages) as a limited magisterium within a vastly less explicable universe, which is what gives us statements like "you can't program a soul". Wouldn't make too much sense to take a hard line against AI if mathematics contained the same magic that people allegedly possess. Of course, even that gets muddled some by typical abuses of Gödel and Heisenberg, but that's another post entirely...
private_messaging:
There is also the irrational/sloppy reductionism where you mistakenly believe that something irreducible is reducible rather than illusionary. E.g. you can believe in some absolute truth, because you feel that there is absolute truth, and you are 'reductionist' as in ideology, and so have to believe that absolute truth is reducible (belief in a verbal statement). Whereas a reductionist may think about whether absolute truth can be reduced to something more basic, see no way of doing that, and thus declare truth relative and illusionary (so that only the feeling of absolute truth is reducible to interactions between neurons).

edit: and an anti-reductionist would take the feeling that there is absolute truth, proclaim that there really is absolute truth, see no way to reduce absolute truth to anything more basic, and then proclaim that reductionism is false. Ultimately the two latter things are the only internally sound approaches, while the first is not coherent.

edit: actually, it's best reworded with the soul example. The reductionist feels that he has a soul, doesn't see a way to reduce the soul to e.g. elementary math (and sees the impossibility of such reduction), and claims that the feeling of a soul is some sort of illusion (and seeks a way to reduce the feeling of a soul). That is internally consistent. The anti-reductionist feels that he has a soul, proclaims that the soul really exists, doesn't see a way to reduce the soul to e.g. elementary math, doesn't buy into the feeling being an illusion, and proclaims that reductionism is false and the soul is out of this world or something. This is also internally consistent. The incoherent reductionist feels that he has a soul, proclaims that the soul really exists, proclaims that the soul absolutely must be reducible to elementary math (while not really having reduced it to elementary math or even seen any way to), and doesn't notice that it would suffice to reduce the feeling of a soul. That is just sloppy.
Dreaded_Anomaly:
Everyone compartmentalizes.
[anonymous]:

I recommend visiting Udacity and enrolling in one of their classes.

The classes are all free and are university level. They take the form of high-quality lectures, broken up into videos a few minutes long spiced with quizzes and programming exercises. They will teach you how to program in Python as well as cool stuff like computer science or even some machine learning and AI. Each class also has a goal, like building a search engine or a self-driving car, so one can see the relevance of what he is learning to interesting tasks.

Its advantage over other... (read more)

I recommend Eric Raymond's How To Become A Hacker. He suggests: first learn HTML markup. Then learn Python.

For a great many purposes, you can just stop there - there's an excellent chance that Python will do everything you ever want to do. However, if you want to get more deeply into it, learn Java. Other languages that might be good to learn after that: Scheme, C, and if you really want to push the boat out, Haskell.

JenniferRM:
Edited in response to voting: Wait, what? I was offering support for ciphergoth's claim that haskell was something "all the cool kids" would have experience with seven years from now, and hence it was probably worth playing with. If people liked my quote they should like ciphergoth's moreso because he said it first and put it in context.

If you're a mathematician or physicist and you solve a problem wrong, your paper won't tell you.

But it's comparatively easy to spot errors (or dead ends) as you go, and then you try to find ways of fixing them (which is harder). It seems similar to programming in this respect, maybe the greatest difference is that in programming you often work with a vast code base, and the problems can occur because of its peculiar properties you knew nothing about (or have long forgotten). Also, debugging is typically much easier in terms of cognitive load (while design can get challenging).

shokwave:
It takes discipline, and it seems to only fit a certain type of mind, but test-driven development will let you spot errors as you go.
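
For readers who haven't seen the style, here is a minimal sketch of the idea in Python's built-in unittest; the function and its behaviour are made up for illustration, and in practice you would write the tests first, watch them fail, and only then write the code.

    import unittest

    def last_letter(word):
        # Toy function under test: return the last character of a cleaned-up word.
        return word.strip().lower()[-1]

    class TestLastLetter(unittest.TestCase):
        def test_simple_word(self):
            self.assertEqual(last_letter("Carrot"), "t")

        def test_trailing_whitespace(self):
            self.assertEqual(last_letter("apple \n"), "e")

    if __name__ == "__main__":
        unittest.main()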

If you're motivated to learn coding because of the effects it has on your thinking, there is absolutely no better place to start than The Little Schemer. Seriously.

I am tempted, in fact, to claim that The Little Schemer (and Scheme, and SICP, and Lisp...) will teach an even higher standard of thinking than other programming languages. I have some evidence for this, in the form of Paul Graham's essay on the matter.

The skills underlying programming ability, such as figuring out which problems can be efficiently attacked by programming and how to break problems down into parts amenable to programmed solutions, don't get talked about as much as coding itself. Jeannette Wing's article Computational Thinking (pdf) from a few years back talks about this.

For example, if you want to learn something called "bleh", searching on Google for "bleh tutorial" is a great way to start.

Too bad that Google won't tell you which tutorials are excellent, which are decent, and which suck. Looking for the FAQ for the Usenet newsgroup for the programming language you're interested in is usually better.

Long comment here.

Short comment:

I just graduated a year early from high school and I'm doing a CS major in the Fall. I took AP computer science at my high school and loved it. I'd love to learn more, but I feel like the links you post above help me learn bits and pieces (trees), but I don't really know where that's leading (forest). In essence, I want to learn about coding before I learn more coding, so that I can direct my studies in the directions I want to take them.

I think you should elaborate on this. This seems to be the first project I would want t... (read more)

John_Maxwell:
I don't know of any good resources for learning what the forest is like. As far as I can tell, everyone who knows what the forest is like learned by learning about more and more trees.

Assuming your AP computer science class used Java, a good next step is to learn Python (probably using some resource like Dive into Python written for people who already know how to program) and learn the command line (using the Hard Way book on the topic or http://code.google.com/edu/tools101/linux/basics.html or something). Then you can take a shot at following the instructions in the rest of this comment.

Using cron: http://lesswrong.com/lw/2dg/applying_behavioral_psychology_on_myself/267c Note that it assumes you're on Linux. I don't know the best way to duplicate my results on Windows.

If you're on Ubuntu, a good next step is to type "man gnome-terminal" from the command line and figure out how to use flags to cause gnome terminal to run an arbitrary program when it starts. Then you can substitute the command you construct for the Firefox command in the original cron example. With lots of Google searches and persistence and trial and error, it should be possible to set things up so that you get a window popping up every hour with a random item from your to do list. You can complicate things from there as you desire. For example, figure out how to input some number of minutes that the window should wait before popping up again. There's tons of stuff you could try out.

Persistence example: if you get gnome terminal to start running a command, the terminal will close as soon as the command terminates. So you'll need to ask the user for input in the last line of your script. (In user experience terms, this will translate into you pressing enter to close the terminal window that pops up.) This is one of those things that could take an hour or so to figure out.

In general, you want to start by getting extremely simple examples to work and gradually modify them, making sure the
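
To make the "persistence example" concrete, here is a guessed-at version of the script the terminal window might run; the to-do file name is an assumption, and the final input() is what keeps the window open until you press enter.

    # Show one random to-do item, then wait for Enter so the terminal stays open.
    import random

    TODO_FILE = "todo.txt"   # assumed: one item per line

    with open(TODO_FILE) as f:
        items = [line.strip() for line in f if line.strip()]

    if items:
        print("Random to-do item:", random.choice(items))
    else:
        print("Your to-do list is empty.")

    input("Press Enter to close this window.")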
Viliam_Bur:
I think the "forest" is knowing different programming metaphors, and knowing when to use which one. A proper choice of metaphor will allow you to solve the problem more efficiently -- which means that the program is not only quickly written, but also easier to understand (though sometimes only for people familiar with given metaphor). By metaphors I mean things commonly called "programming paradigms" and "design patterns"; the former are more abstract, the latter are more specific templates for solving a given set of problems. It is good to know these individual metaphors, and to be aware that they exist and you can choose the one you need (instead of seeing every problem as a nail, because you have completed a good hammer-using lesson and you are a certified hammer expert). Different programming languages put emphasis on different metaphors, which is why a good programmer usually knows more than one programming language.

A programming paradigm is an answer to the question: What exactly is a program?

  • a sequence of commands for the computer
  • a collection of mathematical functions
  • a collection of facts and predicates (to be reasoned about by a standard algorithm)
  • a collection of small parts (objects) cooperating with each other
  • a collection of reactions "in situation X, do Y"

Depending on your problem, a different paradigm is useful. A functional paradigm is useful when you want to calculate functions. A command (imperative) paradigm is useful for tasks you could best describe as series of steps. An object paradigm is great for a user interface, together with "if this is clicked, do this" reactions. Sometimes you should use different paradigms for different parts of your program; for example a program for solving mathematical equations, with a user interface. Sometimes different paradigms support each other; for example you can write a list of steps to do, and then describe additional aspects such as: "and by the way, every time you read from a file or wri
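
As a toy illustration of two of the paradigms listed above (not Viliam_Bur's example), here is the same small task, summing the squares of the even numbers in a list, written first as a sequence of commands and then as a single functional expression in Python:

    numbers = [1, 2, 3, 4, 5, 6]

    # Imperative: a sequence of commands mutating an accumulator.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n

    # Functional: describe the result as an expression over the data.
    total_functional = sum(n * n for n in numbers if n % 2 == 0)

    assert total == total_functional == 56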

That's actually good rationality advice. I've been saying something like that before.

I would add to this a recommendation: learn to design software projects, i.e. do software engineering at a larger scale, and ultimately design something new in a new way. There you will have to think about what would otherwise have been fuzzy and ill-defined concepts when you do planning, and you'll have to do it correctly. There, if your thinking is flawed or if the concepts you are thinking in are flawed (that is a huge source of failure), your plans won't be implementable.

You will ... (read more)

thomblake:
You assert this repeatedly. Is there evidence?
private_messaging:
You are asking for evidence that flawed thinking is easily affected by biases? What is the alternative hypothesis exactly (i.e. what do you want me to falsify), that it is not easily affected by biases? Or are you asking for evidence "that it may superficially make biases look like what makes flawed thought flawed"? The alternative that needs to be falsified is that it 'may not' ?
thomblake:
The standard model here is that biases are one of the major things that makes flawed thought flawed. You suggested that model is false; that it is an illusion resulting from the causality working the opposite direction. I'd like to see evidence that the observed correlation between flawed thinking and biases is due to "flawed thinking is easily affected by biases" rather than "biases cause flawed thinking".
asr:
If I understand right, the point was this: programmers routinely display flawed thinking (in the form of conceptual bugs) that don't seem to stem from cognitive biases. This evidence, if you haven't previously internalized it, should cause you to downwards-revise your estimate for the fraction of flawed thinking caused by biases.
thomblake:
Can you give an example?
gwern:
One example I've seen first-hand and suffered is confirmation bias. Beginning programmers, at least, when they run into a bug, do not try to disconfirm what they think the bug is, but instead try to confirm it. For example, they'll put print statements for a troublesome variable after the line they suspect is causing the bug rather than before it, to verify that it wasn't bogus beforehand, or better yet, both before and after the suspect line.
JGWeissman:
I don't see how that is confirmation bias. Where does the beginning programmer discount or ignore disconfirming evidence? If the print statement shows the troublesome variable has the correct value following the suspect line, that is evidence against the suspect line being the bug. The problem in this case is that programmer is paying attention to only part of the meaning of that line being a bug. If it is a bug, it would transform correct state to incorrect state, and the programmer is only validating the resulting incorrect state, not the preceding correct state. Though, I will sometimes add a single debug statement, not to test some particular line of code, but to test if the bug is before or after the debug statement, so I can look more closely at the offending section and narrow it down further.
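
A sketch of the bracketing being discussed, with a deliberately planted bug: printing the troublesome value both before and after the suspect line, so the test can actually disconfirm the hypothesis rather than only confirm it.

    def average(values):
        total = sum(values)
        # Check the incoming state, not just the outgoing one.
        print("before suspect line: total =", total, "count =", len(values))
        result = total / (len(values) + 1)   # suspect line (off-by-one bug, planted for illustration)
        print("after suspect line: result =", result)
        return result

    average([2, 4, 6])   # correct answer is 4.0; the debug output shows where 3.0 comes from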
private_messaging:
Yep. Actually on meta level I see more confirmation bias right here. For instance gwern doesn't seek to disprove that it is confirmation bias, but seeks to confirm it, and sees how the print statement after suspect line wouldn't disconfirm the bug being on that line if the bug is above that line, but doesn't see that it would disconfirm the bug if the bug was below suspect line. (and in any case if you want to conclusively show that there is a bug on suspect line, you need to understand what the bug is, precisely, which is confirming evidence, and which requires you to know the state after and see if it matches what you think the bug would produce.) I do imagine that confirmation bias does exist in programming whenever the programmer did not figure out or learn how to do it right, but learning that there's this bias, and learning how to do it right, are different things entirely. The programmer that learned how to debug can keep track of set of places where the bug could be, update it correctly, and seek effective ways to narrow that down.
arundelo:
Maybe it isn't confirmation bias. But Wikipedia says that "[p]eople display [confirmation] bias when they gather or remember information selectively, or when they interpret it in a biased way". If that's a good description, then gwern's example would fall under "gather[ing]". The bias the programmer in gwern's example is exhibiting is the same one that makes people fail the Wason selection task -- searching for confirming evidence rather than disconfirming evidence. Edit: Retracting this because I think JGWeissman is right.
JGWeissman:
The programmer is indeed gathering evidence, but I don't see how they are gathering evidence in a way that is meant to produce confirming rather than disconfirming evidence. As I have explained, the test could show a correct value for the troublesome variable and be evidence against the suspect line being the bug. The test will be somewhat biased towards confirmation in that it is really testing if the suspect line has a bug or any earlier line has a bug, but I don't think this bias reflects the programmer seeking only confirming evidence so much as not understanding what they are testing. That is not the cause of failure in the Wason selection task. The problem is not using contrapositives, that is, realizing "even implies red" is logically equivalent to its contrapositive "not red implies not even", so to test "even implies red" you have to check cards that show an even number or a non red color. This is similar to the failure to use contrapositives behind the Positive Test Bias, which is itself similar to gwern's example in that it involves failure to test every aspect of the hypothesis.
arundelo:
You're right. I'm retracting the grandparent.
private_messaging:
Haven't seen programmers do this a whole lot. In any case this should get cured by one bug hunting session where the method horribly fails. Also I am not sure that this is 'confirmation bias' being the cause of anything. The methodology of thought for determining where the bug is has to be taught. For me to confirm that the bug is where I think it is takes two print statements: before, and after.
thomblake:
That's evidence against the proposition that I was looking for evidence for. ha ha.
asr:
Sure. A common source of bugs is that different developers or other stakeholders have different expectations. I say "this variable is height in feet", and you assume it's meters. I forget to synchronize access to a piece of state accessed from multiple threads. I get over-cautious and add too much synchronization, leading to deadlock. I forget to handle the case where a variable is null. None of those feel like cognitive biases, all are either common or infamous bugs.
private_messaging:
The mainstream model here in our technological civilization, which we have to sustain using our ape brains, is that the correct thought methods have to be taught, and that the biases tend to substitute for solutions when one does not know how to answer the question, and are displaced by more accurate methods. The mainstream model helps decrease people's mistake rate in programming, software engineering, and other disciplines. Now, the standard model "here" on lesswrong, I am not sure what it really is, and I do not want to risk making a strawman.

For example, if you need to build a bridge, and you need to decide on the thickness of the steel beams, you need to learn how to calculate that and how to check your calculations, and you need training so that you stop making mistakes such as mixing up the equations. A very experienced architect can guess-estimate the required thickness rather accurately (but won't use that to build bridges). Without that, if you want to guess-estimate the required thickness, you will be influenced by cognitive biases such as the framing effect, by time of the day, mood, colour of the steel beam, what you had for breakfast and the like, through zillions of emotions and biases. You might go ahead and blame all those influences for the invalidity of your estimate, but the cause is incompetence.

The technological civilization you are living in, and all its accomplishments, are the demonstration of the success of the traditional approach.
thomblake:
I'm not familiar with this "mainstream model". Is there a resource that could explain this in more detail?
private_messaging:
Go look at how education works. Engineers sitting in classes learning how the colour of the beam or the framing effect or other fallacies can influence a guess-estimate of required thickness, OR engineers sitting in classes learning how to actually find the damn thickness?
thomblake:
So am I to infer that your answer to my question is "no"?
private_messaging:
What I am saying is that you have enough facts at your disposal and need to process them. So the answer is 'yes'. If you absolutely insist that I link a resource that I would expect wouldn't add any new information to the information you already didn't process: http://en.wikipedia.org/wiki/Mathematics_education . Teaching how to do math. Not teaching 'how framing effect influences your calculations and how you must purify your mind of it'. (Same goes for any engineering courses, take your pick. Same goes for teaching the physicists or any other scientists).

I once started to learn Python, with Think Python, but soon succumbed to Trivial inconveniences like having to save my programs, change to the interpreter, and then load them. Learn Python the Hard Way advises to do this, too. Is there a free or commercially available program which any of you could recommend that avoids these things without interfering in the learning process? Or any other way to get over this hurdle without overpowering it?

John_Maxwell:
Typically people use a keyboard shortcut to save, which makes it less painful. For running your program, try typing python yourprogram.py from your console/terminal/whatever after having navigated (by typing "cd directoryname" and "dir"/"ls" repeatedly) to the directory where your program is. After having typed the execution command once, you can just push the up arrow on your keyboard to restore that command on the command line. So under ideal conditions, executing your program should consist of four actions: pushing a keyboard shortcut to save, switching to a terminal window, pushing the up arrow key, and pushing enter.

Probably the instructions I just gave you will be hard to follow because they are awfully compressed and I don't know your platform. So feel free to tell me if you run into a problem. You may wish to make an explicit effort to learn the command line; I believe there is a Hard Way book about it.

Another option is to use the editor that comes shipped with Python, sometimes referred to as IDLE. Once you figure out where it is and start it running, go to the file menu and choose new document or something like that. Then press F5 at any time. It should prompt you to save and then run your program. (If I recall correctly.)

I'm sure there's something unclear here, so please respond if you get stuck.
Tripitaka:
Thank you very much for your elaborate reply. Unfortunately I do seem to have miscommunicated; those steps as described by you are exactly what I dread. I feel like I do not know enough to ask the right questions, so: I want a program which lets me debug with as few actions as possible while also giving me easily accessible help/documentation. They seem to be called IDEs.
lsparrish:
IDLE (which comes with Python) is an IDE. It simplifies testing to the point where you pretty much just hit F5 then hit enter to give it permission to save, and you see the result right away. Alt-tab brings the file you are editing back to the front when you are ready to fix the bug or continue hacking. I've used it on Windows and Linux and the experience is pretty much exactly the same either way. There is also a help menu which takes you to the documentation website where you can search for whatever you are looking for.
shokwave:
This may or may not be helpful, but it sounds like you might have a use for guard which is unfortunately quite obtuse to set up, but once it's running, it will automatically perform arbitrary tasks each time the file changes. If I plan to debug something, I tend to go build a Guardfile that runs that something every time I change it. Now the steps are:

  1. Cmd-S to save.
  2. Lazily cast eyes over the other half of my screen to see results.
dbaupp:
Often, the best (not necessarily easiest, admittedly) way to do this is to have a web browser with tabs open to the documentation for the libraries you are using (doing this method gets easier and much faster with practice, as you can hold more and more of your program in your head). However, for Python, you could experiment with some of these editors. Just glancing over it, the open source editors Eric and DreamPie and the proprietary Komodo look nice: they appear to offer auto-completion and pop-up documentation, and at least Eric and Komodo have built-in graphical debuggers.
Tripitaka:
The perfect solution for my problem got listed here. Thanks again for all of you!
[anonymous]:

Great post, upvoted. I'm currently majoring in math and thinking about adding a CS major/minor for when the time comes. Using these resources may give me a better feel for whether it's something I'm cut out for.

Great post.

Coding doesn't directly improve rational thinking, but it improves logical thinking, because it is basically a generalized approach to problem solving. It is also a skill that will lead to a much better understanding of topics like game theory and Friendly AI, and it seems pretty obvious to me that in the future, more and more interesting and rewarding jobs will require coding skills. There was a time when secretaries had to use pen and paper. Then they had to learn how to use a typewriter. Then they had to learn how to use MS Word and Outlook. ... (read more)

asr:
I don't agree with the language advice. Different languages teach different things.

C is a good language for learning how the machine works, what memory looks like at a low level, and so forth. It's lousy for getting most practical work done in. Python is a nice language for munging text and other data. It's pretty good these days for numerical work, graph generation, and so forth (thanks to numpy and matplotlib). JavaScript is good if you want to use a web browser to interact with your programs, and if you're prepared to also learn HTML alongside learning to program.

My sense is that Java/C#/ObjectiveC/C++ are too complex to be really good "first languages". Useful professionally, useful for many kinds of programming, but not great for learning.

There are a lot of good intro-programming resources for Java, C, and Python. My impression is that Java and Python are the most popular "first languages" academically. I don't believe Ada has nearly as much traction in that niche, which means there will be less support. I think probably Python is the best bet, all-in-all.
Andreas_Giger:
C++ is a better language than C in every single regard, including to learn. You don't need to learn OOP or exception handling to use C++, but you can still use proper strings, streams, and so on. There is absolutely no reason to use C rather than C++, except when you're building libraries for existing C architectures. The only thing C teaches that C++ doesn't is bad habits. If you ever had to work with C++ code written by a C coder you know what I mean.

Python is a nice language, but if you want to learn how to code for the sake of improving your quality of logical thinking, I don't see any advantage it has over scripting languages, which are easier to learn. Same thing with Javascript.

By the way, Pascal was specifically developed as a language for teaching, and Ada improved on that. The only reason schools today mostly don't teach Pascal/Ada anymore is because C (and later C++) emerged as the dominant language in the industry, mostly due to its performance and because you can go all the way down to assembly if you want to. So a language great for teaching was largely abandoned in favour of a language great for making money. Similar things are now happening with web-languages like Java, Javascript, HTML5, PHP, ... So I guess it's best to decide what your priorities are and proceed from there.
[anonymous]:
Why are people downvoting this and other comments? They are on topic and are well and persuasively written. People might disagree, but surely replying is a better way of doing that than voting down?
fubarobfusco:
The Unix line editor ed (which nobody uses any more) and the typesetting system roff (whose descendant nroff is today only used for Unix manual pages) were once used by the patent attorneys at AT&T.
radical_negative_one:
What specifically is wrong with Ruby's syntax? (I don't know much about comparative programming languages.)
Andreas_Giger:
It's not that it's wrong or bad, just that it's unusual in some ways and generally not very readable. This comes primarily from Ruby treating practically everything as objects. Also, you'll be using more characters like # and @, which makes learning more difficult and frustrating. You can do without these in most languages as long as you don't use pointers. I'm not sure what you refer to when you say "comparative programming languages"...
wedrifid:
I love this feature. Apart from allowing some amazing library implementations it just leaves me with a warm tingly feeling inside.
TheOtherDave:
I still have warm memories of, when I was first teaching myself SmallTalk, trying to look up the SmallTalk equivalent of a for loop in a reference chart and being unable to find it, and later discovering that "to: by: do:" was defined as a method on class Integer. This delighted me in ways difficult to express.
handoflixue:
Depends on your goal. C# is wonderful for dipping your toes into coding and immediately being able to make something useful to yourself. Javascript is an incredibly flexible language that doesn't require anything more than a browser and text editor. Ada is great if you want to invest in a more hard-core understanding of how code works. I personally start people on either Javascript (no messing about with installing compilers, no need to understand types) or C# (amazing UI, perfect for a novice to lay out applications in - most people are used to windows, not terminals!!)
Andreas_Giger:
C# is a great language, but not a good starting language because you need to deal with a lot of secondary elements like header files, Visual Studio with its unintuitive solution system, or the .NET framework before you can code anything of practical value. If you want to learn coding for the sake of improving the quality of your logical thinking, low-level languages are the way to go, and I'm not aware of any language that teaches this better than Ada.

If you want to see quick results, go learn a scripting language. They're all pretty much the same; Lua just seems to be the most popular these days.

There are also lots of "esoteric" languages that are designed to fit specific (often absurd) programming philosophies rather than maintain functionality; I know there are some that aim to make coding as painful as possible (Brainfuck and Whitespace come to mind), but there may also be some that teach programming logic especially well. I'm not particularly knowledgeable about this huge field of languages, so I leave recommendations to someone else.
handoflixue:
I'm not sure when you last used C#, but solutions are only used if you want to group 2+ separate projects together, are fairly intuitive (there's a single unintuitive bit - running the main project doesn't rebuild the others, but it DOES pop up a warning to that effect), and I don't think I've ever seen someone struggle with them outside of the complexities of a multi-user office where the solution file is stored in subversion (solved by not storing solution files in SVN!)

Equally, I'm not sure why the ".NET framework" would add any complexity. The ability to look at a control in a UI and see all of its attributes is something most people find a lot more intuitive. The ability to double-click a control and add an obvious default event is also very helpful, in my experience with teaching.

Header files, I will concede. For basic programs, C# automatically includes the necessary ones, however - so it's not something that really comes up until you're doing more advanced code.
RobertLumley:
I agree with the second half of this. Pick a language that suits your needs. I use Visual Basic extensively, for interfacing with spreadsheets, and I wrote a bejeweled program (which sucked, since it took comparatively forever to get the color of a pixel and tell me what color the square was) in AutoHotkey, an awesome program that will let you remap hotkeys on your computer. I know a bit of PHP and C++, but the vast majority of what I do is in VB and AutoHotkey, because that's what's most accessible to me.