
Just Try It: Quantity Trumps Quality

62 Post author: atucker 04 April 2011 01:13AM

Followup to: Don't Fear Failure

In the same vein as the last article, I think that failure is actually pretty important to learning. Rationality needs data, and trying is a good source of it.

When you're trying to do something new, you probably won't be able to do it right the first time, even if you obsess over it. Jeff Atwood is a programmer who says Quantity Always Trumps Quality:

The ceramics teacher announced on opening day that he was dividing the class into two groups. All those on the left side of the studio, he said, would be graded solely on the quantity of work they produced, all those on the right solely on its quality. His procedure was simple: on the final day of class he would bring in his bathroom scales and weigh the work of the "quantity" group: fifty pounds of pots rated an "A", forty pounds a "B", and so on. Those being graded on "quality", however, needed to produce only one pot - albeit a perfect one - to get an "A".

Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the "quantity" group was busily churning out piles of work - and learning from their mistakes - the "quality" group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.

Where have I heard this before?

  1. Stop theorizing.
  2. Write lots of software.
  3. Learn from your mistakes.

Quantity always trumps quality. 

When it comes to software, the same rule applies. If you aren't building, you aren't learning. Rather than agonizing over whether you're building the right thing, just build it. And if that one doesn't work, keep building until you get one that does.

The people who tried more did better, even though they also failed more. Of course you shouldn't try to fail, but you shouldn't let the fear of failure stop you from trying.

I wouldn't go so far as to say that quantity always trumps quality, but where the cost of failure is low, making lots of attempts and paying attention to the failures is a pretty good way of learning. You should hold off on proposing solutions, but you also need to get around to actually trying the proposed solution.

I'm normed such that I'll spend more time talking about whether something will work than trying it out to see whether it does. The problem is that if you don't already know about something, your thoughts about what will work aren't going to be particularly accurate. Trying something will demonstrate quite conclusively whether or not it works.

Note:
I originally had this as part of Don't Fear Failure, but that post got too long.

Comments (82)

Comment author: [deleted] 04 April 2011 03:13:56PM 16 points [-]

I agree with the spirit of this post, but I think you're leaving out an important part. It seems from the other comments that the experiment described never really happened. I think if it were tried it wouldn't really work out as described. I know if I had been in that class and been in the quantity group, I would have made some really crappy, really thick pots and been done as quickly as possible in order to goof off for the maximum amount of time. If I had been in the quality group I wouldn't have theorized about it; I just would have iterated on a lump of clay - making a pot and then not firing it or anything, just mashing it back into a lump and starting over until I got to a really good one. I think I would have learned a lot more about pot making in the quality group.

What I have read and also experienced is that producing quantity is necessary but not sufficient for producing quality. If you want to get really good at something, rather than just getting somewhat good and then plateauing, you have to not only do it a lot, but you have to care deeply about how good you are doing, identify your weaknesses and work specifically to improve those. The problem with your story is that the quantity kids have no incentive to produce quality, so they probably just won't.

For example, I'm a self-taught programmer. So, in the beginning I wrote some truly atrocious code. I got good at coding when my livelihood depended upon producing and maintaining a large complicated system. The fact that I have to maintain and improve upon this codebase in the future made me a lot better because it made me suffer for my sins, and really care about not repeating them. And obviously having my income depend on it made me care about bugs and performance and stuff a lot more. If I had just been tasked with writing a lot of code, and was paid based on line count or something, I bet I would have written a lot of code and it would have all sucked.

Comment author: pjeby 04 April 2011 04:05:31PM 11 points [-]

The problem with your story is that the quantity kids have no incentive to produce quality, so they probably just won't.

No incentive? Don't you think they signed up for pottery class to, you know, learn how to do good pottery? That nobody wanted to be proud of their work?

(Btw, I heard this pottery story from a different source, and IIRC it was an adult pottery class, not a kids' one.)

Comment author: drethelin 04 April 2011 04:47:56PM 7 points [-]

I would say the majority of classes are signed up for because they're easy or part of required credits for a program.

Comment author: ameriver 05 April 2011 01:29:22AM 1 point [-]

I'm not sure that's true once you limit it to adult classes (far more likely to be taking the occasional class for fun), and particularly in the case of an art class.

Comment author: Mark_Neznansky 09 April 2011 05:57:18AM 0 points [-]

A "class for fun" implies that grades shouldn't matter to the participants, so, allegedly, the two different grading schemes wouldn't affect the participants' behavior.

But things (such as motivation) change when a person who did pottery for fun at home goes to do pottery for fun in a class, don't they?

Comment author: [deleted] 04 April 2011 04:39:09PM *  2 points [-]

Ten years, or ten thousand hours, of "deliberate practice" - PDF link - is what's typically talked about these days as what it takes to become one of the greats. Yeah, I know - ten years??? But "deliberate practice" of any length is better than messing around, and will get you good - just not world-champ level - long before ten years. That sounds like what you're talking about.

Comment author: NancyLebovitz 04 April 2011 03:56:49PM 0 points [-]

What I have read and also experienced is that producing quantity is necessary but not sufficient for producing quality. If you want to get really good at something, rather than just getting somewhat good and then plateauing, you have to not only do it a lot, but you have to care deeply about how good you are doing, identify your weaknesses and work specifically to improve those. The problem with your story is that the quantity kids have no incentive to produce quality, so they probably just won't.

A different angle from Kenny Werner, though this is about music-- his Effortless Mastery is about learning to be calm and non-judgmental, then gradually adding more challenge (achieve state, approach instrument, touch instrument, pick up instrument, make sound without aiming for good sound, then slowly add work on getting skillful without disrupting meditative state).

I haven't seen this approach applied to purely cognitive work like programming, though this doesn't seem impossible.

Comment author: TheOtherDave 04 April 2011 04:25:09PM 2 points [-]

This is also more or less the approach I adopted to physical therapy, after my stroke, to pretty good results.

Comment author: j_andrew_rogers 04 April 2011 06:11:50AM 15 points [-]

This story is about rapid iteration rather than quantity. The "quantity" is the detritus of evolution created while learning to produce a perfect pot. If a machine were producing pots, it would generate great quantity, but the quality would not vary from one iteration to the next.

There are many stories and heuristics in engineering lore suggesting rapid iteration converges on quality faster than careful design. See also: OODA loops, the equivalent military heuristic.

Comment author: atucker 04 April 2011 11:11:18PM 2 points [-]

That's true.

Originally, this post was part of "Don't Fear Failure". I intended for it to talk about low-cost failures, how practicing helps, and then do a bit of talking about how rationalists should be able to pay mindful attention to their mistakes in order to learn from them before they get right back up and try again.

So basically advocating rapid iteration after desensitizing people to failing.

However, I wasn't quite able to tie it all together and it just felt like it dragged on. So instead I split it up into a post which says that failing isn't that bad, and another about how practice pays off.

I could follow up with another post which more clearly spells out rapid iteration, but that might be a bit much. I'd rather move on to talking about perfectionism and unduly favoring the status quo.

Could be wrong though.

Comment author: suecochran 09 April 2011 11:00:18PM 12 points [-]

I'm brand new to Less Wrong, and very pleased that I found a topic right away that I have given a great deal of thought to already, since it's affected me throughout my life. I grew up with a mother who was constantly critical, and stingy or withholding of praise, with the result that my sister and I, who are both in our late 40s, still converse about the negative effect that my mother had on us when it comes to making mistakes and attempting to do new things.

I used to feel that I was being scolded because I didn't know something I "ought to have known" in advance. I'm not referring to breaking some established rule in the family. I'm talking about being blindsided by sudden harsh words in a loud volume about something I had never heard about or considered before, something that I had had the nerve to "get wrong." This happened often enough that I began thinking that I had better not try to do things unless I knew EVERYTHING there was to know BEFORE I took any action. This, not surprisingly, had the effect of paralyzing me into inaction, fearing the reprisals for "mistakes", and of course, the judgment about what was right and what was wrong was based on what my mother thought about the issue, which was largely subjective.

As my sister and I got older, we began challenging my mother about her views and how she spoke to us. She was quite unhappy about being challenged by her daughters, who had once been so docile and albeit unhappily, accepting of her criticism and punishment. I did many years of therapy, starting in my early teens, I also read many books and articles about and took several courses in psychology, and did the est training (a personal growth seminar) in 1983. My sister and I both eventually came to the understanding that not only do you not have to know everything about an endeavor before embarking upon it, you CAN'T know what you need to know EXCEPT in the process of doing it. There is a reason it's called "trial and error". You don't learn anything when you know how to do something and get it right the first time. You learn when you make mistakes, and you find that you need to keep working at getting it, yes, "less wrong".

My son who is 15 now, is in his first year of High School at both his neighborhood HS and at an engineering program at a local magnet HS. His class had a project that incorporated their biology and engineering principles coursework that was due in January. They were assigned teams, and had to come up with a hemodialysis machine. There is a company that supplies the school with a synthetic blood, which is filled with a substrate, and the teacher provides a selection of components the kids have to use in their design. The team has to prove that their device filters the distillate material out of the synthetic blood.

Matthew told me that all the other teams designed a machine and stuck to their original design, whereas he kept experimenting and coming up with different designs. His team didn't do any of the designing; they saw that Matthew knew how to take charge, and he just delegated to them the tasks that would assist him in completing the machine. They got very concerned that he kept changing his design, but when they asked him why he was doing that, he just said "I have to get it right, and until it's right, I won't use the design." He had the team present the paper explaining the machine they finally built, and he demonstrated how it worked. They got a 95%. I am very proud of him. I have worked hard on raising him without making the same mistakes that my mother made (so I've given him a bunch of different mistakes ;) ). He has always been told that it's not only OK to make mistakes, they are necessary stepping stones on the pathway towards accomplishment and knowledge.

Comment author: Chala 04 April 2011 09:50:13AM 8 points [-]

Fear of failure is a big problem in my life right now. It's why I don't have a job, since I'm silly and am afraid of being rejected. This reframed something I think I already knew, but I'm sure it will help anyway. Time to really get on to things now.

Comment author: Kaj_Sotala 05 April 2011 07:42:34AM 6 points [-]

I found that looking for jobs got a lot easier when I stopped thinking of it as a process evaluating my worth for the employer. Instead I started thinking of it as a process where I look for the employer who deserves me by virtue of realizing just how valuable I am.

Comment author: NancyLebovitz 05 April 2011 12:59:55PM 1 point [-]

I've also seen a recommendation that authors think of themselves as displaying their work to editors rather than submitting it.

Comment author: Chala 05 April 2011 09:17:40AM 1 point [-]

It is still frustrating to be ignored for a position which you would be more than adequate for, and at which you are confident you would be harder-working and more diligent than the hired help.

I guess that's one of the things that bothers me: having to jump through arbitrary hoops in a pointless process that fails to relate to reality. Also, I probably just don't need/want a job that much ;)

Comment author: Kaj_Sotala 05 April 2011 10:06:19AM 1 point [-]

It is still frustrating to be ignored for a position which you would be more than adequate for, and at which you are confident you would be harder-working and more diligent than the hired help.

In that situation, I try to just shrug and think it's their loss. :-)

Comment author: teageegeepea 05 April 2011 05:30:44AM 2 points [-]

What's your work/educational background?

I delayed looking for work in the past because I didn't actually need the money. I do have more expenses now, but not enough that I feel ambition for anything better (even within my company). I'm kind of okay with that, but I'd rather not slip up enough to lose this pretty decent gig and have to find another one. In case that reminds anyone of the motivation in Office Space, I am indeed a programmer like everyone else on the internet.

I think fear of uncomfortable interactions applies more in regular social situations for me. I've started practicing acting extroverted by just talking to strangers on the street or wherever, confident that I'll never see them again and there are no consequences of bad impressions I might make. Sometimes it results in talking too fast or unclearly though (that also happens at work).

Comment author: Chala 05 April 2011 09:13:56AM 1 point [-]

I have a degree in biomedical science, aka a totally useless degree. I'm going back to uni, and am just looking for part-time work in the meantime - since I'm finally at the stage of actually really needing the money.

Also, I'm going back to uni to do computer science and become a programmer. I guess everyone on the internet really is in that profession ;).

Regarding social interactions, I am actually (and unusually for less wrong) a very social and extroverted person. I spend much of my time at the moment socialising - ergo my need for money. Socialising is expensive.

However, I am also one who has been conditioned not to try; my upbringing was such that any failure was focused on and any triumph taken for granted. That's no excuse, but I still tend to avoid situations where I am tested, at least when I am not guaranteed to triumph (e.g. I am totally 100% OK in academic assessments).

Comment author: mutterc 05 April 2011 03:14:42PM *  1 point [-]

Job-hunting fits very well with the model in "Don't Fear Failure": the downside risk is zero. The worst case is accepting a bad job. Assuming you're a USian, jobs are at-will, so just leave then, and you're no worse off.

As a job-hunter, I've learned to model the probability of getting any one job as infinitesimal, so I don't get too hung up on any one application. Let them do the rejecting.

Comment author: Gray 05 April 2011 04:02:22PM 1 point [-]

Assuming you're a USian, jobs are at-will, so just leave then, and you're no worse off.

Is this true? I was always told that employers look down on a spotty employment history; they are less likely to hire someone whose job history is littered with jobs that have been held for less than a year.

Comment author: mutterc 05 April 2011 08:54:52PM 4 points [-]

Yes. (Their worst-case scenario: You're a "professional plaintiff" who hires on, sues for something or other, gets a (confidential) settlement, and moves on).

They also look down on being in the same job a long time (assumption: lack of motivation to advance, etc.). And they look down on gaps in employment (assumption: you were in prison).

To summarize the summary of the summary, HR reps hate people.

Comment author: NancyLebovitz 05 April 2011 04:00:44PM 1 point [-]

I've heard people claim that leaving a job in less than a year looks very bad on the resume. True?

Comment author: suecochran 09 April 2011 10:30:01PM *  0 points [-]

I have heard that many times over the course of my adult working life. I tend to agree with it mostly, although I doubt that it applies equally to all types of work, and it may have been more true in the past than it is in today's economy and with today's technology. I would think that it could vary wildly between say a position such as "Office Manager" and that of "Newspaper Reporter". The reason(s) for leaving would matter a great deal as well. Leaving a job for a much better job (better pay, more prestige, etc.) is quite different than leaving a job due to personality clash or poor work performance. There also could be a big difference depending upon the values of the employer in charge of doing the hiring. The person(s) with decision-making responsibility might place more emphasis on other traits and accomplishments, and not care terribly much that the employee left a job or jobs after a short time of being employed.

Comment author: Kaj_Sotala 04 April 2011 09:30:48AM 19 points [-]

Long ago, I forget where, I saw a blog post that applied this to writing. It pointed out that if we model the quality of your writing as having a mean X and variance Y, then the only way to hit those unlikely exceptionally good texts is to write a lot. Yes, while doing so you might also come up with the same number of exceptionally bad texts, but nobody forces you to show those to anyone. Plus writing a lot will give you practice, gradually pushing up the mean.
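The mean/variance model sketched above is easy to check with a toy simulation (my own illustration, not from the thread; the mean of 50 and standard deviation of 10 are arbitrary numbers): if each text's quality is an independent draw from the same distribution, the expected quality of your best text rises steadily with the number of texts you write, even before practice moves the mean.

```python
import random

random.seed(0)

def best_of(n, mean=50.0, sd=10.0):
    """Quality of the best of n texts, each drawn i.i.d. from N(mean, sd)."""
    return max(random.gauss(mean, sd) for _ in range(n))

def avg_best(n, trials=2000):
    """Average best-of-n quality over many simulated writers."""
    return sum(best_of(n) for _ in range(trials)) / trials

# More drafts -> a noticeably better best draft, purely from sampling.
for n in (1, 10, 100):
    print(n, round(avg_best(n), 1))
```

Note that the same sampling argument also produces more exceptionally bad texts; the asymmetry comes entirely from being free to publish only the best ones.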

From personal experience, I'd also err on the side of publishing even texts you're not personally all that impressed by. I've noticed that I'm relatively bad at estimating what's going to be popular. Some of my biggest hits have been blog posts I'd never have thought would be popular.

Comment author: Alicorn 04 April 2011 02:59:12PM 16 points [-]

Writing massive amounts of text also helps with self-estimation as a person who can write arbitrary amounts of text. My evolution as a writer started with me going "meh, I can't finish anything, I have all these ideas that I sometimes start but then I lose interest after a few pages". Then I started writing collaboratively with a friend, which was so much fun that I could say, "I like writing, at least of this kind, so much that I was on Utah time in Scotland to stay up and write more, for as much as seventeen hours straight." Then I wrote a finished novel... it was fanfiction, but I already considered worldbuilding my strength, with character creation also on said list. Then I did it again. At some point I started being "a writer", who can decide to do things like "write a book" and have books exist as a result of this decision.

Comment author: rhollerith_dot_com 04 April 2011 10:16:11AM *  6 points [-]

Maybe so, but I'm not going to keep watching someone's blog or eir user page here unless eir average quality is quite high.

Comment author: Zvi 06 April 2011 02:45:41PM 5 points [-]

There's a difference between average quality produced and average quality published. Ideally you sit on the stuff that isn't any good.

Comment author: rhollerith_dot_com 06 April 2011 10:52:48PM 1 point [-]

We are talking about writing. Do you really think that most writers who need to improve know which of their writings isn't any good?

Comment author: Zvi 07 April 2011 11:11:59PM 2 points [-]

I have been an aspiring writer of sorts, and wrote articles at least once a week for several years without getting much if any feedback on the quality of my writing as opposed to its content. It is fairly easy for me to look back and see a steady improvement in writing quality. I also usually (not always) have no trouble knowing which of my writing isn't any good, and don't remember it having been otherwise.

I could be deluding myself but I certainly think some of my writing is better and some of it is worse.

Comment author: rhollerith_dot_com 08 April 2011 04:54:25PM 0 points [-]

I see. Thanks.

Comment author: Zetetic 07 April 2011 08:50:25AM *  1 point [-]

Well, I'm not an aspiring writer but as an amateur musician and visual artist I can say that I can generally tell when my works are not as good as I would like them to be, and I can generally guess ahead of time which pieces will have a better reception among my more critical friends.

In addition, I know a couple of aspiring writers, and if anything I would say that they are often very self-critical; judging by what they have openly shown me against what they have reluctantly shown me, I would say that they did indeed have a good sense of what was good and what wasn't.

Based on my experiences, I would say that it is fairly common for artists to have a fairly accurate awareness of their own shortcomings; whether or not they can successfully ascertain a workable procedure for overcoming them is a different issue.

Comment author: Kaj_Sotala 04 April 2011 04:00:40PM *  3 points [-]

If one's average quality is low enough that people don't find it worth becoming regular readers, then one is probably better off practicing a lot anyway.

Comment author: sark 04 April 2011 09:23:37PM 0 points [-]

Yes but they assess your blog mostly on its most recent posts. So you should just be out with it and improve anyway. This way you'll always have the best audience your skills can currently get you.

Comment author: rhollerith_dot_com 05 April 2011 12:29:53AM *  4 points [-]

Hmm. Almost all the blogs I continue to follow update infrequently. I always assumed that that was because the bloggers had some way of telling which of their posts or posts-in-planning are really good and had a policy of only posting those, rather than posting every writing exercise they undertake.

Comment author: sark 05 April 2011 12:58:39AM 5 points [-]

Hmm, also, writing can be an excuse for generating ideas! The obvious thing to do would be to wait for good ideas to come into your head, then write them up for the world to see. But in my experience, writing and having my writing read boosts my ego, which somehow encourages my subconscious to throw up ideas which can be written up to derive yet more ego boosts. It's a virtuous cycle. Which makes not writing because you can't think up any good ideas a vicious one.

Comment author: Kaj_Sotala 05 April 2011 07:37:28AM 3 points [-]

This is very true. Not only does writing help me come up with ideas of unrelated posts, writing a post often gives me ideas for further related posts as I think about the issue more.

I've seen at least one professional writer refer to creativity as a muscle - it gets stronger the more you use it. This seems right in my experience.

Comment author: Zetetic 07 April 2011 11:18:05PM 1 point [-]

I strongly agree with this sentiment. I keep a folder for this very purpose; whenever an interesting thought comes to mind I type it up along with as many of the related strands of thought as I can as quickly as possible and then save it to the folder and move on. I've found that this is a fairly useful procedure for organizing my thoughts and documenting my progress in my various areas of interest.

Comment author: Mark_Neznansky 09 April 2011 08:11:01AM 0 points [-]

Wouldn't doing that (instead of writing up the whole argument in a full text) make you feel as if you've already achieved the materialization of the idea, hence reducing your motivation to write it in the future (which might lead to never actually writing the text)?

Comment author: Zetetic 10 April 2011 05:37:35PM *  1 point [-]

I'm only talking about rough sketches, very short, maybe three or four paragraphs. The material in and of itself is not something I would even want to use in the future. Think of it as an artist's sketch pad; I practice (1) the purely technical aspect of my writing, (2) my ability to quickly convey ideas, and (3) consolidating information that was previously floating about in my skull into a nice package by anchoring it to a single event.

It's much the same with guitar or visual art (at least for myself): I may work creatively on one technique by using it to write some nice riffs on guitar, or I may try to consolidate the technique of pointillism into my repertoire by using it to draw a face or a landscape. The outcomes of each of these mini-studies are not ends in themselves, but rather stepping stones to mastery.

Comment author: taryneast 04 April 2011 01:18:22PM 3 points [-]

Have you tried NaNoWriMo?

It's really good for putting this into practice :)

Comment author: Kaj_Sotala 04 April 2011 03:29:53PM 3 points [-]

Won it twice, though those "stories" were pretty solid garbage and I suspect it taught me a bunch of bad habits. It was good for showing that I can do things if I put enough effort into them, though.

Comment author: taryneast 05 April 2011 08:56:35AM 2 points [-]

Yeah - but the idea isn't to write something brilliant - it's to get into the practice of writing every day. If you won - then you did that perfectly :)

Also - if you really are interested in taking it further, read Stephen King's "On Writing", in which he points out that every first draft is terrible. You make a real book after the first draft is over. In fact - if you haven't already, you might be interested in picking up the NaNoWriMo handbook "No Plot? No Problem!" - it all made a lot more sense after I read that.

Comment author: Kaj_Sotala 05 April 2011 09:59:41AM *  3 points [-]

Thanks. Maybe I should look into those.

Though it should be noted that while "every first draft is terrible" may be correct for Stephen King, it's not necessarily correct for everyone. There are writers who only do minor revisions to their first draft, while others do several drafts before it gets great.

Comment author: taryneast 05 April 2011 10:15:43AM 2 points [-]

That's true - though I think it'd be safe to say that "every first draft is terrible" applies to the vast majority of writers... for sure there are Mozarts in the world of writing, but I'd be very surprised if there were many.

Comment author: Kaj_Sotala 05 April 2011 12:48:16PM *  1 point [-]

Most of my short stories tend to be first drafts, not counting minor edits like changing individual words or making occasional refinements to sentence structure. But then my short stories really are pretty short, so I don't know to what degree this will generalize to novels yet.

As for my non-fiction books, the process has usually been such that the concept of a first draft isn't really valid. I don't write them straight through and then refine, instead I'll write parts of one chapter and then another in non-linear order, then revise some of what I've already written, toss out parts of the book to be replaced with something better, and so forth. By the time I finally have a "first draft" of the whole book, large parts of the content have already been edited several times.

Comment author: taryneast 05 April 2011 12:57:50PM 0 points [-]

From my (limited) experience, novels are very different to short stories. You can hold the whole concept of a short story in your head at one time.. but novels are big and slippery. Things change and develop as you write - so you often have to go back and rewrite or even delete huge swathes of things to fit the new pattern and flow. There are some people who can do without this - but IMO they're either brilliant writing geniuses (to whom we should not compare ourselves) or they're incredibly experienced writers who started out by going through the major-overhaul process... but who have now honed their talent so much they no longer need it... leaving us still in need, generally, of rewriting.

This is not to say that maybe you're an exception - or maybe you're much better at planning (and sticking to the plan) than me :)

Comment author: tristanhaze 29 January 2012 05:17:17AM 0 points [-]

Come to think of it, "every book is terrible" may also be correct for Stephen King.

Comment author: Eliezer_Yudkowsky 04 April 2011 01:18:54PM 1 point [-]

But that's just wrong. If you're doing it right your mean creeps steadily upward and that's how you hit high points.

Comment author: wedrifid 04 April 2011 01:59:40PM *  3 points [-]

Plus writing a lot will give you practice, gradually pushing up the mean.

But that's just wrong. If you're doing it right your mean creeps steadily upward and that's how you hit high points.

Not all that wrong it would seem.

Comment author: Eliezer_Yudkowsky 04 April 2011 04:20:43PM -2 points [-]

The "plus" is right, the main idea is wrong.

Comment author: wedrifid 05 April 2011 02:06:24AM *  9 points [-]

Personally I would have put the main idea as the 'plus'. Perhaps overstated but clearly not wrong.

If the quality of works is distributed around a mean, then the more works you produce, the more likely it is for a high-quality work to emerge. The most remarkable works will come from the very best authors when they are having a really good day (or month or year). Producing more from the same distribution will obviously give more chances to produce something outstanding.

On a related note a 'one hit wonder' can be said to be regressing to his mean when his other works flop.

Not 'just wrong'. It's just obvious and less important overall than the training effect.

Comment author: Steven_Bukal 21 May 2011 03:29:15PM 3 points [-]

Paul Graham said something very similar about figuring out a program:

"I was taught in college that one ought to figure out a program completely on paper before even going near a computer. I found that I did not program this way. I found that I liked to program sitting in front of a computer, not a piece of paper. Worse still, instead of patiently writing out a complete program and assuring myself it was correct, I tended to just spew out code that was hopelessly broken, and gradually beat it into shape. Debugging, I was taught, was a kind of final pass where you caught typos and oversights. The way I worked, it seemed like programming consisted of debugging.

For a long time I felt bad about this, just as I once felt bad that I didn't hold my pencil the way they taught me to in elementary school. If I had only looked over at the other makers, the painters or the architects, I would have realized that there was a name for what I was doing: sketching. As far as I can tell, the way they taught me to program in college was all wrong. You should figure out programs as you're writing them, just as writers and painters and architects do."

As a CS student currently at university, my experience has been identical. I also can't help but notice a similarity between these ideas and the methods of agile software development, where at each iteration you produce a working approximation of the final software, taking into account new information and new specification changes as you go.

Comment author: PhilGoetz 21 September 2011 12:27:52AM *  3 points [-]

The people who taught you to architect programs before coding were also aware of this trade-off.

It's a lot easier to write the small programs assigned in college that way than the large programs you will write in the real world.

This is not the top-down vs. bottom-up debate; both top-down and bottom-up design architect first.

It is related to the concept of waterfall/iterative/incremental design; incremental designers can paint themselves into a corner.

I've written a lot of big programs, and I've never regretted the time spent architecting them. I have sometimes wished I had spent more time designing them before starting.

Fatal design errors that crop up down the road are more likely to be language-related: your program gets so complicated that you have to rewrite it in a compiled language to run it in real time or to avoid running out of RAM; or you discover that C++ templates don't work as advertised, or that Java can't allocate 2G of RAM, after you've already written 3000 lines.

Comment author: deepthoughtlife 07 April 2011 06:58:59AM 2 points [-]

So far as I can tell, the real issue in telling someone that the only important thing is quality is that it leads to a phenomenon known in some circles as "paralysis by analysis." For instance, a writer could spend a day debating whether or not a particular place needed a comma, and miss that the whole page is rubbish. In sports, it is often what is meant when someone is accused of "thinking too much." In football, a receiver might spend his time thinking about how to get around a defender once he has the ball, and forget to catch the ball.

Like Jeff Atwood, I am a programmer. Unlike Jeff Atwood, I do not have a Wikipedia entry -rightfully so. Also, unlike Jeff, I'm pretty new: unseasoned. So, unlike Jeff Atwood, I still remember the process of learning how to be a programmer.

As far as I can tell, this entry fits with my experiences so far in improving myself as a programmer. I didn't get better at it by theorizing about how to make a beautiful program; in fact, when I tried, I found out the basic truth every good programmer knows: "If you're just barely smart enough to write it, you are, by definition, not smart enough to debug it." I spent weeks thinking about it, getting nowhere, before I used a brute force technique to fix the trouble spot within hours, and still ended up with a pretty nice program.

I must take issue with what Jeff Atwood wrote, though. The vast majority of time in a nontrivial program is spent thinking, whether beforehand or while you're trying to parse the kludge that steadfastly refuses to work. The kludge, insoluble mass that it is, can be immensely harder to fix than to replace, but the natural impulse is always to fix. It isn't immediately obvious, but the solution you have written has an immense hold on your mind, especially since the actual act of entering it took considerable time and effort, so some due diligence in the initial decision is highly warranted.

Many programmers love iteration, which could be described by analogy as follows. Take a lump of clay that is to be a statue of a man. Make it the size of the man. First iteration complete. Form arms and legs and a head roughly where they go. Iteration two complete. Carefully delineate the broad details, such as knees, elbows, wrists, necks, ankles, and the shape of the torso. Iteration three complete. Make clear the placement of medium details, such as fingers, toes, ears, eyes, mouth, nose. Iteration four complete. Add the fine details, roughly. Delineate joints, add nails, add scars, add hair. Iteration five complete. Make the joints, and nails, and scars, and hair, and all the other little details just about right. Iteration six complete. Decide what to improve and do so. Iteration seven complete. Check for errors. Fix them. Repeat until done. Iteration eight complete.

The analogy is actually pretty close. The only problem? Each iteration listed above could, and usually would, actually involve several iterations and testing steps.

The other major solution is far more straightforward. Make the clay look like a man. Step one complete. Is it correct? If no, repeat. Done.

The second way has a much greater quantity of results, because it is a simpler and quicker way to make a figure of a man. The superiority of the iterative approach comes in when it is clear you will not get it right in any one try. If I make no mistakes, I may take 24 steps in an iterative approach where I would take 2 in the one-go approach. The steps are, however, not equal in length (the iterative steps are much shorter). Let us call it 8 to 2.

This still looks like a slam dunk for the all-at-once approach. With errors, it stays that way under a low quality standard: if any of the first three all-at-once attempts is good enough, all at once is still better. Once an extremely high quality standard for the end result is used, however, it is quite likely that not one of the first 25 all-at-once clay men will be good enough. Even with a high standard, iteration is unlikely to need an amplification of more than 2 or 3. Fifty steps versus 24 now makes it a slam dunk in favor of iteration.
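
A back-of-the-envelope version of this arithmetic, with the success probabilities as illustrative assumptions:

```python
def expected_steps_one_shot(steps_per_attempt, p_success):
    """Expected total steps when each all-at-once attempt succeeds
    independently with probability p_success (geometric distribution:
    on average 1 / p_success attempts are needed)."""
    return steps_per_attempt / p_success

def expected_steps_iterative(base_steps, amplification):
    """Iterative work rarely restarts from scratch; mistakes inflate
    the base step count by a small factor instead."""
    return base_steps * amplification

# Low quality bar: an all-at-once attempt usually works early on.
low_bar = expected_steps_one_shot(2, p_success=0.5)      # ~4 steps
# High quality bar: fewer than 1 in 25 one-shot attempts is good enough.
high_bar = expected_steps_one_shot(2, p_success=1 / 25)  # ~50 steps
# Iteration: 8 base steps, amplified by 3 for mistakes along the way.
iterative = expected_steps_iterative(8, amplification=3)  # 24 steps

assert low_bar < iterative < high_bar
```

The crossover point the comment describes is just where the two expected-step curves meet: below it, all-at-once wins; above it, iteration does.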

In the end, both methods actually would give about the same amount of experience, (though I don't have space to justify that here, and good arguments against it could be made) almost regardless of the number of steps necessary. Somewhere in the middle must be a crossover point, between iteration and doing it all at once. It behooves us to figure out where we are in relation to it.

The long-winded point here is that quantity (iteration) can produce high quality quicker than quality (getting it right straight up), but only some of the time. A low quality standard is analogous to a low cost of failure, and a high quality standard to a high cost of failure. For beginners, the standard is low, and just doing it is probably the best way (though they still need to understand in order to make a real attempt).

For pure personal learning, among the experienced, it is far trickier to be sure. True failure is highly costly, as you learn the wrong thing, but it also is less likely, and minor failures can be learned from.

I'm relatively sure Jeff Atwood understands all of this, of course, but it isn't immediately obvious from his writing here. I'm not some guru, and this isn't divine wisdom either, but it is always a good idea to keep in mind what is so basic that the expert has forgotten to mention it exists. After all, he is here to push for the deviation, not simply the status quo.

Comment author: Morendil 04 April 2011 12:17:51PM *  2 points [-]

See also: NaNoWriMo.

Comment author: CronoDAS 04 April 2011 02:02:58AM 1 point [-]

Did you write the title backwards?

Comment author: atucker 04 April 2011 02:18:41AM 0 points [-]

Not intentionally, but it certainly looks better this way.

Comment author: Mark_Neznansky 10 April 2011 06:00:35AM *  1 point [-]

I wish to expand on your conclusions and look for their limits. It might be more relevant to the "Go Try Things" post, but since this is a kind of series of posts, I suppose it makes most sense to comment here.

So, data collection is good. But aside from getting one better at some area in which one tries to reach expertise or improvement, data collection is also good for discovering almost totally new facets of reality: territory that is outside the map's margins.

Data collection brings to light not only known unknowns, but unknown unknowns too. There's a risk involved, however. It seems that for the most part, the opportunity cost of researching unknown unknowns is greater than that of researching known unknowns: when practicing anything, the costs and possible benefits are pretty well known. You know what you have to do to get better at playing an instrument, building better robots, programming, or dancing tango. You also pretty much know what the fruits of that labor are (though perhaps not entirely, especially when it is many "quantum steps" away in terms of skill expertise).

On the other hand, when you consider whether to delve into some new unknown territory, you're less familiar with the costs (you don't know whether you'll enjoy "uncovering data" or not, for example) as well as with the possible utility. Let's say some person A is invited to a salsa dancing party or class. He considers the idea but decides not to go. He thinks about how it will obviously take a few hours which he could invest in more familiar activities that yield more utility than dancing; how it will probably have some social costs, as in any unfamiliar new endeavor, especially one that involves moving one's body; how even if he enjoys it he doesn't think he'll have the time to invest in more such occasions, and he doesn't think doing it once will be very useful; etc.

However, what if this person is unaware that salsa, if he were to try it out, would greatly benefit him? Elevate his spirit, exercise his body, and provide some new kind of social interaction that would benefit him on non-dancing social occasions; and, if he decided to fully incorporate it into his life, provide excellent rest from his usual activity (say, his profession) and even benefit it in other ways?

So it must be beneficial to also collect data outside of the map, to explore new frontiers and horizons. But there must be a limit to this. The great many activities the world provides could probably fill a few lifetimes of a human being (or maybe not?). Either way, there must be some point where more exploration is actually adverse in its effects, if no activity is engaged in more than superficially. So how can one decide whether to embark on exploration or not?

Of course, there is meta-data available on activities. There is some text on the internet for probably most tried-out activities, friends share their experience with things they've done, movies and books tell us about activities unknown to us, and so on. But would such data actually help a person decide whether to engage in an activity? Is it overwhelming enough to "change his mind" from not doing the activity to doing it? My guess is not. Most people (as noted in "Hold Off On Proposing Solutions") probably decide whether they want to engage in the activity upon first hearing about the opportunity. More than that, I suspect their decision is based less upon the nature of the activity and more upon the nature of the "activists", the people who are commonly engaged in it. Many activities produce some kind of culture around them, which can hardly be ignored. For an activity to exist it needs to be done, and if it is being done then someone must be doing it; so to imagine that activity one must imagine someone doing it, or imagine oneself as the kind of person who does it. (Of course, if it is taken more seriously, one can imagine the activity more "naturally", ignoring the nature of the other people who engage in it.)

To actually decide whether to engage in some new activity, one needs to take the decision seriously. But then, to avoid such "paralysis by analysis", it would probably be easier just to start doing it instead of thinking about it (with the exception of activities with really high costs, such as exploring the South Pole or conceiving a child). But then again, there must be a limit to the number of "new things" a person can do. Some people are likely (have a high probability) to greatly benefit from exploration, while others are unlikely to benefit from it. How can one recognize which one he or she is?

What do you think?

Comment author: jwhendy 05 April 2011 08:01:22PM *  1 point [-]

You should hold off on proposing solutions, but you also need to get around to actually trying the proposed solution.

I can almost swear that I was reading a post in the sequences where a statement was made to the effect that "If you already know where you think you'll end up on a decision/analysis/estimation, you should just head in that direction."

Does this ring any bells? I'd like to re-find that and have googled for it many, many ways and never found it again. It seems similar to the idea here, and thus I thought I'd ask for help here in finding it -- I really would like to read it again!

Comment author: Sniffnoy 12 April 2011 02:45:44AM *  2 points [-]
Comment author: jwhendy 01 September 2011 03:16:24PM 0 points [-]

Wow -- I totally missed this comment! I just re-read this same thread, then tried to track back to this comment to let everyone know what the answer actually was... and found your answer pointing me there all along. Crazy. Thanks for the link.

Comment author: RichardKennaway 05 April 2011 08:44:25PM *  2 points [-]

Once you know your destination, you are already there.

In the Way of Bayes, the prior probability equals the expected posterior probability: If you know your destination, you are already there.

You should not think that you know which direction [your belief] will go [after getting evidence], because by the laws of probability theory, if you know your destination, you are already there.

Those all have to do with belief and evidence. I'm sure Eliezer has said this about morality as well (if you want something to be the right action, you already think it is the right action), but I haven't tracked that down. There's this, but no pithy quote.
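
The quoted principle, that the prior equals the expected posterior, can be checked numerically. The probabilities below are made up for illustration (H a hypothesis, E a possible observation):

```python
from fractions import Fraction

# Prior belief in hypothesis H, and likelihoods of observing evidence E.
p_h = Fraction(3, 10)
p_e_given_h = Fraction(4, 5)
p_e_given_not_h = Fraction(1, 5)

# Marginal probability of observing E (law of total probability).
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior after seeing E, and after seeing not-E (Bayes' theorem).
post_if_e = p_e_given_h * p_h / p_e
post_if_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)

# Expected posterior, weighted by how likely each observation is.
expected_posterior = p_e * post_if_e + (1 - p_e) * post_if_not_e

# Conservation of expected evidence: it equals the prior exactly.
assert expected_posterior == p_h
```

Seeing E raises the belief and seeing not-E lowers it, but the probability-weighted average of the two posteriors lands back on the prior: if you could predict the direction of the update, you should have updated already.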

Comment author: jwhendy 05 April 2011 09:31:18PM *  1 point [-]

Thanks for the links, but those aren't it. I'm familiar with that recurring tidbit, but the post I am thinking of, I believe, contained more of an encouraged heuristic; something like, "If you're finagling over a decision and think you'll end up on one side, at a certain point you should just get it over with and actually go to that side."

As I was thinking about this, I thought it might have been in The Proper Use of Doubt. While it's actually close, it's not what I remember. The quote I'm looking for would have embodied something kind of like this:

Eventually the cost of searching will exceed the expected benefit, and you'll stop searching. At which point you can no longer claim to be usefully doubting. A doubt that is not investigated might as well not exist.

Maybe I'm just making things up in my head!

Comment author: shokwave 07 April 2011 09:47:34AM 0 points [-]

No, I remember it too. It was something about a survey of students: of the group who said they hadn't yet decided but would probably end up taking course A, almost every student chose course A. So the message was something like: if you can guess where you're probably going to head, you've pretty much already made the decision.

Comment author: jwhendy 07 April 2011 02:42:28PM 0 points [-]

Well, that does sound a lot like what Richard came up with above:

Once you know your destination, you are already there.

But... I take that comment to be somewhat of a negative statement. As in, you shouldn't know your destination, otherwise all of this rationality stuff is pointless. The phrase I recall (and perhaps what you're suggesting as well) was almost more of an encouragement -- like, "If you're endlessly deliberating, it will be more useful for you to head in the direction you're leaning vs. continuing to deliberate as if you don't know at all."

Comment author: Wilka 11 April 2011 04:04:38PM *  0 points [-]

Maybe it was "Once you can guess what your answer will be, you have probably already decided." from Hold Off On Proposing Solutions

Comment author: Pavitra 04 April 2011 08:54:51PM 0 points [-]

The scoring rule for group B is critically underspecified -- are they being graded on the best pot they produce? The worst? The mean? The median?

You should hold off on proposing solutions, but you also need to get around to actually trying the proposed solution.

I was under the impression that the whole point of holding off on proposing solutions was to make sure you came up with more than one idea.

Comment author: ksolez 04 April 2011 05:50:04PM 0 points [-]

As a general principle, the right answer depends on the consequence of low quality and on the balance of creativity versus practice. In medicine, where the consequence of low quality (of a surgical operation, for instance) may be the death of the patient, there is still value in quantity and in perfecting technique through frequent practice. Therefore simulations play a role. Ideally the surgeon makes use of much tacit knowledge in a successful operation, many instinctual elements that cannot be put into words. In many tasks like surgery, you want only intermittent creativity. Most effort goes into getting better and better at a task already mastered conceptually, not into improving the concept.

Comment author: Liron 04 April 2011 02:45:39AM 0 points [-]

Wow, what a cool experiment.

Comment author: Desrtopa 04 April 2011 03:08:43AM 3 points [-]

I'm surprised he was allowed to do it; I don't think many institutions will let professors apply inconsistent grading criteria like that.

Comment author: Will_Sawin 04 April 2011 03:11:26AM 10 points [-]

Seems so surprising I would guess that it was apocryphal, not knowing the context.

Comment author: Desrtopa 04 April 2011 04:33:39AM 9 points [-]

I did a further search online; the quote appears verbatim in many places, but I can't find any additional information which would single out a school or teacher, so I would err on the side of assuming it's fictional.

Comment author: atucker 04 April 2011 10:51:45AM *  8 points [-]

Now I feel dirty.

Apparently it comes from this book. Unfortunately artists don't seem to do anything like peer review, so if it really did happen I wouldn't expect it to be shown any more verifiably than this. On the other hand, I don't particularly trust artists not to make up a story and call it an experiment.

Comment author: paper-machine 04 April 2011 03:16:03AM 4 points [-]

One really shouldn't give probably apocryphal evidence like this in support of a theory.

Comment author: Eliezer_Yudkowsky 04 April 2011 05:47:01AM 7 points [-]

Actually I think the next step is to run the experiment. It's a perfectly good experiment - even though I'd expect it not to happen inside a real school, because the students would complain they weren't being graded fairly.

Concept is still important enough that I promoted it immediately.

Comment author: Giles 04 April 2011 03:26:42PM 7 points [-]

I would expect it not to happen inside a real school because the school would be uninterested in improving their teaching via the experimental method. Of course I hope I'm wrong! (Anyone know of particularly enlightened schools?)

If I'm right it would seem a much harder problem to solve than the being-graded-fairly one.

Comment author: atucker 05 April 2011 03:02:22AM 4 points [-]

Someone could just book a ceramics studio and advertise free lessons/use in exchange for participating in an experiment.

Then split the people there into two groups, and tell one group that they get money (or other arbitrary desirable) based on how many pots they make, and the other on how good their pots are.

Comment author: thomblake 11 April 2011 04:20:30PM 2 points [-]

My wife intends to start teaching sculpture in a year or so... I'll make sure to bring this up to her. The tricky part will be objectively grading quality (an outside judge and blindness should be sufficient).

Comment author: twanvl 04 April 2011 12:01:41PM 0 points [-]

Maybe it could be done across different schools, different classes or different years. For example, in year 1 teach subject 1 focusing on quality and subject 2 focusing on quantity. Then in year 2 reverse the roles. But then you also need to be careful with the order of the subjects.

Just splitting the students into two groups would be better though, aside from the complaints. This is a problem with A/B testing in general: people want to be treated fairly. Are there good ways to reduce (the risk of) such complaints?

Comment author: 7598462153 04 April 2011 02:44:32PM 1 point [-]

To me, the simplest solution that comes to mind is to grade on a curve at the end of the course, based on the quality of the work (or some other subjective measure - I'm not familiar with university art courses, but I assume there's some kind of widely-accepted grading methodology). That is to say - tell the students before the course they will be graded on quality or quantity depending on the group, but grade each student on a curve relative to their own section for their permanent grade once the course has finished. This would obviously still require lying to the students (at the very least by omission, arguably), however.
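
A minimal sketch of this curving scheme (the raw scores and grade bands below are made up for illustration):

```python
def curve_section(scores):
    """Grade each raw quality score on a curve relative to its own
    section: rank within the section, then map percentile bands to
    letters. The band boundaries here are arbitrary placeholders."""
    ranked = sorted(scores)
    top = max(len(ranked) - 1, 1)

    def grade(score):
        percentile = ranked.index(score) / top
        if percentile >= 0.75:
            return "A"
        if percentile >= 0.50:
            return "B"
        if percentile >= 0.25:
            return "C"
        return "D"

    return [grade(s) for s in scores]

# Hypothetical raw scores: the quality group's pots are better on
# average, but each section is curved against itself, so both groups
# can end up with the same grade distribution.
quantity_grades = curve_section([55, 70, 62, 90, 48])
quality_grades = curve_section([80, 85, 83, 95, 78])
assert quantity_grades == quality_grades
```

Curving within each section removes the fairness objection raised upthread, at the cost of the lie-by-omission the comment already acknowledges.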

Comment author: [deleted] 05 April 2011 01:32:46AM 0 points [-]

I'm a bit skeptical about the experiment. The students' quantity of production can always be maximized by decreasing the quality. Why wouldn't they? (Even if the students maintain quality, how could the teacher justify assuming this in advance?)

Evaluating the study formally as an experiment, attention to quality is confounded with spending time on the work. (Reiteration is a different story, but in this case we seem to be talking about moving on to a different project, rather than perfecting a single design.) Did the students evaluated on quality produce lower quality because they spent time thinking rather than working; or because they spent too much time on a single item, trying to perfect it?

In writing, it's important to write rather than plan to write, but it doesn't follow that it's important to produce a great number of products rather than ones of high quality. From personal experience, one's writing improves by producing polished work, not by producing an abundance of it.

Comment author: taryneast 05 April 2011 04:04:33PM 3 points [-]

I agree - mere quantity is not enough... but the act of creating quantity gives a person the chance to gain the experience needed to learn how to improve the quality.

Very few people can get to Quality without going through the quantity...

Quantity is a necessary, but not sufficient condition of quality.